
Programming the Commodore 64: the Definitive Guide

timothy posted more than 4 years ago | from the you-kids-keep-off-my-xeriscaping dept.

Programming 245

Mirk writes "Back in 1985 it was possible to understand the whole computer, from the hardware up through device drivers and the kernel through to the high-level language that came burned into the ROMs (even if it was only Microsoft BASIC). The Reinvigorated Programmer revisits R. C. West's classic and exhaustive book Programming the Commodore 64 and laments the decline of that sort of comprehensive Deep Knowing."


"Tape Storage" image turns to goatse? (0)

Anonymous Coward | more than 4 years ago | (#31467910)

I was studying the "Tape Storage" image in that article, and after about three or four minutes it switched to goatse. :( Beware!

Re:"Tape Storage" image turns to goatse? (1)

DarrenBaker (322210) | more than 4 years ago | (#31468186)

The lady doth protest too much, methinks.

Re:"Tape Storage" image turns to goatse? (0)

Anonymous Coward | more than 4 years ago | (#31468344)

Steve Jobs, is that you?

Frist psot! (3, Funny)

heffel (83440) | more than 4 years ago | (#31467912)

Atari 800 rules!

Re:Frist psot! (3, Funny)

ProfM (91314) | more than 4 years ago | (#31468028)

Atari 800 rules!

Here we go again ... "which platform is better" flamewar. Everybody KNOWS that the Sinclair ZX80 was best.

Re:Frist psot! (3, Insightful)

Nimey (114278) | more than 4 years ago | (#31468214)

The Apple ][, clearly.

Re:Frist psot! (0)

Anonymous Coward | more than 4 years ago | (#31468710)

You know nothing of the Jay Miner magic that was the Atari 8 Bit.

Re:Frist psot! (1, Informative)

Anonymous Coward | more than 4 years ago | (#31468612)

But it took the XE series to get the BASIC right so that bugs were removed and a shorthand parser added to save a bit of typing. The XLs still needed cartridges to get the better version. (I think the Commodore folks mostly missed out on that kind of thing.) And by the time Atari had the 8-bit formula mostly right, all the attention and support moved to the 16-bit ST computers. (Commodore fans did share in this kind of thing with the transition to Amiga. Yet they still had a bigger 8-bit user base to fall back on.)

Strangely enough, sometimes you also needed interpreter software to run XL software on an XE. I think that was necessary because some programmers found out about the few XL bugs and considered them a feature. We had (have?) our rivalry, but I feel Commodore folks missed out on some of the fun.

Indeed (1)

2phar (137027) | more than 4 years ago | (#31467914)

Oh what fun raster interrupts were.

Re:Indeed (5, Interesting)

davester666 (731373) | more than 4 years ago | (#31468172)

There doesn't appear to be any section on custom high-speed communication with the external floppy drive unit. IIRC, you could upload a small program to the drive, and then in particular read data from the drive a lot faster than the 'OS' normally supported. This technique was also used for copy protection on a bunch of titles, primarily by stepping the drive head half a track between tracks and then doing reads. Production disk duplication could write both on the track and between tracks (or could write a track wide enough to cover the whole area), but regular floppy drives could only write one or the other, not both.

Not that I was interested in this stuff or anything.

How Fast Loaders Worked (4, Interesting)

headkase (533448) | more than 4 years ago | (#31468490)

Fast loaders worked because the kernel ROM software didn't fully take advantage of the hardware. Between the C64 and the 1541 floppy drive, the connector cable had four wires for carrying information, but the kernel routines built into ROM used only one of those lines to signal from the drive to the computer. The "fast loaders" simply uploaded a program to the drive that used all four lines to signal information. They weren't fast magic; they just removed a deficiency in the kernel ROM routines. (I'm not sure of the exact number of lines between the computer and the drive, but this is the principle the fast loaders worked by.) Tape-based fast loaders worked differently: the kernel routines would save the information to tape and then immediately save a complete second copy to compare against for error correction on load. The tape fast loaders just skipped saving and comparing the redundant copy to get the speed. Disk fast loaders didn't compromise the integrity of the information the way tape fast loaders potentially did, though. Remember, computers back then were full of noise, especially when you were talking to tape drives.
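The arithmetic behind the speedup can be sketched in a few lines. This is a toy model, not real C64 bus timing: the handshake model and byte counts are illustrative only.

```python
def cycles_to_send(num_bytes, bits_per_transfer):
    """Handshake cycles needed to move num_bytes when each
    cycle carries bits_per_transfer bits in parallel."""
    bits = num_bytes * 8
    return -(-bits // bits_per_transfer)  # ceiling division

# One disk side's worth of data, stock single-line protocol vs. a
# hypothetical loader signalling on two lines at once:
disk_bytes = 170 * 1024
stock = cycles_to_send(disk_bytes, 1)
fast = cycles_to_send(disk_bytes, 2)
assert stock == 2 * fast  # twice the lines, half the handshakes
```

Whatever the true number of usable lines, the principle is the same: each extra line multiplies the bits moved per handshake.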

Re:How Fast Loaders Worked (1)

eulernet (1132389) | more than 4 years ago | (#31468696)

A disk-drive fast loader could fit into 256 bytes, IIRC.
The trick was to shift bits in an unrolled loop. I remember I stored some values on the stack by using PHA in this loop.

I did write a fast loader for a few games I wrote on the C64 (which was one of the computers I wrote games for in the '80s, along with the Oric, Thomson TO7 and MO6, and Amstrad CPC). The Atari ST appeared in 1987.

Re:How Fast Loaders Worked (2, Interesting)

mindstrm (20013) | more than 4 years ago | (#31468760)

The 1541 drives were purposely slowed down to maintain compatibility with some other Commodore hardware (I forget at the moment exactly which)... so they weren't so much fast loaders as they were "doing it the way the engineers designed it to work, not the way the boss made them change it so he could claim compatibility."

Re:Indeed (1)

mrmeval (662166) | more than 4 years ago | (#31468526)

That's in an exhaustive 1541 ROM disassembly book; I do not now recall who published or wrote it, but it covered in detail how the drive's ROM worked. Using that and some machine code I actually found out you could store a file on the floppy that, on power-on, would boot no-knock code. I used that for several years. Then tech got more difficult and less accessible for a time.

Now, with small devices like the Arduino and Make Magazine's controllers, many people are learning and creating things with reasonably easy-to-understand tech.

Re:Indeed (1)

mrmeval (662166) | more than 4 years ago | (#31468652)

It's been so long I cannot recall if the code was in that book or if I used this portion of it to create the program.

I can't even find the program that ran on the C64 to stop the knock on the 1541.

Sweet! (4, Funny)

Junior J. Junior III (192702) | more than 4 years ago | (#31467922)

Now, I can finally stop waiting and get to programming my Commodore 64!

Re:Sweet! (4, Funny)

ae1294 (1547521) | more than 4 years ago | (#31468148)

Now, I can finally stop waiting and get to programming my Commodore 64!

I know right! I finally got my Beowulf cluster of C128s running GNU/Linux doing SETI@home work!!! Sure, it's drawing about 200 amps, but damn it's a sweet setup and only takes about 24 days per work unit!

6510 Machine Language (3, Interesting)

headkase (533448) | more than 4 years ago | (#31467930)

I started on a Commodore 64 (well, a Commodore 128 that ran exclusively in 64 mode) and learned machine language by breaking the protections of the day. Many of the things that were legal back then, such as copying software for DRM'd games, have now gone the way of the dodo. I honestly expect that twenty years from now a debugger in itself will be seen as a "tool of crime", or whatever wordage they use to keep such things out of the general public's hands, just like lock-picks today. Hope you like high-level, because the day is coming when it will be illegal to go low-level without a government (or more likely Apple) license.

You needed a debugger? (0)

perpenso (1613749) | more than 4 years ago | (#31468016)

You needed a debugger? ;-) We programmed various commercial games and education programs without a debugger of any kind on the C64. Matter of fact the C64 assembler and other tools were so bad we actually did our development on the Apple II and downloaded the resulting binaries to the C64 for execution. Debugging consisted of printing our own hex digits on the screen or maybe some color coded pixels.

Perpenso Calc for iPhone and iPod touch: scientific and bill/tip calculator, fractions, complex numbers, RPN

Re:You needed a debugger? (0)

Anonymous Coward | more than 4 years ago | (#31468554)

Completely agree.

I wrote 27 games in the 1980s, most of which were for 6502/6510 variants (C64, C16, Atari 400/800, Acorn Electron/BBC B).

I used to debug by writing specific characters to the top left location of the screen.

For Atari development I made a parallel-to-joystick-port cable and wrote software on the C64 (in asm) and on the Atari (in BASIC, which wrote an asm program to configure two of the four joystick ports as inputs, so that I could effectively read bytes from two joystick inputs which were connected to the parallel output on the C64). Each time I had a new compile to try out (we didn't think in terms of builds back then: no DLLs, just one executable), I had to type the BASIC loader into the Atari and run it; if it loaded, it ran.

I seem to remember we had a similar setup for the Electron/BBC, but I can't remember as much about it, except that the frame rate on the Electron had to be half that of the Beeb or any other variant (moving things twice as far per frame) because the chip that ran the graphics on the Electron was so slow.

C64, best (8 bit) home computer of the 1980s, by a large margin. Way better than a Sinclair Spectrum. No comparison, except in the minds of Spectrum owners. Only truly validated in hindsight.

Games like "Sheep in Space", "Revenge of the Mutant Camels", "Ancipital" (all Jeff Minter classics, great guy, sadly lost touch with him) could never be done on a Spectrum.

Mind you, "Hellgate" (also by Jeff) worked fantastically on the cramped 20x22 screen of a Vic-20 but was rubbish on the larger Commodore 64 screen.

Relax (3, Funny)

sakdoctor (1087155) | more than 4 years ago | (#31468032)

Relax. You've obviously read too many kdawson stories recently, and have been trolled into a heightened state of paranoia. Don't worry, it happens to the best of us.

Also, why have you switched off your iphone citizen 533448?

Re:Relax (5, Interesting)

Anonymous Coward | more than 4 years ago | (#31468180)

I was playing a game with some DRM (either StarForce or SecuROM) and it wouldn't run if I had a debugger present. I asked them why and they were all like "Anyone who has a debugger and is playing the game is a hacker." That's a RIGHTLY earned state of paranoia.

Re:Relax (1)

NervousNerd (1190935) | more than 4 years ago | (#31468360)

I believe that it's StarForce.

Re:Relax (1)

Kvasio (127200) | more than 4 years ago | (#31468366)

Also, why have you switched off your iphone citizen 533448?

And you have not called your mother for 8 days now, citizen 29821274!

Re:Relax (3, Interesting)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#31468600)

Given that Germany has already gone and adopted an absurdly vague and overbroad law aimed at "hacking tools", I wouldn't really describe somebody hypothesizing that other jurisdictions might do so in the future as "paranoid".

Perhaps ultimately more dangerous (because they tend to be subtler) are situations where no law ever bans something, per se, but some quiet mixture of contractual, legal, and technical pressure effectively prevents it anyway. Consider SDI for an instance of that: a digital video transmission standard, available well in advance of HDMI, that was frozen out of the "Consumer" market entirely. It's not like possession was illegal or anything; but most people never even heard of it, nor was it available on any broadly affordable hardware.

In the case of something like debuggers, I'd be very surprised to see any sort of legal ban; but the technological/private-sector contractual de facto neutralization is an eminently plausible scenario. Already, in recent versions of Windows, any media application that requires the "Protected Video Path" will throw a fit if there are any unsigned drivers loaded that could compromise that path. An analogous "Protected Execution Path", provided by the OS for programs that didn't want anybody else debugging them or looking at their memory, hardly seems implausible. Not to mention, of course, the increasing percentage of consumer-level computer activity that is occurring on devices where being able to run arbitrary programs isn't even an expectation. Not much debugging going on on Xbox 360s, and debuggers don't have to be illegal to not be available through the App Store.

There will always be gaps, of course, for the sufficiently knowledgeable, motivated, and well equipped; but a largely opaque consumer level computing environment seems like an unpleasantly plausible prediction.

V-Max (5, Interesting)

headkase (533448) | more than 4 years ago | (#31468092)

One of the most comprehensive protections at that time was called "V-Max!", which stood for Verify Maximum. The "nibblers" for disk-copy software couldn't touch it, even though those nibblers represented the ultimate in disk-copy technology at the time. There were two ways to copy V-Max. The first was to get a dedicated hardware copying unit. The second was to apply a bit of knowledge with a debugger cartridge. V-Max was a turn-key system: you gave them files and they wrapped the protection around them, providing a fast loader at the same time. So what you would do is fill all of memory (the whole 64K) with a value you knew, say $AF. Then you would load a V-Max file from the disk; its loader would automatically take over, and while it was loading you would enter your debugger cartridge and change its exit point to point to itself. So instead of $0800: RTS you would make it $0800: JMP $0800. Then you would wait for the V-Max loader to fully load the file, press the button on your debugger cartridge, and use the memory monitor to find where the file loaded by seeing what memory was NOT $AF. Then from the debugger cartridge you would save that memory block out again: a completely de-protected file. Since V-Max used the standard kernel load vectors, the program itself needed no further modification; the protection was completely gone, you just lost the fast-loader function, which you then re-added yourself into a chunk of memory the game didn't use. Relocatable code was best for that. Later versions of V-Max also did on-the-fly decompression of files, so occasionally while stripping the protection you would run into a situation where your destination disk ran out of space versus the original protected disk. Again, that was worked around by inserting your own custom loader, which also did decompression, into the kernel load vectors.

V-Max was impossible for the copy software of the day to copy, but with a little bit of knowledge and a debugger cartridge it was absolutely trivial to defeat.
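The sentinel-fill trick described above can be modelled in a few lines of Python. The addresses, the payload, and the stand-in loader are all made up for illustration; the real work happened in a machine-language monitor.

```python
SENTINEL = 0xAF
ram = bytearray([SENTINEL] * 0x10000)  # all 64K pre-filled

def fake_vmax_load(mem):
    """Stand-in for the protected loader: deposits a file somewhere."""
    payload = bytes(range(64))  # dummy file contents
    mem[0x0801:0x0801 + len(payload)] = payload

fake_vmax_load(ram)

# After the frozen loader finishes, diff against the sentinel to
# find exactly which bytes were deposited...
changed = [addr for addr, val in enumerate(ram) if val != SENTINEL]
start, end = changed[0], changed[-1]

# ...and save that block back out: a fully de-protected file.
deprotected = bytes(ram[start:end + 1])
print(hex(start), hex(end), len(deprotected))  # → 0x801 0x840 64
```

One caveat the monitor-based original shared: if the file legitimately contains the sentinel byte, the diff shows gaps, which is why you save the whole span from first to last changed address rather than only the changed bytes.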

Uphill Both Ways (5, Interesting)

headkase (533448) | more than 4 years ago | (#31468204)

And why did I spend time removing protection systems? Funny that part is: I owned an MSD floppy drive which was completely incompatible at a machine-language level with the 1541 drives everyone else owned and that all the game-makers wrote their protection systems for. So my floppy drive would load any of the software of the day. I literally bought a game, had to hack away the protection, and then I could play it on my computer. Of course no one will believe me when I say this but damnit, its the truth! Now get off my lawn.

Re:Uphill Both Ways (1)

headkase (533448) | more than 4 years ago | (#31468218)

...would not load any of the software of the day...

Re:6510 Machine Language (0, Flamebait)

abigor (540274) | more than 4 years ago | (#31468270)

Hope you like high-level because the day is coming that it will be illegal to be low-level without a government (or more likely Apple) license.

Kook alert

Re:6510 Machine Language (1)

headkase (533448) | more than 4 years ago | (#31468426)

I remember pre-DMCA quite well. That would have been considered "kook"-ish in the hey-days of "Fast Hack'Em" copying software of the '80s.

Invert rose-tinted-glasses (0, Flamebait)

sakdoctor (1087155) | more than 4 years ago | (#31467968)

Here in 2010 it's not necessary to understand the whole computer, from the hardware up through device drivers and the kernel through to the high-level language that came from your apt repositories (even if it is only a python interpreter). Let's celebrate the rise of this sort of universal accessibility, to even the novice programmer.

Re:Invert rose-tinted-glasses (1)

pjwhite (18503) | more than 4 years ago | (#31468020)

Yeah, because drivers and kernels are perfect and never have bugs.

Re:Invert rose-tinted-glasses (3, Informative)

v1 (525388) | more than 4 years ago | (#31468088)

For certain, the drivers/OS back then were less buggy: they were smaller and so much less complex. It was a fairly simple matter for someone to fully understand an entire OS that summed to under 50K (and I mean BYTES, not LoC).

Re:Invert rose-tinted-glasses (1)

eulernet (1132389) | more than 4 years ago | (#31468618)

Probably, but the BASIC was written by Microsoft, and the ROM took 16 KB (not counting the code in the disk drive).

Since an opcode is between 1 and 3 bytes, that would be around 8000 to 10000 LoC.
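That back-of-envelope estimate checks out: at an average instruction length of about 2 bytes (a rough assumption, between the 1- and 3-byte extremes), 16 KB of ROM works out to roughly 8000 instructions.

```python
rom_bytes = 16 * 1024

low = rom_bytes // 3     # if every instruction were 3 bytes long
high = rom_bytes // 1    # if every instruction were 1 byte long
approx = rom_bytes // 2  # ~2 bytes per instruction on average

print(low, approx, high)  # → 5461 8192 16384
```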

Re:Invert rose-tinted-glasses (1)

headkase (533448) | more than 4 years ago | (#31468792)

To be specific, the 16K of ROM was split into 8K for the BASIC ROM and 8K for the Kernal ROM.

Re:Invert rose-tinted-glasses (3, Insightful)

tepples (727027) | more than 4 years ago | (#31468030)

Here in 2010 it's not necessary to understand the whole computer

Unless you want more throughput: manipulate more objects at once, serve more users at once, etc. If a Python program is spending most of its time in interpreter overhead, you may need to recode inner loops in C.

Or unless you're programming for an 8-bit microcontroller roughly as powerful as a Commodore PET. Not all "computers" are as powerful as PCs or even smartphones.
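A quick way to see the interpreter overhead the parent describes: compare an explicit Python loop against the built-in sum(), whose inner loop already runs in C. Absolute timings vary by machine; only the ratio matters, and the list size and repeat count here are arbitrary.

```python
import timeit

data = list(range(100_000))

def py_sum(xs):
    total = 0
    for x in xs:  # every iteration pays interpreter dispatch costs
        total += x
    return total

t_interp = timeit.timeit(lambda: py_sum(data), number=20)
t_c = timeit.timeit(lambda: sum(data), number=20)  # inner loop in C

assert py_sum(data) == sum(data)
print(f"explicit loop: {t_interp:.3f}s  built-in sum: {t_c:.3f}s")
```

Rewriting a hot loop as a call into C (a builtin, NumPy, or an extension module) is the same move at different scales.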

Re:Invert rose-tinted-glasses (1)

icebraining (1313345) | more than 4 years ago | (#31468620)

If a Python program is spending most of its time in interpreter overhead, you may need to optimize the Python JIT compiler


Yes, I know it's harder, but it's the Right Way(TM), in my humble opinion.

Re:Invert rose-tinted-glasses (1)

NewbieProgrammerMan (558327) | more than 4 years ago | (#31468706)

If a Python program is spending most of its time in interpreter overhead, you may need to optimize the Python JIT compiler


Yes, I know it's harder, but it's the Right Way(TM), in my humble opinion.

I didn't think the official Python implementation had a JIT compiler.

Re:Invert rose-tinted-glasses (1)

icebraining (1313345) | more than 4 years ago | (#31468746)

Who said you needed to use the official Python implementation?

Re:Invert rose-tinted-glasses (1)

Trivial Solutions (1724416) | more than 4 years ago | (#31468044)

Kinda like "What part of 'God' do you not understand?" :-)

Re:Invert rose-tinted-glasses (4, Insightful)

Puff_Of_Hot_Air (995689) | more than 4 years ago | (#31468078)

Here in 2010 it's not necessary to understand the whole computer, from the hardware up through device drivers and the kernel through to the high-level language that came from your apt repositories.

It wasn't necessary then either. The point is that you could. Now this is no longer possible. There are pros and cons to this; we can achieve more by building on the magical black boxes, but there was something deeply satisfying about knowing a device in such depth. The same can still be achieved in the embedded realm, however, and due to modern advances, it's cheaper and smaller than ever!

Re:Invert rose-tinted-glasses (2, Interesting)

TheRaven64 (641858) | more than 4 years ago | (#31468408)

Absolutely. This is the point of, for example, the STEPS project at VPRI, which aims to build an entire working system (kernel, GUI, dev tools, apps) in under 20,000 lines of code. Some of the stuff they've produced, like OMeta and COLA are really impressive.

Adding complexity to a system is easy. Adding features without increasing the complexity is much harder, but much more useful and rewarding.

Re:Invert rose-tinted-glasses (1)

Darkness404 (1287218) | more than 4 years ago | (#31468318)

Let's celebrate the rise of this sort of universal "accessibility", which leads to more viruses, more "PC" stores like Best Buy/Geek Squad, and in general the decline of the computer. Yeah, it's great if you are like Geek Squad and can charge $50 to plug in a USB cable and install drivers; for everyone else it is a nightmare.

Re:Invert rose-tinted-glasses (1)

MichaelSmith (789609) | more than 4 years ago | (#31468388)

Even in those days I am sure there were lusers happily doing their jobs on a VT100 or similar terminal. Starting with a menu or simple command set was not so different from a GUI today.

Totally outdated... (-1, Troll)

Anonymous Coward | more than 4 years ago | (#31468036)

Today we have garbage collectors in Java and that is why the C64 is completely outdated.

Everyone who still writes code on the C64 instead of Java won't get a job.

You probably don't even know the latest words.

Re:Totally outdated... (2, Interesting)

phantomfive (622387) | more than 4 years ago | (#31468062)

But... anyone who would have trouble going without garbage collection, and couldn't learn how to code on a C64, probably shouldn't have a job. And anyone who can code on a C64 should have no problem adjusting to garbage collection.

Re:Totally outdated... (2, Informative)

tepples (727027) | more than 4 years ago | (#31468066)

Today we have garbage collectors in Java

Garbage collectors in Java are good for collecting objects that own only memory but lousy for collecting objects that own resources other than memory. Resource leaks happen whenever someone forgets to close() something in a finally block, just as they do in C++ when someone forgets to delete or delete[] in a destructor or in the exceptional path of a constructor.
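The same pitfall is easy to demonstrate outside Java. In Python, for example, the collector reclaims memory, but only an explicit close() releases the underlying OS handle deterministically; try/finally (the analogue of Java's finally block) or a with statement gives that guarantee. A minimal sketch, using a throwaway temp file:

```python
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)  # mkstemp hands back an open fd; release it right away

# Leak-prone shape: if an exception fired between open() and close(),
# the handle would linger until the collector happened to finalize
# the object, if it ever does.
f = open(path, "w")
f.write("unsafe")
f.close()

# Deterministic release, whatever happens in the body:
f = open(path, "w")
try:
    f.write("safe")
finally:
    f.close()  # runs whether or not the body raised

# Idiomatic shorthand for the same guarantee:
with open(path) as f:
    assert f.read() == "safe"

os.unlink(path)
```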

Everyone who still writes code on the C64 instead of Java won't get a job.

Unless you're in charge of making a PC or phone port of classic C64 games. Then you need to know both Java and C64 assembly language.

Re:Totally outdated... (0)

Anonymous Coward | more than 4 years ago | (#31468224)

"Unless you're in charge of making a PC or phone port of classic C64 games. Then you need to know both Java and C64 assembly language."

Initially I thought "C64 assembly language" was a goof for 6502 assembly, but you're actually right: you need to know the C64 memory model and I/O.

Re:Totally outdated... (5, Funny)

MillionthMonkey (240664) | more than 4 years ago | (#31468650)

Dude, back when I was a kid and had a C-64, I wrote a JVM for it. Unfortunately I had trouble, because while the JVM standard defines long as not being thread-safe (as a sop to 32-bit architectures), it defines operations on int, short, char, and Object references as being atomic. So I had to write single-threaded code to simulate multiple threads just to get the garbage collection to work. And my char mappings didn't support Arabic and Chinese; you had to stick with PETSCII.

I was so embarrassed in front of my friends when my games paused intermittently to clear out kilobytes of garbage from the little heap. They were like, WTF, what is it doing, and I said, give me a break, it's Java. The only program I ever really got to work right was my C-64 emulator.

Re:Totally outdated... (1)

jimmydevice (699057) | more than 4 years ago | (#31468770)

BZZZZZT! You have failed your retro recollections. The C64 WAS single-threaded and all byte/word ops were atomic. If you want threaded code, use FORTH. Remember to push X/Y/A/S on a context switch.

Re:Totally outdated... (1)

larry bagina (561269) | more than 4 years ago | (#31468262)

I make a lot more doing embedded work (last year I did a job on the 6502) than I would make with java.

Re:Totally outdated... (4, Insightful)

tzanger (1575) | more than 4 years ago | (#31468732)

Today we have garbage collectors in Java and that is why the C64 is completely outdated.

Everyone who still writes code on the C64 instead of Java won't get a job.

You probably don't even know the latest words.

Oh, I don't know about that. There are thousands of embedded systems that need programming and require the kind of thorough knowledge of the hardware that you got from the old C64 days. There are more 8-, 16- and 32-bit systems without enough memory to run something like Java than there are PCs and higher class systems.

Don't pooh-pooh the old ways. They're what's running your world.

All of the 8 and 16bit machines were knowable (3, Interesting)

jmorris42 (1458) | more than 4 years ago | (#31468048)

It was possible to fully understand all of the old 8 and 16 but machines. Now you could spend months trying to fully understand one video card, which would be replaced by something more complex by the time you finished understanding it.

And that was a big part of it: the stability of the platforms during that era. A C64 was exactly the same as every other one; a Tandy CoCo was identical to the million others of its kind. Later models tended to retain as close to 100% backward compatibility as possible, so knowledge and software tools retained value. Now you buy a lot of PCs with the understanding that a year from now you won't be able to buy more of the exact model, even if you stick to Optiplexes and such that promote the relative stability of the platform. Something will be slightly different. So, understanding being impossible, we abstract it all away to the greatest extent possible.

If you want to reconnect with low level look at AVR microcontrollers. If you are really frugal you can get going for $20.

Re:All of the 8 and 16bit machines were knowable (1)

WrongSizeGlass (838941) | more than 4 years ago | (#31468146)

It was possible to fully understand all of the old 8 and 16 but machines.

Personally I prefer my butts to be over 18, but to each his own I guess.

Re:All of the 8 and 16bit machines were knowable (1)

Kvasio (127200) | more than 4 years ago | (#31468394)

You could still run notepad.exe and calc.exe from Windows 1.0, sir!

Re:All of the 8 and 16bit machines were knowable (1)

eulernet (1132389) | more than 4 years ago | (#31468584)

It's not really exact, since the C64 existed in two forms: one for the US and one for Europe.

I vaguely remember that this introduced a difference in the fast disk-loading routine mentioned in a message above, because there was one cycle of difference (yes, simply a NOP).

If somebody is interested, I can dig in my very old source code to retrieve this information (I coded several games for the C64 in the years 1985-1988).

I miss those good 'ol days (4, Insightful)

v1 (525388) | more than 4 years ago | (#31468054)

Though my experience was on the Apple II, not the Commodore. Little things like writing your own device drivers, drawing graphics via direct access to interlaced VRAM (oh, the maths!), direct read-latch access to the floppy drives, writing hybrid assembly/BASIC apps. It was grand.

It's downright depressing to compare my present-day knowledge of computers, classify myself as somewhere in the upper 2%, and still wish I knew a quarter as much (percentage-wise) about my current computer as I did about my //c.


Re:I miss those good 'ol days (2, Interesting)

Anonymous Coward | more than 4 years ago | (#31468234)

I think it's there if you want it. A couple of weeks ago I was writing in assembly language specifically for one of the top ten fastest computers in the world. Sure, there's a shitload I don't know about that cluster -- no idea if there's an RJ-45 port, much less how I'd access it via the assembler, but I could probably find out if I cared and got clearance for it.

It's easy to romanticize simpler times, and there is some truth to them being simpler, but I'm damn excited about today. I mean, really, you can grab the guide for ARM assembly, pick up a phone or other portable, and have a ridiculous amount of computing power that you can bend to your will. OS and application bloat is either there for a purpose (e.g., to make a computer easier to program for, or to add more features for less work) or a result of laziness, but on most systems you can get down to ground level if you really want. It's just that most people do not.

Re:I miss those good 'ol days (1)

MobileTatsu-NJG (946591) | more than 4 years ago | (#31468290)

It's downright depressing to compare my present-day knowledge of computers, classify myself as somewhere in the upper 2%, and still wish I knew a quarter as much (percentage-wise) about my current computer as I did about my //c.

Can you get more done with a program today than you could with the Apple II?

Re:I miss those good 'ol days (1)

TheRaven64 (641858) | more than 4 years ago | (#31468464)

One of the usability tests that Jef Raskin proposed was the time from turning a computer on to having written a program for adding two numbers together. On the C-64, a reasonably fast typist could do this in around 20-30 seconds. On a modern computer, you might finish the boot / resume process in that time, then you need to launch an app, then you can start writing. If you use something like Visual Studio / Eclipse / XCode, it will take a lot longer than if you use a terminal (expr $1 + $2 will do it on a *NIX system), and probably more than 20-30 seconds to even get to the point where you start writing code.

Re:I miss those good 'ol days (1)

MobileTatsu-NJG (946591) | more than 4 years ago | (#31468634)

Err okay... so are you rebooting and launching Visual Studio enough times that it takes longer these days to write a program than it did back in the C64 days?

I must admit I am disappointed. I thought I was going to hear about how all the abstraction and use of libraries/modules/etc meant that we can create useful apps in a fraction of the time that we could in the 80's. I had no idea it actually took longer!

Re:I miss those good 'ol days (1)

FiloEleven (602040) | more than 4 years ago | (#31468708)

That's right; one of the usability tests was boot-to-use time. I'm sure it was far from the most important, and there are changes in the way computers are used now that make it almost obsolete. For instance, my MacBook Pro might take over a minute from cold boot to an open terminal window, but I reboot it only rarely. Most of the time I just close the lid and let it sleep, so wake-to-use time is under 5 seconds; less if I already had a terminal window open.

The answer to the grandparent's question is, "yes, absolutely."

Just say embedded (1, Informative)

Anonymous Coward | more than 4 years ago | (#31468068)

All the stuff we learned on the VIC-20 and Commodore 64 (and PET) still applies in the embedded world. If you could program them, you can totally do a PIC or an Atmel AVR (or Arduino).

Web Server / App Farm (4, Funny)

Frosty Piss (770223) | more than 4 years ago | (#31468136)

This guide is *GREAT!*

I've got 6 web sites and 25 mission-critical apps running on a cluster of 10 Commodore 64s. It started out with just one, but as our business expanded, we just kept adding them on. It would be a bear to migrate to MS-DOS or Windows 1.0, but maybe it's time. The acoustic coupler modem is a bit slow, but it's been working for us since 1990, so it's hard to justify the upgrade.

Re:Web Server / App Farm (1)

TheRaven64 (641858) | more than 4 years ago | (#31468484)

You're joking, but the Contiki operating system runs on the C-64 and has a web server. It also has a working IPv6 stack, so you can serve your web pages over either IPv4 or v6 from a C-64.

If you miss the 8-bit era... (4, Informative)

Anonymous Coward | more than 4 years ago | (#31468070)

...yourself some Atmel microcontrollers (the ATmega8 is a good choice). They will be instantly familiar to anyone who programmed assembly language on the C64. There are some differences (the Atmels are Harvard architecture rather than von Neumann, with separate program and data address spaces, and the CPU has more registers), but the data sheet provides excellent hardware documentation: the complete instruction set and detailed register descriptions. There are lots of interesting application notes (IR decoding, interfacing to PS/2 keyboards, LCD output, ...). The Arduino system is based on an Atmel microcontroller, so there is also a big application-oriented community in addition to the people coming from the electronics side.

It's not a toy either. These controllers are everywhere. Have fun and learn a useful skill...

Its still possible.. (3, Insightful)

ickleberry (864871) | more than 4 years ago | (#31468080)

To know a computer from the ground up, as it were. It's not some long-lost dream or anything; after all, your average disposable Crapple Netbook with "clout computing" or Octocore Core i13 and a half is just a fancy C64 with more CPU instructions, more memory, and more peripherals that runs faster.

It's just that unless you start off at the low level, learning about transistors and that sort of shizzle and learning assembly language, you probably will never bother to learn it. A lot of programmers now think about functions, objects and arrays as if they actually exist, not just as a convenient way of presenting blocks of code and data so that they're easy to understand. Fuck it, a lot of people fairly high up on the IT scene have no clue in the wide world about TCP or UDP, but they sure as hell know how to write a 'Web App' using JSON and the latest Web 2.5 gimmick, completely oblivious to any of the lower levels.

The problem is that when you have nearly everyone going for the latest abstraction layer and the easy low-hanging fruit (at the expense of efficiency and everything else - rabble rabble rabble), there might be a day in 2110 when they're still using HTTP + more abstraction layers, but quantum computers aren't getting any faster for what they need and nobody knows any of the low-level stuff anymore. If you are the one kid on the block who knows how to write assembly in 2110, you'll be a rich man by 2111, just in time for the end of the world 'cause the Mayans were off by 100 years.

Re:Its still possible.. (0)

Anonymous Coward | more than 4 years ago | (#31468378)

Nope, sorry, it's not possible to know all about a computer anymore. You may know the principles of the usual abstractions, but the details are too numerous and complicated for one person. Just some buzzwords: microcode, out-of-order execution, cache coherency, gated clocks, register renaming, MMU, SMM. Those are all in the CPU, and even understanding these few concepts in detail would be challenging to anyone who's not designing chips for a living. Then there are all the other components in a computer: bus systems (with DMA, IRQs, ...), video card (OpenGL/Direct3D APIs, CUDA, video timings, VGA, DVI, HDMI with encryption, ...), USB (isochronous/interrupt/bulk transfer modes, different speeds, electrical characteristics...). The list goes on and on. Even knowing what there is to know is almost impossible.

In the 8-bit era, you could physically build a complete computer yourself, write every byte of its software, know all registers and their interactions, all peripheral connections, literally everything. You did not have to guess how many clock cycles an instruction would take at any time. If there even was a variation, you knew exactly what the possible influences were and how to control them.

Re:Its still possible.. (2, Interesting)

Jah-Wren Ryel (80510) | more than 4 years ago | (#31468542)

You are only partly right. All of those things are knowable and learnable within a reasonable length of time - the problem is getting the documentation to know them. Too much documentation is locked up as proprietary info, either behind a paywall or an NDA wall.

Re:Its still possible.. (1)

bhtooefr (649901) | more than 4 years ago | (#31468728)

You could learn pretty much everything about a headless Solaris/SPARC system.

OpenSPARC T2 to learn about the CPU (and chipset; it's an SoC), OpenSolaris to learn about the OS.

Graphics would be the only thing in a normal system that wouldn't be included.

Re:Its still possible.. (5, Informative)

TheRaven64 (641858) | more than 4 years ago | (#31468534)

Octocore Core i13 and a half is just a fancy C64 with more CPU instructions, more memory, more peripherals that runs faster

Possible, but nowhere near as easy. I've read most of volume 3A of Intel's architecture reference while doing background reading for my Xen book, but the complete architecture reference is well over 3,000 pages. The GPU reference - if you can get it - is a similar length, and that's before you get to the OS. The Design and Implementation of the FreeBSD Operating System is 720 pages. It's a good book, but it skips over a lot of details. The copy of the X11 protocol reference that I read was several hundred pages, and it's a few revisions old. The OpenGL reference was a similar length. But now you can do 2D and 3D graphics and, once you've read the C spec (not so bad, only a couple of hundred pages) and spent some time familiarising yourself with your C compiler and standard library you can draw things.

To get the level of understanding that the original poster is talking about, on a modern computer, means reading and remembering around 10,000 pages of reference books, and gaining familiarity with the source code that they mention. And that's just going to give you one CPU architecture and the core bits of the OS.

Re:Its still possible.. (2, Interesting)

phantomfive (622387) | more than 4 years ago | (#31468598)

I don't know, man, it's a lot of work. On my computer I have Ruby, Python, Perl, GCC, and who knows what else installed. There are tons of APIs that I don't know what they do, even in the languages that I do know. I also have a web server, an FTP server, and probably several other servers. They aren't running right now, but they came with the system.

On top of that, I have postfix config files, a mach_kernel file, and a bunch of other weird files that are either quite complex (a book about Postfix runs 288 pages), or I have no idea what they do, or they are binary and I have no hope of ever figuring them out. Even if I switch to my Linux partition, where I have the source code to everything, it's a lot of work to understand every single file in even just the kernel. I'm not sure anyone understands the kernel itself completely. I haven't talked about hardware yet, but Intel processors do some tricky out-of-order execution and pipelining and such; it's not always easy to predict what is going to happen on one of those things. It is a lot of knowledge, and I am not sure anyone actually does understand it today, even if it is possible. No one I know makes that claim.

This is really different from the days of the C64, where the entire thing was only 64k (actually more with paging). You can read the entire memory contents of 64k in an afternoon, literally everything on the computer. You could definitely understand all the 'source code' (except the source code was in assembly) to the entire system. Predicting what the processor would do and how long it would take wasn't hard. You could fit everything about the system (even schematics to the hardware) in a single book.

Re:Its still possible.. (0)

Anonymous Coward | more than 4 years ago | (#31468616)

...stop using cliffhanger subjects. Really, give it a try; your readers will thank you for respecting them more.

Re:Its still possible.. (0)

Anonymous Coward | more than 4 years ago | (#31468766)

Blame consumerism... I remember my very first introduction to computers: a book (something by PC/Computing, "How a Computer Works" IIRC) which described only the hardware: modern silicon transistors, how CRTs work, how FDDs/HDDs work, what bits/bytes are, the difference between serial and parallel, etc., with pretty, simple pictures in only a few colours. Only a few years later did I first touch one of those things, but by then the book was already worn out! It was one of my first books, at that. It probably is responsible for my computing interest today.

The point being, such introductions, which emphasise the human side of computers (what you see, touch, etc.), make fiddling around with computers a much more gentlemanly affair, like music or painting. Nowhere is work mentioned.

I wish more books like that were still around.

Misty-Eyed Nostalgia (2, Insightful)

GaryPatterson (852699) | more than 4 years ago | (#31468082)

It's lovely to remember what was, but not so great to forget what we have today.

Sure, we generally don't know the whole widget from top to bottom, but it's a hell of a lot easier to get a program up and running. It's not just frameworks either - the choice of languages we have today beats the crappy BASIC we had then, or the assembly language tools we had.

The first machine I knew inside-out was the ZX-Spectrum. While I like to remember it fondly, I would never want a return to those primitive times.

It's a bit like object-oriented programming - we hide the details of an object and only deal with the interface. It's more scalable and leads to faster development.

Re:Misty-Eyed Nostalgia (0)

Anonymous Coward | more than 4 years ago | (#31468308)

And when people like you write programs with your "faster development", you end up making the rest of us waste time waiting for your inefficient programs, because thanks to your willful ignorance about what functions actually do, you didn't realize what was wrong with "for (i = 0; i < strlen(s); i++)".

It's fucking sad that computers are actually slower than they were 20 years ago thanks to people like you.

Re:Misty-Eyed Nostalgia (1)

headkase (533448) | more than 4 years ago | (#31468544)

I'll take a guess at what is wrong with that code: the strlen() function has to count the bytes in the string on every iteration of the loop, and that takes time. A faster way is to assign the result of strlen() to a variable once, before the loop, and use that variable in the loop condition, so the length is only computed once.

Re:Misty-Eyed Nostalgia (1)

TheRaven64 (641858) | more than 4 years ago | (#31468558)

you didn't realize what was wrong with for (i = 0; i < strlen(s); i++)

Nothing, with a decent compiler. The loop-invariant code motion pass will ensure that the strlen() call is not invoked more than once. The simplify lib call pass may even remove the strlen() call entirely if the length of s can be determined at compile time.

Re:Misty-Eyed Nostalgia (1)

tzanger (1575) | more than 4 years ago | (#31468764)

It'd be pretty straightforward to circumvent a compiler's idea that the length of s is invariant for each loop pass. Some pointer math could probably do it rather handily.

I'm not saying that that's *right*, but relying on the compiler to optimize away your indiscretions is a rather poor way to write code.

Re:Misty-Eyed Nostalgia (1)

xtal (49134) | more than 4 years ago | (#31468350)

If you want to know a platform inside out, there's a fully documented open-source Linux kernel staring you in the face.

Go get any of the dozens of embedded arm kits, any of the GREAT bits of documentation, and dig in. You want to get dirty with the hardware? U-boot is right there.

Want to play with SRAM and gates? A $100 FPGA will get you all you need, including a VGA out. We made a fully-HDL Pong game, including the VGA DAC, out of a $20 part.

Hell, for that matter, go get a Gameboy (any one) for peanuts, plus a loader cartridge, and hack away!

I'm as much for nostalgia as the next guy; I cut my teeth on a VIC-20 and a C64, and I'm an EE because of the experience. I wouldn't go back for a second, though. For the price of my first computer you could get yourself a really nice embedded development shop and do some pretty cool stuff.

It's a great time to be interested in this. There will always be good programmers, and there will never be enough of them.

Re:Misty-Eyed Nostalgia (1)

JaredOfEuropa (526365) | more than 4 years ago | (#31468704)

Nostalgia, and attention spans. Our own attention spans, that is: us old farts!

We do not know the whole widget from top to bottom anymore like we did with the 8-bit machines of our childhoods... but if I talk to kids with an interest in computers today, they know a great deal more about the nitty-gritty of modern machines than I do. Sure, they aren't taking a soldering iron to the motherboard anymore like I used to do with the C64 to make some sort of interface; instead they stick some components on a breadboard, plug it right into a USB socket... and then proceed to write a driver for whatever it is they whacked together. What I really miss is having the time and dedication to do that sort of thing... but I don't. You are right, computers are different but no less accessible to tinkering. What has really changed is us.

Why lament? (1)

smackenzie (912024) | more than 4 years ago | (#31468116)

Why "lament the decline of that kind of deep knowing"? Shouldn't we just encourage teens, students, hobbyists, computer science majors (i.e., anyone with an interest in this kind of thing) to get out there and buy a C64 or a kit or an open-source game machine or an embedded device or any of the other numerous projects in which we could pursue "deep knowing"?

Frankly, it's a great time to be interested in computers: absurd amounts of power for cheap, _along with_ easy access (thank you Internet) to kits, information, software, books, older devices, embedded devices, game devices, community help, etc.

It doesn't have to be either or.

Want to read Programming the Commodore 64? (5, Informative)

gklinger (571901) | more than 4 years ago | (#31468122)

Should anyone wish to download an electronic copy (PDF) of Programming the Commodore 64 by R. C. West, they may do so from DLH's Commodore Archive. It's a community-supported archive of Commodore-related printed materials (books, magazines, newsletters, manuals, etc.) and it could use your support. Enjoy.

Re:Want to read Programming the Commodore 64? (0)

Anonymous Coward | more than 4 years ago | (#31468300)

...and he might as well be hosting that file on a C64. Server is already buckling.

Still got my two C=64 machines! (1)

Terminus32 (968892) | more than 4 years ago | (#31468156)

Awesome stuff! Got a 1200 baud Compunet modem too, haha!!!

It still is possible (0)

Anonymous Coward | more than 4 years ago | (#31468192)

"Back in 1985 it was possible to understand the whole computer, from the hardware up..."

It still is, and many of us do. A good place to start is with an embedded system. Starting with an FPGA even opens up things that were black boxes to us in the 80's.

A PC is not any more complex. Really.

Windows being closed has had _exactly_ the effect RMS feared that led to the GNU project, and directly to this story. If it were not for that, I think many would agree that the situation is actually better today.

Mmmh, (0)

Anonymous Coward | more than 4 years ago | (#31468246)

I learned programming on a C64: first BASIC, then assembler. 6502/6510 assembler was a pain. Three registers for data (the accumulator and two others), only 8-bit, and a very simple instruction set, which made everything a pain. Later I got an Amiga with the Motorola 68000 CPU, which felt like a high-level language compared to the C64 stuff, since you had a lot of 32-bit registers that you could use at will, a richer instruction set, and lots of addressing modes, so it was much easier to transfer the stuff you had in mind directly into code.
Dealing with the OS and the hardware got more complicated, though. The C64 hardware and OS were rather simple, and you owned the machine with your program. No supervisor mode / ring-something, multi-tasking, multi-user etc. stuff that makes things complicated. Rather, you could read and write hardware registers at will, even from BASIC with PEEK and POKE. Those were the days. lol.

Re:Mmmh, (0)

Anonymous Coward | more than 4 years ago | (#31468322)

I think one of the good things of learning coding that way was that I never had any problems with pointers in C++, because if I do low-level stuff, I almost see things from the perspective of the CPU.

Kudos to the C-64 (1)

PDG (100516) | more than 4 years ago | (#31468250)

I credit going through elementary school with a Commodore 64, one of the few in my school that couldn't actually afford one, for the advanced engineering position I have now. I spent so much time hacking away at BASIC programs and stuff that I ended up learning a lot of computer science without even realizing it.

It's the only explanation I have for how I've been a software engineer for my entire post-school career (the past dozen-plus years) while my undergrad degree was a BA in English.

Re:Kudos to the C-64 (2, Funny)

PDG (100516) | more than 4 years ago | (#31468326)

So much for my English background that I can't even proofread my post properly. Should have said "one of the few in my school that could actually afford one"

They did less. (1)

tjstork (137384) | more than 4 years ago | (#31468258)

I sometimes wax nostalgic for the simplicity and deep understanding of the machines that I grew up with, but the fact is, today's PCs just do more. They aren't just faster with more memory; they are also that much more complex. Sure, the instruction set of AMD64 is not that much bigger than the instruction set of an older machine, but that's only because it's sufficiently well organized between opcodes and addressing modes that you can learn it. But after that, there's the whole peripheral story. How many people -really- want to know the ins and outs of how a PCI Express bus works? Or USB? Forget video cards - what about sound cards? Even with data sheets, programming a driver is a tremendous challenge.

And that's just at the hardware level. I mean, seriously, it wasn't that uncommon for people to write line editors or field editors for 8-bit machines or even for DOS text mode, but that's because everything was fixed width and height and you controlled everything, even the cursor. Nowadays, I doubt there's more than a handful of genuine home-grown edit controls for GUI environments; everyone just uses the widget that comes with the toolkit, and for good reason: they are mind-bogglingly complex for such a seemingly simple thing. You have to know about the current font and the mouse events, and you could easily blow a thousand lines of code on it.

The 64 was a Godsend (0)

Anonymous Coward | more than 4 years ago | (#31468264)

I ran an entire computer supply company on a 64 - well over $500,000 in sales.
It worked and it did what I needed it to do.
That said I'd rather not revert to those days.

Anybody here see the episode of Harvey Birdman where the Jetsons came back from the future to sue the past for screwing up the planet?

Efficiency (0)

Anonymous Coward | more than 4 years ago | (#31468294)

If we were able to achieve the level of efficiency that programmers in 1985 got from their hardware on the hardware that exists today, we ourselves would be amazed.

Finally! (1)

AmigaMMC (1103025) | more than 4 years ago | (#31468310)

I've been waiting for this guide... like... forever. Now I can finally finish that work project

Answer: (0)

Anonymous Coward | more than 4 years ago | (#31468478)

Use Linux.

Some areas are lacking, but the existing documentation, while now and then incomplete, is a wonderful treasure trove of learning joy.

In my Linux history I particularly appreciated:

- the explanation about modelines (howto);
- meeting vi again (I had used it decades ago);
- the hacking spirit which existed long ago (example: Woz), reborn in Linux developers;
- the versatility one has in KDE;
- all that I learned about networks, though I'm still very weak on this subject;

etc. etc.

Actually, amazingly, someone long ago ported Linux (kinda) to the Commodore 64.

Those were the days! (1)

SoundGuyNoise (864550) | more than 4 years ago | (#31468530)

Typing in programs line by line from a book or from Compute!'s Gazette to animate a moon landing, or to play Basketball Sam and Ed (I think they were called).

Anyway, even Mad Magazine eventually published a program, but it never worked. :(

Best covered by Ellen Ullman in 1998 (4, Informative)

rbrander (73222) | more than 4 years ago | (#31468568)

Ellen Ullman was a programmer for a full career before she discovered she was also a talented writer. The above link is to an article that was basically an excerpt from her excellent book, "Close to the Machine".

She writes about getting a PC and stripping off Windows, DOS, everything, until the (old even for 1998) BIOS is saying "Basic Not Loaded", then building Linux on it.

Her conclusions do sound a smidge "kids these days" when she writes about modern programmers that only know libraries and IDEs, but I know the /. gang will love it:

"Most of the programming team consisted of programmers who had great facility with Windows, Microsoft Visual C++ and the Foundation Classes. In no time at all, it seemed, they had generated many screenfuls of windows and toolbars and dialogs, all with connections to networks and data sources, thousands and thousands of lines of code. But when the inevitable difficulties of debugging came, they seemed at sea. In the face of the usual weird and unexplainable outcomes, they stood a bit agog. It was left to the UNIX-trained programmers to fix things. The UNIX team members were accustomed to having to know. Their view of programming as language-as-text gave them the patience to look slowly through the code. In the end, the overall "productivity" of the system, the fact that it came into being at all, was the handiwork not of tools that sought to make programming seem easy, but the work of engineers who had no fear of "hard."

I do recall some /. (or maybe it's in Salon) commenter at the time who replied, "Yeah, and your Dad thinks you're a weenie because you don't know how to wire transistors on a circuit board, and his Dad thinks he's a weenie because he can't wind the copper wire around his own inductors". Which is fair enough. Even log cabins can't be made without manufactured tools unless you can mold a kiln from clay and smelt iron for the axe yourself.

Still, the point of the desire is to have *maximum* control of the level of tool you are able to work directly with. The philosophy was echoed by Neal Stephenson in his essay, "In the Beginning Was the Command Line", the googling of which I will leave to the student. It's on-line.

That evokes so many great memories... (1)

aniemeye (838294) | more than 4 years ago | (#31468610)

..., almost like finding a long lost and cherished childhood toy. I remember my Apple II days, programming assembler, writing routines to directly address the disk drives. Then my project for the computer science class we had in High School (Germany 1984), where I had gotten my hands on a disassembly of the entire ROMs and from that figuring out how the mathematical formula interpretation worked (simulated stack and all). Finally, a year later, getting my hands on the schematics and figuring out how the video logic actually worked. Only once did I feel that powerful again in my life, when I recreated that feeling in 1996 by designing, building and programming an embedded 68000 computer for a piece of medical diagnostic equipment. Designed the system, built the prototype, debugged the hardware, routed the PCB (by hand, we had no money), wrote a round-robin multitasking system in assembler and finally the actual application as well. Still have a copy. That, and the discovery of sex :)

And it's about time, too! (0)

Anonymous Coward | more than 4 years ago | (#31468738)

Now I can get rid of that old VIC-20!

The Deep Magic is still there... (2, Insightful)

mindstrm (20013) | more than 4 years ago | (#31468804)

The deep knowledge is still there - the well is just a LOT deeper, and more complex.

In the days of the C64 - it was reasonable for a skilled and/or curious programmer to get to the bottom of things and learn how everything worked, exactly. It was also potentially USEFUL for him to do this.... it was the only direction you could go, short of inventing a new language.

So - today we still find deep knowledge out there - but it just may not be as useful for even a very good programmer to go ALL the way down.

Yes, a Java programmer should know more than just the surface, and more than just the patterns. He could also go deeper and understand the JVM implementation, and to a degree how it uses actual machine resources - but to suggest he needs to go all the way down the rabbit hole is taking it a bit far.

My point, I guess, is that there is no need to pine for the old days - nobody says you can't learn more and go deep, and those who do, tend to prosper.
