
'Retro Programming' Teaches Using 1980s Machines

samzenpus posted more than 4 years ago | from the old-timey-processing dept.

Education

Death Metal Maniac writes "A few lucky British students are taking a computing class at the National Museum of Computing (TNMOC) at Bletchley Park using 30-year-old or older machines. From the article: '"The computing A-level is about how computers work and if you ask anyone how it works they will not be able to tell you," said Doug Abrams, an ICT teacher from Ousedale School in Newport Pagnell, who was one of the first to use the machines in lessons. For Mr Abrams the old machines have two cardinal virtues; their sluggishness and the direct connection they have with the user. "Modern computers go too fast," said Mr Abrams. "You can see the instructions happening for real with these machines. They need to have that understanding for the A-level."'"



They'll just use them to play Elite all day (4, Funny)

elrous0 (869638) | more than 4 years ago | (#33372438)

That could teach them a thing or two about commerce and trade, I suppose.

Re:They'll just use them to play Elite all day (1)

object404 (1883774) | more than 4 years ago | (#33372782)

Testing for slower systems and lower-end CPUs has become a big problem for us right now, especially since access to older "obsolete" machines is very difficult these days.

How do you guys do this? I mean, with the only off-the-shelf PCs available running over 1GHz, how do you test for a 200-500MHz platform? Personally, I used nested VMs, Russian-dolls style -- a matrix within a matrix within a matrix, for you geeks who don't know what a VM is. I was running Puppy Linux and DamnSmallLinux inside Ubuntu inside WinXP.

VirtualBox [virtualbox.org] is very idiot-friendly compared to VMware, and rockses sockses :)

Re:They'll just use them to play Elite all day (2, Insightful)

Nikker (749551) | more than 4 years ago | (#33372982)

Why not underclock the CPU?

Re:They'll just use them to play Elite all day (1)

Pentium100 (1240090) | more than 4 years ago | (#33373022)

How about making the program run fast on the 1GHz CPU? Then it may run OK on a slower processor; if you only make it just bearable at 1GHz, it will be too slow on slower machines.

Also, new CPUs can reduce their clock speed to about half of maximum for power saving, so a new 1GHz CPU can effectively become a 500MHz one.

Re:They'll just use them to play Elite all day (1)

hairyfeet (841228) | more than 4 years ago | (#33373052)

Uhhhh... you DO know there are places like Surplus Computers where you can pick up older machines, like this 600 MHz Tegra laptop [surpluscomputers.com], all day long, yes? It isn't like finding older machines is hard. If you were talking about old SPARC or maybe MIPS desktops I'd agree with you, but x86 is as common as dirt and just as easy to find.

Re:They'll just use them to play Elite all day (1)

Red Flayer (890720) | more than 4 years ago | (#33373064)

Old working PCs are free (with pickup) on Freecycle or Craigslist, if you don't want to use a VM, an emulator, or underclocking. Or you could visit one of the specialized sites that deal in old computers. I wouldn't want to waste the space on them, but I'm sure you could easily collect enough machines for your testing purposes.

Mid 1980's for me (2, Insightful)

Monkeedude1212 (1560403) | more than 4 years ago | (#33372440)

We had to remote into an old Unix System V box and do a few exercises for our course. No, it's not as far back as these students are going, but it was helpful to become familiar with that kind of architecture, because you never know what's still going to be kicking around when you get on the job.

Re:Mid 1980's for me (1)

Ironhandx (1762146) | more than 4 years ago | (#33372572)

You really don't. You could end up with one of the two ARCnet plus deployments that were done.

Does that make sense ? (1)

razwiss (1823216) | more than 4 years ago | (#33372450)

I mean, seriously? I'm pretty sure you don't need to screw your head on a crippling dinosaur to understand low level programming.

Re:Does that make sense ? (5, Insightful)

Sarten-X (1102295) | more than 4 years ago | (#33372490)

Yes, it makes sense. The students get an intimate feel for writing programs without being able to waste resources rampantly.

Re:Does that make sense ? (2, Insightful)

razwiss (1823216) | more than 4 years ago | (#33372600)

How will the student then apply his knowledge to modern languages such as Java or C#? He'll have to optimize his code by running a bunch of tests, just as he would have done without that class. With flags and the time (in ms) required by each of the different methods, he will understand, for example, that quicksort is faster than bubble sort. And so it goes.

Re:Does that make sense ? (4, Funny)

jeffmeden (135043) | more than 4 years ago | (#33372686)

How will the student then apply his knowledge to modern languages such as Java or C#?

It's really pretty simple. After seeing what a computer can do with code intimately optimized for the machine it's running on, they will be exposed to the status quo in Java or C# and their heads will explode. Problem solved on our end!

Re:Does that make sense ? (2, Funny)

Sponge Bath (413667) | more than 4 years ago | (#33372790)

...exposed to the status quo in Java or C# and their heads will explode.

Or at least cause their head veins to pulse as shown in this computer lab photo [treksf.com] .

Re:Does that make sense ? (3, Interesting)

Moridineas (213502) | more than 4 years ago | (#33372894)

I was in AP computer science over a decade ago. We used C++ with the "apstring" and "apvector" classes, which were similar to the STL.

We of course had to implement bubble sort, quicksort, insertion sort, and so on.

It was fairly slow on our computers (386s/486s/maybe one Pentium!) and you could REALLY see a visible difference between the different sorts. It was very obvious.

I rewrote the sorts using standard C arrays instead of apvector. Even on those ancient computers, the differences suddenly almost disappeared. Bubble sort on straight arrays was faster than apvector quicksort, at least for fairly small arrays. I don't remember the specifics anymore, but IIRC you had to be sorting several thousand items before there was much of a recognizable difference.
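The gap between sorts is easy to reproduce today. A minimal Python sketch (sizes and data are illustrative, not the original class exercise):

```python
import random
import time

def bubble_sort(a):
    """Classic O(n^2) bubble sort, operating on a copy of the input."""
    a = list(a)
    n = len(a)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:      # already sorted: stop early
            break
    return a

data = [random.randint(0, 10**6) for _ in range(2000)]

t0 = time.perf_counter()
slow = bubble_sort(data)
t1 = time.perf_counter()
fast = sorted(data)          # built-in O(n log n) sort
t2 = time.perf_counter()

print(f"bubble sort: {t1 - t0:.4f}s  built-in sort: {t2 - t1:.4f}s")
assert slow == fast
```

Even on modern hardware the quadratic sort is visibly slower once the list grows to a few thousand elements, which matches the experience above.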

So yeah, that made a big impression on me. Then again that class, and intro classes in college were the last time I've had to write my own sorting algorithm...

I think it's a good thing that people who have maybe only used 2GHz+ computers are given a chance to experience something else. A better question would be: why is expanding your horizons ever a bad thing?

Re:Does that make sense ? (1)

XnavxeMiyyep (782119) | more than 4 years ago | (#33373044)

I always thought it was weird that they made us use APVector and APString instead of regular Vectors and Strings. Was it just so College Board could feel better about themselves?

Re:Does that make sense ? (3, Insightful)

lgw (121541) | more than 4 years ago | (#33372938)

How will the student then apply his knowledge to modern languages such as Java or C#?

Do you believe that a school should teach Java, or teach programming?

BTW, C++ and kernel-mode C programming jobs aren't going away, and tend to pay better than Java jobs as the talent pool grows smaller. Especially for kernel-mode programming: very few schools are turning out bright young talent with any relevant skills in that area, so the labor pool is aging out but the demand isn't shrinking.

Re:Does that make sense ? (5, Interesting)

localman57 (1340533) | more than 4 years ago | (#33372660)

If you want to get an intimate feel for writing programs without being able to waste resources, try embedded systems programming. The Microchip 10F series has only a few dozen bytes of RAM, a couple hundred words of flash, and no hardware multiply. Making it do useful things is an art. Oh, and unlike with some relic from the '70s, you can actually get a job programming tiny microcontrollers.

That said, it does seem like a cool class. One I'd like to take, but for personal interest, not professional development.

Re:Does that make sense ? (3, Informative)

xaxa (988988) | more than 4 years ago | (#33373034)

They're A-level students, i.e. the final two years of school, ages 16-17 and 17-18. It's probably more interesting than making some crappy VB application, which is what I remember the A-level computing students doing (I didn't do the subject, I did extra maths instead -- it was much more useful for finding a place on a good CS course at university).

Re:Does that make sense ? (2, Insightful)

loshwomp (468955) | more than 4 years ago | (#33373036)

If you want to get an intimate feel for writing programs without being able to waste resources, try embedded systems programming. The Microchip 10F series has only a few dozen bytes of RAM, a couple hundred words of flash, and no hardware multiply. Making it do useful things is an art. Oh, and unlike with some relic from the '70s, you can actually get a job programming tiny microcontrollers.

Agreed on all of the above, but the experience of working on the relics will translate to modern embedded systems well enough that I think there is value. In many cases the relics will be even slower and more RAM- and ROM-constrained than all but the tiniest of today's embedded microcontrollers.

Re:Does that make sense ? (1)

16K Ram Pack (690082) | more than 4 years ago | (#33372702)

But writing on a BBC Micro wastes resources too. They just happen to be human time rather than processor time (and human time is now a lot more expensive).

Re:Does that make sense ? (1)

Lobachevsky (465666) | more than 4 years ago | (#33372862)

The resources of students have negative cost. That is, they pay _you_ to use their resources (their brains).

Au contraire (1)

starglider29a (719559) | more than 4 years ago | (#33372766)

I have programmed on TRS-80s and 8088 w/8087s. Compiled C and Read & Go BASIC.

But now I'm programming Python on an 8-core Xeon. When I'm writing a stored procedure or a nested loop over two recordsets, I ***STILL*** catch myself thinking about how long those instructions would take on a slower machine. "Do you know how LONG that looping will take?... oh. 0.000006 seconds. heh heh." I catch myself "subvocalizing" the loops, and I shy away from something "so resource intensive" and look for another, more efficient solution.

Yes, it's great to learn how a computer does what it does, but if you miss the simple solution because your mind is "read and go"-ing, then you hobble yourself.

Re:Does that make sense ? (1)

capo_dei_capi (1794030) | more than 4 years ago | (#33372858)

And couldn't the same thing have been achieved at home with an emulator?
Besides, these are high school students, not university students taking classes in embedded systems programming.

Re:Does that make sense ? (5, Insightful)

Darkness404 (1287218) | more than 4 years ago | (#33372576)

Yes, it makes perfect sense for two reasons.

A) It teaches people how to use unfamiliar hardware/software. Chances are the thing you are going to be running at your job is not going to be the thing you studied in university for.

B) It teaches kids how to not make mistakes in coding. Make a big enough mistake and the entire system goes down. Compilers are also a lot less fault tolerant.

C) It teaches kids how computers actually work by peeling back layers of abstraction. Think about it: has the average person under 20 ever used a CLI? For anything? I think the closest people come these days to actually using a CLI is typing something into the Windows "Run" dialog.

D) It puts things in perspective. It shows that you don't need a Core i7 to play games, that a graphics card with 100 times the memory of the entire computer isn't required to make art, etc.

E) It's fun. The old computers had a lot more built-in easter eggs and tiny quirks. These days you get a Dell/HP/Gateway/Acer/Asus/etc with Windows/Linux/OS X slapped on it, and it's the same as any other Windows/Linux/OS X box. The old computers all had little differences; some were frustrating, of course, but when you don't have to do any too-serious work on it, it can be kinda fun digging out the old Commodore 64.

Re:Does that make sense ? (1)

Darkness404 (1287218) | more than 4 years ago | (#33372616)

Ignore the 2 reasons part, I don't know what I was thinking when I typed that part up...

Re:Does that make sense ? (1)

blair1q (305137) | more than 4 years ago | (#33372738)

You were thinking of putting it in a loop.

Re:Does that make sense ? (4, Funny)

oodaloop (1229816) | more than 4 years ago | (#33372750)

Amongst our weaponry are such things as...I'll start over.

Re:Does that make sense ? (2, Funny)

Anonymous Coward | more than 4 years ago | (#33372812)

I thought we were supposed to pick the two we liked and ignore the others...

Re:Does that make sense ? (1)

LanMan04 (790429) | more than 4 years ago | (#33372802)

Think about it, has the average person under 20 ever used a CLI? For anything?

The average person under 100 hasn't ever used a CLI. But I think it's just as important to programmers now as it was 25 years ago.

Re:Does that make sense ? (1)

gander666 (723553) | more than 4 years ago | (#33372816)

I love this idea. I learned 6502 assembly way back, and spent time in grad school hand-optimizing FORTRAN subroutines to extract optimal performance from a well-aged (even in 1985) Cyber 730 system.

Most programmers I run into today have never once touched assembly, and many of them think that ANSI C is the same as assembly. While it has been a LONG time since I have coded anything, I have shown a couple of coders some old-school optimization techniques that blew their minds.

Of course I know that modern optimizing compilers can do about 90% of what we used to do by hand, but to extract the last 10% you really need to know the architecture, the opcodes, and even the errata to get the best results.

Re:Does that make sense ? (1)

clone53421 (1310749) | more than 4 years ago | (#33372856)

I think the closest people come these days to actually using a CLI is typing in something on the Windows "Run" dialog.

Most people don't even use that. When the Run dialog opens, it contains the last thing that was executed from it, and on a non-technical user's computer this always seems to be whatever I executed from it the last time I used their computer, however long ago that was.

If you ask me, the closest people come to a CLI these days is the location bar in Internet Explorer... maybe even the Google search bar.

Re:Does that make sense ? (1)

retchdog (1319261) | more than 4 years ago | (#33372896)

Another way to reach this level of demand is to have them work on extremely large datasets using mathematically sophisticated models. Even on a modern computer, building a sparse matrix-represented tensor mapping R^{60 000} to R^{7500^2} will take a bit of planning.
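A toy illustration of why the representation matters (the dimensions here are invented round numbers, not from any real dataset): a dense matrix of float64s at this scale would not fit in RAM, while a dict keyed by (row, col) stores only the nonzero entries.

```python
# Dense vs. sparse storage for a mostly-zero matrix.
# Dimensions are illustrative only.
ROWS, COLS = 60_000, 7_500

dense_bytes = ROWS * COLS * 8              # one float64 per cell
print(f"dense storage: {dense_bytes / 1e9:.1f} GB")

# Sparse alternative: keep only the nonzeros, keyed by (row, col).
sparse = {(0, 42): 1.5, (59_999, 7_000): -2.0}

def get(m, i, j):
    """Read a cell; absent keys are implicit zeros."""
    return m.get((i, j), 0.0)

assert get(sparse, 0, 42) == 1.5
assert get(sparse, 1, 1) == 0.0            # never stored, so zero
```

Two stored values instead of 450 million cells: the planning the parent mentions is mostly about choosing representations like this before touching the data.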

This will train (mostly) the same skill set and also prepare them for real work. Unfortunately, most CS teachers don't know jack about numerical optimization/statistics/data mining.

Instead of old machinery, they could just code for the MMIX machine, right?

I don't have particularly anything against this retro programming plan, and I think it'd be a fun and useful experience. There are however many other ways to get people to think.

Re:Does that make sense ? (0)

Anonymous Coward | more than 4 years ago | (#33372604)

Exactly. That's why students in automotive engineering always start off with a garage full of Bugatti Veyrons and such.

Re:Does that make sense ? (4, Interesting)

tomhudson (43916) | more than 4 years ago | (#33372640)

It's not about "understanding low-level programming" - it's about having a direct connection between what you do and what happens. No virtual machine, no garbage collector, no super-fast compile/link/run/modify cycle (so you're going to take a few minutes to THINK about why something didn't work instead of taking the "quick fix, let's test it and see if we got it right this time" route).

screw your head on a crippling dinosaur

The article never said they were using Windows.

Better teach them C (3, Insightful)

mangu (126918) | more than 4 years ago | (#33372710)

you don't need to screw your head on a crippling dinosaur to understand low level programming

Absolutely. Better teach them C so they will know how data structures and memory management work.

Languages that try to do everything may help you write code faster but can be treacherous.

Let's see a simple example. In Python there is a subtle matter of memory management that can be dangerous to the untrained programmer. When you copy a list like this: a = b, you are only creating another pointer to the same list; when you copy like this: a = b[:], you are allocating memory for a new list and copying the contents.
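The aliasing is easy to demonstrate:

```python
b = [1, 2, 3]

a = b          # no copy: 'a' is another name for the same list object
a[0] = 99
assert b[0] == 99      # the mutation is visible through both names
assert a is b

b = [1, 2, 3]
a = b[:]       # shallow copy: a new list object with the same elements
a[0] = 99
assert b[0] == 1       # the original list is untouched
assert a is not b
```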

When you know C, the difference between the two copy instructions above is obvious, but if you don't know what memory management is, it can be very difficult to understand. I bet many bugs in Java, Python, and other modern languages come from this inability to understand how the language works under the hood.

Working on old computers can be fun for some people, but to train programmers nothing beats learning C. C is close enough to the hardware to let one understand the details of how software runs, yet abstract enough to represent any typical von Neumann computer.

Re:Better teach them C (1)

Lobachevsky (465666) | more than 4 years ago | (#33372956)

Exactly. Higher-level languages are _for_ people who already understand lower-level languages, just as calculators are _for_ people who already understand arithmetic. Schools don't give calculators to kindergarteners or any child who hasn't yet understood arithmetic: first understand arithmetic, and demonstrate that you do by working _without_ a calculator, _then_ be allowed a calculator. Giving people a high-level language on super-fast machines with "retina" pixel density is ludicrous when they don't understand the basics. Heck, I'd suggest children be taught on Turing machines and made to step through their own code _in their heads_. I know way too many programmers who can't play out their own code in their head. Alan Turing did not have access to computers; he developed and tested algorithms _in his head_.

Re:Does that make sense ? (4, Insightful)

lgw (121541) | more than 4 years ago | (#33372742)

Yes, it makes great sense. When getting started, it really helps if you're forced to deal with the low level, and even more if you can actually see the low level.

I've spent a large part of my career writing software related to tape drives. It really helped me getting started that I could sit down in front of an old IBM 9-track reel-to-reel to test my code. Not the most useful thing for production data storage, but terrific for seeing problems with production code. Miss the end-of-tape marker? Flap-flap-flap-flap, doh!

Similarly, writing and debugging production assembly code made me very comfortable with debugging and crash analysis in higher-level languages, even when I didn't quite have matching source. That experience in turn lets me understand "what really happens" with a language like C# or Java, and explain to people why, for example, the .NET file rename function is no substitute for the Win32 file rename system call, despite the fact that "they both just rename a file". Stuff that should be obvious to even a junior programmer but, well, isn't.

makes sense (5, Insightful)

grub (11606) | more than 4 years ago | (#33372454)


Makes a lot more sense than starting them off in some poo like Java where they never need to know about the real hardware.

Re:makes sense (0)

Anonymous Coward | more than 4 years ago | (#33372504)

There is no need to know anything about the hardware.

Modern programming is about algorithms and interfaces. Knowing how to simulate 4GB memory space with only 8 bit registers is not important.

Re:makes sense (1)

psbrogna (611644) | more than 4 years ago | (#33372688)

I disagree. Being exposed to how early x86 (or other platform) programmers worked around the limits of the time teaches new programmers how to innovate and think outside the box, in addition to the very valuable insights on performance tuning and optimization on physical hardware mentioned by many others above.

Re:makes sense (2, Insightful)

Red Flayer (890720) | more than 4 years ago | (#33372714)

There is no need to know anything about the hardware.

Modern programming is about algorithms and interfaces. Knowing how to simulate 4GB memory space with only 8 bit registers is not important.

B-b-but the article isn't about modern programming. It's about the A-level program, which is about how computers work, as per TFS/A. Some people need to actually work on hardware in order for your modern programmers to implement their algorithms using an interface.

Re:makes sense (1, Insightful)

Anonymous Coward | more than 4 years ago | (#33372886)

It'll do you whipper snappers some good to learn about hardware.

Re:makes sense (0)

Anonymous Coward | more than 4 years ago | (#33373048)

And how are you supposed to implement a good interface or algorithm unless you understand what the compiler has to do to get your source code to run?

There is a reason why a modern text editor runs sluggishly on my 2.66GHz quad-core with 6GB of RAM while a similar program feels snappy and instant on my old 50MHz system with 64MB of RAM.
My bet is that it has very much to do with "modern programming" and less with the hardware.

You really need to learn how a processor works before you can understand the compiler, and you need to understand the compiler before you can choose one algorithm or interface over another. (And there are plenty of programmers with a good understanding of both hardware and high-level languages.)
You can refuse to learn about it if you want to; I don't really care. I will just use programs written by those who were willing to learn that little extra instead.

Re:makes sense (2, Insightful)

xtracto (837672) | more than 4 years ago | (#33372900)

Agree completely.

I like and use Java every day, but I would not suggest learning computer science with it.

I learned data structures and algorithms with plain C and pointers which IMHO is the proper way to do it.

Just recently I was reading a book about data structures and algorithms in Java, and it was very funny to see the hoops people have to jump through to create a simple linked list or stack... because Java is *not* made for that...

IMO: Great (4, Insightful)

hsmith (818216) | more than 4 years ago | (#33372472)

I feel we programmers have gotten too far away from the lower-level aspects of the craft and are now too focused on the high level. While this isn't a bad thing in itself (why should you rewrite a framework every time you start a new application?), it really perverts one's respect for how things work and for efficiency.

I am getting back into assembly programming after 8 years of C#, and it is a bit of a shift in thought. My college switched from C/C++ to Java for incoming freshmen my senior year - a real shame. Programming is totally different when you have no respect for memory management.

Re:IMO: Great (0)

Anonymous Coward | more than 4 years ago | (#33372892)

My university did the same thing, though right before my freshman year (luckily I learned C/C++ in high school). Two years later they switched the first year back to C, then let students get into Java after that.

'Modern computers go too fast,' (3, Interesting)

John Hasler (414242) | more than 4 years ago | (#33372476)

Uhm, we were saying that in the 1980s. At Bletchley Park they should be teaching with machines that actually are old.

Re: 'Modern computers go too fast,' (1)

rubycodez (864176) | more than 4 years ago | (#33373076)

Indeed, some old-school vacuum-tube computers clocked at 1kHz would serve the purpose so much better.

Just dump Windows and goto DOS (4, Interesting)

commodore64_love (1445365) | more than 4 years ago | (#33372482)

You would get exactly the same "feel" as you get with an old C=64 or Atari or Amiga machine. If your goal is to get down to the bare metal, then go ahead and do so. There's no need to dust-off old machines that are on the verge of death (from age).

Re:Just dump Windows and goto DOS (4, Insightful)

cptdondo (59460) | more than 4 years ago | (#33372770)

But DOS doesn't have all the neat tools.

I learned on a PDP-8; 16K of hand-strung RAM and a CPU slow enough that you could put an AM radio next to it and hear it compute.

This thing came with all sorts of neat tools, including assembler (of course) and a FORTRAN compiler.

You learned to write good, tight code and to really, really think about your data structures.

Sure, programming today is much, much easier and we can do lots more. I cringe when I look at FORTRAN IV code these days; it's painful. But it did teach me a lot.

Re:Just dump Windows and goto DOS (3, Funny)

Rand Race (110288) | more than 4 years ago | (#33372780)

Or you could just fire up a terminal ... Oh, Windows. Never mind.

I just dusted off a couple of C-64s, an Amiga 500, a Sun 3/50 and an Apple IIc the other day. They were filthy. And on top of the box of cables I needed to get at.

Re:Just dump Windows and goto DOS (2, Interesting)

Nethead (1563) | more than 4 years ago | (#33372866)

You're an old hacker and may relate to this. I found, free on Craigslist, a hand-built (hand-wrapped) Z80 CP/M box with dual 8" drives and a case of diskettes. No instructions or schematics. This winter I'm going to dig into it with my scope and logic probe and see if I can get the old baby working again. I was amazed that just by looking at which components were on the boards, I could still get a fairly clear idea of how it was put together. I figure the hard part will be following the address lines and seeing where the memory and I/O are. I did see a few 74LS138s on board, so I guess that's where I'll start. Now to pull the old DOS box with the ISA EPROM burner out of the shed and see what's there. From what I understand, the boot ROM was home-brewed. This should be very interesting.

Re:Just dump Windows and goto DOS (1)

Red Flayer (890720) | more than 4 years ago | (#33372964)

These machines aren't that much older than a C64 anyway. The BBC Micro ran on the same 6502 as the Commodore PET, just one generation older than the C64. As I'm sure you know, the 6510 the C64 ran on is a 6502 variant...

For that matter, Atari 2600s also ran on a 6502, just with fewer pins. There were more 2600s sold than C64s... I'd be willing to bet the supply of still-working 2600s is bigger than the supply of working C64s.

I keep saying that (3, Interesting)

Anonymous Coward | more than 4 years ago | (#33372484)

... if you want to know how computers work, learn microcontrollers with the Atmel 8 bit family of controllers (ATMEGA8, for example). These things are wonderfully documented, there is a free C/ASM development environment with emulator (single-step, breakpoints, etc.). The real deal is just a few dollars for a development board (or get an Arduino, same thing). You don't get the absolutely down to the transistor insight, but that's really just a few experiments with TTL gate chips and LEDs away.

"Actors" (5, Interesting)

Applekid (993327) | more than 4 years ago | (#33372488)

They each took on the role of a different part of the machine - CPU, accumulator, RAM and program counter - and simulated the passage of instructions through the hardware.

The five shuffled data around, wrote it to memory, carried out computations and inserted them into the right places in the store.

It was a noisy, confusing and funny simulation and, once everyone knew what they were doing, managed to reach a maximum clock speed of about one instruction per minute.

I wish I had a teacher like this while in [US public] school.
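The role-play maps almost directly onto code. A sketch of such a toy accumulator machine (the instruction set here is invented for illustration, not taken from the lesson): one accumulator, a flat memory, and a program counter stepping through instructions, exactly the parts the students played.

```python
def run(program, memory, trace=False):
    """Fetch-decode-execute loop for a toy accumulator machine."""
    acc, pc = 0, 0
    while pc < len(program):
        op, arg = program[pc]              # fetch
        if trace:                          # the "noisy" classroom version
            print(f"pc={pc} acc={acc} {op} {arg}")
        pc += 1                            # advance the program counter
        if op == "LOAD":                   # memory -> accumulator
            acc = memory[arg]
        elif op == "ADD":                  # accumulator += memory cell
            acc += memory[arg]
        elif op == "STORE":                # accumulator -> memory
            memory[arg] = acc
        elif op == "HALT":
            break
    return memory

# Program: memory[2] = memory[0] + memory[1]
mem = {0: 2, 1: 3, 2: 0}
run([("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)], mem)
assert mem[2] == 5
```

With trace=True it prints one line per instruction, which is essentially what the five students were acting out at one instruction per minute.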

Re:"Actors" (0)

Anonymous Coward | more than 4 years ago | (#33372876)

They each took on the role of a different part of the machine - CPU, accumulator, RAM and program counter - and simulated the passage of instructions through the hardware.

The five shuffled data around, wrote it to memory, carried out computations and inserted them into the right places in the store.

It was a noisy, confusing and funny simulation and, once everyone knew what they were doing, managed to reach a maximum clock speed of about one instruction per minute.

I wish I had a teacher like this while in [US public] school.

Even 8086 machines run at about 4MHz. That means a lot more than 1 instruction per minute.

If you want to simulate 1 instruction per minute, just run a debugger. You can step one assembly instruction at a time!
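A software analogue of the same exercise (bytecode rather than machine instructions): Python's dis module lists every interpreter instruction a function will execute, one per line, much like single-stepping in a debugger.

```python
import dis

def add(a, b):
    return a + b

# Print each bytecode instruction the interpreter steps through.
dis.dis(add)

# The same data is available programmatically.
ops = [ins.opname for ins in dis.get_instructions(add)]
print(ops)
# The add shows up as BINARY_ADD (older Pythons) or BINARY_OP (3.11+).
assert "BINARY_ADD" in ops or "BINARY_OP" in ops
```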

Re:"Actors" (1)

sconeu (64226) | more than 4 years ago | (#33373086)

I did this several years ago for my daughter's 5th grade class. I wanted to teach them "how a computer counts to 5".

Sounds like fun (2, Funny)

elmodog (1064698) | more than 4 years ago | (#33372494)

Next up: Driver's Education on the Model T [wikipedia.org]. ;)

Re:Sounds like fun (2, Insightful)

Locke2005 (849178) | more than 4 years ago | (#33372540)

What would be easier to learn basic mechanics from working on: a '57 Chevy or a 2011 Lexus?

Re:Sounds like fun (4, Insightful)

jeffmeden (135043) | more than 4 years ago | (#33372796)

New cars are wonderfully simple under the hood once you strip away all the plastic. Ever taken apart an old carburetor? Ever tried to get it back together in working order? Give me an FI computer, airflow sensor, and fuel injector any day. Not surprisingly, cars went from a maintenance interval of 1,000 miles with a life expectancy of 50,000 miles to a maintenance interval of 10,000 miles and a life expectancy of 250,000 miles by *avoiding* complexity.

Re:Sounds like fun (2, Informative)

localman57 (1340533) | more than 4 years ago | (#33372730)

There's a bit of truth to that Model T stuff. Everybody thinks they know how to drive a standard transmission until you throw 'em in an antique without synchros... I would guess you could say the same about automatic spark advance, but I've never personally experienced that...

Re:Sounds like fun (3, Informative)

Nethead (1563) | more than 4 years ago | (#33372966)

The model-T was even stranger: http://en.wikipedia.org/wiki/Model_T [wikipedia.org]

The Model T was a rear-wheel drive vehicle. Its transmission was a planetary gear type billed as "three speed". In today's terms it would be considered a two speed, because one of the three speeds was actually reverse.

The Model T's transmission was controlled with three foot pedals and a lever that was mounted to the road side of the driver's seat. The throttle was controlled with a lever on the steering wheel. The left pedal was used to engage the gear. With the handbrake in either the mid position or fully forward and the pedal pressed and held forward the car entered low gear. When held in an intermediate position the car was in neutral, a state that could also be achieved by pulling the floor-mounted lever to an upright position. If the lever was pushed forward and the driver took his foot off the left pedal, the Model T entered high gear, but only when the handbrake lever was fully forward. The car could thus cruise without the driver having to press any of the pedals. There was no separate clutch pedal.

The middle pedal was used to engage reverse gear, and the right pedal operated the engine brake. The floor lever also controlled the parking brake, which was activated by pulling the lever all the way back. This doubled as an emergency brake.

Dear Perry: (-1, Troll)

Anonymous Coward | more than 4 years ago | (#33372498)

"Visual Basic suggests words while a coder types, highlights syntax errors and makes bug hunts easier by jumping straight to the problematic code - even when the error is one of logic rather than letters."

For all the Visual Basic fanz: In case you've been living in President-VICE Cheney's spider-hole,
Visual Basic is DEAD because Microsoft Is Dead [paulgraham.com] .

I hope this dispels the idea of debugging in this infamous AND dead language.

Yours In Moscow,
K. Trout

History doesn't repeat itself (0)

VGPowerlord (621254) | more than 4 years ago | (#33372502)

In one of the first lessons held at TNMOC the lucky Ousedale students programmed a venerable PDP-8 machine by flicking the switches set on its front panel to set the binary values in its memory. And an interface does not get more direct than that.

What's next, punch cards?

Seriously, how is this useful in modern computing, other than as a "Back in my day..." quote?

If you want to actually teach them something, why not just teach them to program for an embedded system? Preferably, one that is still in use today.

Re:History doesn't repeat itself (1)

Carnildo (712617) | more than 4 years ago | (#33372614)

These systems have high-level I/O options (keyboard, monitor) that embedded systems don't, while avoiding all of the complexity (multi-level caching, speculative execution, out-of-order execution, etc.) of a modern desktop.

Re:History doesn't repeat itself (1)

John Hasler (414242) | more than 4 years ago | (#33372820)

> These systems have high-level I/O options (keyboard, monitor) that embedded systems don't...

Some embedded systems have serial I/O and support terminals.

Re:History doesn't repeat itself (5, Informative)

LWATCDR (28044) | more than 4 years ago | (#33372698)

Actually the BBC Micro isn't far from the perfect embedded-system trainer.
From Wikipedia:
"The machine included a number of extra I/O interfaces: serial and parallel printer ports; an 8-bit general purpose digital I/O port; a port offering four analogue inputs, a light pen input, and switch inputs; and an expansion connector (the "1 MHz bus") that enabled other hardware to be connected. Extra ROMs could be fitted (four on the PCB or sixteen with expansion hardware) and accessed via paged memory. An Econet network interface and a disk drive interface were available as options. All motherboards had space for the electronic components, but Econet was rarely fitted. Additionally, an Acorn proprietary interface called the "Tube" allowed a second processor to be added. Three models of second processor were offered by Acorn, based on the 6502, Z80 and 32016 CPUs. The Tube was later used in third-party add-ons, including a Zilog Z80 board and hard disk drive from Torch that allowed the BBC machine to run CP/M programs."

Four ADCs, eight bits of GPIO, and switch inputs. All available from BASIC on a machine with a floppy, keyboard, and monitor. Sweet.
I so wanted one of these back in the day. Too expensive and not really available in the US at the time.

Re:History doesn't repeat itself (0)

Anonymous Coward | more than 4 years ago | (#33372724)

In one of the first lessons held at TNMOC the lucky Ousedale students programmed a venerable PDP-8 machine by flicking the switches set on its front panel to set the binary values in its memory. And an interface does not get more direct than that.

What's next, punch cards?

Seriously, how is this useful in modern computing, other than as a "Back in my day..." quote?

If you want to actually teach them something, why not just teach them to program for an embedded system? Preferably, one that is still in use today.

I was going to tell you embedded systems is a completely different subject from programming on the bare metal, but apparently they're using BASIC so, yeah, expect punch cards by the end of the month.
Still, I'd take the class given the opportunity.

Re:History doesn't repeat itself (4, Insightful)

American AC in Paris (230456) | more than 4 years ago | (#33372786)

Seriously, how is this useful in modern computing, other than as a "Back in my day..." quote?

Learning how to use older/simpler machines is an excellent way to learn about a number of fundamental concepts. Modern computing, for all its advances, still operates off the same fundamental principles as it did fifty years ago; it's simply become orders of magnitude more complex.

Now, while it's perfectly possible to learn how to do this sort of thing using emulation or specialized training software, there's real value to having an appreciation of the history of the field you're planning to enter, and working with machines that were once considered state-of-the-art is a very effective way to gain a sense of just how insanely far computing has come. Note, too, that simply because you're never going to be called upon to program a PDP-8 in real life doesn't mean that you can't learn a fair amount of generally-applicable knowledge about hardware, logic, branching, execution, input, output, and instruction sets. In fact, by pulling yourself out of a familiar environment, you're forced to pay attention to important things that you'd otherwise happily ignore--like "well, how does what is in my head actually get into a computer's inner workings?"

Finally, always remember that programming is a subset of computer science. Even if all you ever expect to do is write code, a deeper knowledge of what goes on between the compiler and the electrons is going to be quite useful--and will make you a better coder, to boot.

Probably a good idea (4, Interesting)

Anonymous Coward | more than 4 years ago | (#33372534)

We ran some older machines in my first programming course. When you can see the direct results in speed (or lack of it), it can help teach better approaches. Writing a game and seeing the screen flicker when you ask the CPU to do too much is good motivation to find a more efficient approach. One of our instructors also did something like this with visual sorting procedures. If you can see the difference in speed between one sorting approach and another, it sinks in.

Re:Probably a good idea (4, Informative)

spiffmastercow (1001386) | more than 4 years ago | (#33372854)

That can be done much more effectively by concentrating purely on Big O rather than hardware tweaks. Just tell them to do problems from Project Euler [projecteuler.net] and they'll get a good appreciation for algorithm efficiency.
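The same lesson can be made visible on any machine by counting the work instead of watching the screen flicker. A quick sketch (this code is purely illustrative, not from Project Euler or the course in the article):

```python
import random

def bubble_sort_count(a):
    """Bubble sort in place; return the number of comparisons made."""
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return comparisons

def merge_sort_count(a):
    """Merge sort; return (sorted copy, number of comparisons made)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort_count(a[:mid])
    right, cr = merge_sort_count(a[mid:])
    merged, comparisons = [], cl + cr
    i = j = 0
    while i < len(left) and j < len(right):
        comparisons += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]
    return merged, comparisons

data = [random.randrange(1000) for _ in range(512)]
# O(n^2) vs O(n log n): roughly 130,000 comparisons against a few thousand
print("bubble:", bubble_sort_count(list(data)))
print("merge: ", merge_sort_count(data)[1])
```

Doubling the input size quadruples the bubble-sort count but only slightly more than doubles the merge-sort count, which is the whole Big-O point without any hardware at all.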

How does it work? (2)

PolygamousRanchKid (1290638) | more than 4 years ago | (#33372542)

The computing A-level is about how computers work and if you ask anyone how it works they will not be able to tell you

That's what most people would say when you asked them how something works. Computers, fermentation, a Wok . . . etc.

"Um . . . I dunno . . . "

Re:How does it work? (3, Funny)

tomhudson (43916) | more than 4 years ago | (#33372830)

The computing A-level is about how computers work and if you ask anyone how it works they will not be able to tell you

That's what most people would say when you asked them how something works. Computers, fermentation, a Wok . . . etc.

You don't understand - this is Bletchley Park, you know, the codebreakers during WW2. Old habits die hard. They *could* tell you, but then they'd have to kill you.

waste of time (1)

AffidavitDonda (1736752) | more than 4 years ago | (#33372548)

"Because there's no copy and paste, if you do something wrong it takes time to go back and fix it," said Joe Gritton. "You cannot take out sections and move them around."

So they have to worry about bad interfaces and editors. You can do simple assembler programming on modern machines as well. And understanding C should give you a good deal of knowledge about the basics of a computer...

Old trick... (2, Interesting)

ibsteve2u (1184603) | more than 4 years ago | (#33372556)

I wasn't alone in keeping '286, '386, and '486 boxes around until Pentiums became plentiful...and the same goes for dual cores etc...you write code that runs fast on the older generations, and you never hear user-land complaints about your stuff's performance on the new.

Of course, with the advent of .NET....well, now you're only as good as Microsoft is.

Knowability (5, Insightful)

tverbeek (457094) | more than 4 years ago | (#33372562)

One of the great things about the early micros (and probably the even-earlier minis) is that they were Knowable. With a little time, an intelligent person could become familiar with the workings of the entire architecture. I used to have a map of every memory location in the 64KB of ye olde C64 (most of it was user RAM of course) explaining what each byte was for. POKE a different value to a certain address, and the background color changes. PEEK at a certain address and it tells you the current hour. You could learn this... all of it. Obviously that's just not possible with modern computers (probably not even modern phones); no one person can grok the whole system.
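That knowability can be mocked up in a few lines. A toy simulation of a flat 64 KiB address space (53281 is the C64's background-colour register, but here it's just an ordinary array cell; the simulation is an illustration, not an emulator):

```python
# Toy model of a fully memory-mapped 8-bit machine: one flat 64 KiB array,
# where "hardware" is just code that reacts to writes at magic addresses.
RAM = bytearray(64 * 1024)
BACKGROUND = 53281  # C64 background-colour register, used purely for flavour

def poke(addr, value):
    """Write one byte, like BASIC's POKE; a byte only holds 0-255."""
    if not 0 <= value <= 255:
        raise ValueError("POKE takes a byte value (0-255)")
    RAM[addr] = value
    if addr == BACKGROUND:
        pass  # on the real machine the VIC-II chip would repaint the screen

def peek(addr):
    """Read one byte back, like BASIC's PEEK."""
    return RAM[addr]

poke(BACKGROUND, 6)  # 6 is the classic C64 blue
assert peek(BACKGROUND) == 6
```

The point of the exercise is that there is nothing else: every byte of the machine is one of those 65,536 cells, which is exactly what made a poster-sized memory map possible.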

Re:Knowability (0, Troll)

Dunbal (464142) | more than 4 years ago | (#33372672)

With a little time, an intelligent person could become familiar with the workings of the entire architecture. I used to have a map of every memory location in the 64KB

      Fortunately we've moved on from the days when a coder had to know the IRQs, DMA channels and interrupts and had to waste time playing around with peeks and pokes (especially when some of them required you to insert (random number of) NOP's in between because the hardware wasn't fast enough). Now we have these things called drivers and libraries that do all the basic work for us, and we can just program the high level stuff. How cool is it that I can write a multi-threaded graphical program in about 10 minutes? Try doing that in Turbo Pascal 2.0.

      Yeah it means I've had to adapt and learn new languages, new APIs and different ways of doing things. Instead of learning about ports I learn about object interfaces and methods. But you just have to keep up, like any other field. Certainly in my other hobby - medicine, if you snooze you lose.

Re:Knowability (1)

ITBurnout (1845712) | more than 4 years ago | (#33372954)

This is true. I do NOT miss wasting hours with EMM386.

Re:Knowability (0)

Anonymous Coward | more than 4 years ago | (#33372712)

I still have "Mapping the C64". :-) Fond memories.

Re:Knowability (2, Insightful)

ITBurnout (1845712) | more than 4 years ago | (#33372936)

Beagle Bros Apple II Peek & Poke Chart FTW! Good memories dinking around with that.

The professor does have a point. I remember writing one of my first BASIC programs. All it did was wait for a key to be pressed, then fill all character positions on the screen with that letter. I could watch it go line by line. Then I wrote the same program in 6502 assembler. The entire screen of characters changed instantly, as fast as I could type. A feeling of awe and sudden empowerment rushed through my veins. Fun stuff.

Ridiculous (2, Interesting)

16K Ram Pack (690082) | more than 4 years ago | (#33372598)

The five soon discovered that just because a program was simple did not mean the underlying code was straight-forward. To make matters more testing, the BBC Micro offers a very unforgiving programming environment.

My first piece of commercial programming was on a BBC Micro and having that environment didn't teach me anything, it just made programming more of a pain than being able to cut and paste, set debug breaks and so forth. And it doesn't teach any more than using C#/VB because it's a machine designed around using BASIC, which is itself an abstraction (and IIRC, you didn't have functions, so had to endure the horror of GOSUB/RETURN).

Re:Ridiculous (1)

Nighttime (231023) | more than 4 years ago | (#33372694)

(and IIRC, you didn't have functions, so had to endure the horror of GOSUB/RETURN).

BBC BASIC II had PROCedures and FunctioNs. It was one of the better 8-bit BASICs.

Re:Ridiculous (1)

16K Ram Pack (690082) | more than 4 years ago | (#33372748)

It was a long time ago ;)

Re:Ridiculous (0)

Anonymous Coward | more than 4 years ago | (#33372864)

You recall incorrectly; BBC Basic always had PROC and FN.

>10 A%=FNtest
>20 PRINT A%
>30 END
>40 DEFFNtest
>50 = RND(5)
>RUN
                  5
>RENUMBER 0,0

Silly

Understanding "computers" vs "programming" (1)

Xygon (578778) | more than 4 years ago | (#33372652)

In my college program on electronics design, we actually did a lot of system-level programming on 8085 machines. We went so far as to build DACs to make voltmeters out of computers. What it gave us was the ability to understand the basics of CPUs interacting with memory, signals, I/O, and assembly language. That said, with two years of that under my belt, I am no closer to understanding any real-world programming that can get me a job than had I not taken the class.

Yes, the basics of computers are much easier when you don't have massive clock frequencies adding insane complications to applications. No, programming has nothing to do with understanding what JMP does. Do I think it's valuable? Sure. Do I think it'll make any difference on whether you can call yourself a "programmer?" No way.

Re:Understanding "computers" vs "programming" (1)

dpilot (134227) | more than 4 years ago | (#33373006)

I did similar programming, connected with an EE class. We had our prototype board that plugged into the (dating myself) S100 "mainframe." Our lab projects were combinations of hardware and software. Few things teach you the value of time while computing better than writing a bit-banging UART. For your example, take your DAC and see what sort of AC frequency response you could get out of it. Then give the "A" in the class to the team with the best combination of transient response, frequency response, and overall accuracy.

My first job in the working world involved programming memory testers. At that point you drop down to microcode on an individual cycle basis, controlling timings within the cycle on pulse shaper cards. You had to know what each and every cycle was doing, and do them all correctly. Now THAT's low-level.
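For readers who haven't written one, the heart of a bit-banging UART is shifting a byte out one line level at a time with careful delays in between. A sketch of the framing alone (8N1, LSB first; the timing, which is the genuinely hard part on real hardware, is left out):

```python
def uart_frame(byte):
    """Line levels for one 8N1 frame: start bit (0), eight data bits
    least-significant first, stop bit (1)."""
    bits = [0]                                   # start bit pulls the line low
    bits += [(byte >> i) & 1 for i in range(8)]  # data bits, LSB first
    bits.append(1)                               # stop bit releases the line
    return bits

def uart_unframe(bits):
    """Recover the byte from a 10-bit 8N1 frame."""
    assert bits[0] == 0 and bits[9] == 1, "framing error"
    return sum(bit << i for i, bit in enumerate(bits[1:9]))

assert uart_unframe(uart_frame(ord("A"))) == ord("A")
```

On the tester, each of those ten levels has to be held for exactly one bit period, which is where the cycle counting the parent describes comes in.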

Edlin (3, Insightful)

stevegee58 (1179505) | more than 4 years ago | (#33372690)

Edlin should be a mandatory part of the course for the full immersive effect.

Or was that the 70's? Gosh I can't remember now cuz I'm so old.

Edlin (Grr....) (2, Insightful)

Primitive Pete (1703346) | more than 4 years ago | (#33372872)

I first used edlin on DOS 1.0 and kept using it until better alternatives (Norton's editor, anyone?) appeared. Edlin makes vi seem like a walk in the park. I've used edlin for assembly and Pascal programming, and I say "curse you!" to anyone who jokes about those dark days.

Re:Edlin (1)

John Hasler (414242) | more than 4 years ago | (#33372914)

> Or was that the 70's?

1980: it came with early versions of MS-DOS. I avoided ever learning it by using CP/M-86 and Unix.

Emulation? (1)

jsebrech (525647) | more than 4 years ago | (#33372696)

Wouldn't it be better to use the emulation route? For example, writing a program for the original Game Boy and running it through an emulator. I remember at university we learned assembly on an emulated MIPS. We could focus on the individual instructions, on hardware that was simple and clean, but it all ran on the Unix servers (X terminals).

crufty calculator? (3, Interesting)

chitselb (25940) | more than 4 years ago | (#33372762)

from the link: "using 30-year-old or older machines."
from the fine article: "First released in 1981; discontinued in 1994"

I recently (three weekends ago) fired up my Commodore PET 2001 [flickr.com] (a *genuine* pre-1980 computer) and have been writing a Forth [6502.org] for it. It's really a lot of fun, and I'm finding that 30 years experience in various high-level languages has improved my "6502 assembler golf" game a lot. It's very incomplete, but the inner interpreter mostly works. Feel free to throw down on it here [github.com]

Charlie
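For the curious, the "inner interpreter" is the small loop at the centre of any Forth that steps from one word to the next. A toy model (an illustration of the idea only, nothing to do with Charlie's actual 6502 code):

```python
def run(threaded_code):
    """Toy Forth inner interpreter: step through a list of primitives,
    each a function that manipulates the shared data stack."""
    stack, ip = [], 0
    while ip < len(threaded_code):
        word = threaded_code[ip]
        ip += 1          # advance the "instruction pointer" first, as Forths do
        word(stack)
    return stack

def lit(n):
    """Build a primitive that pushes a literal onto the stack."""
    return lambda stack: stack.append(n)

def add(stack):   # Forth's +
    b, a = stack.pop(), stack.pop()
    stack.append(a + b)

def dup(stack):   # Forth's DUP
    stack.append(stack[-1])

# The Forth phrase "3 4 + DUP" as threaded code:
assert run([lit(3), lit(4), add, dup]) == [7, 7]
```

On a real 6502 Forth the "words" are addresses and the loop is a handful of instructions, but the shape is the same.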

Been there, done that (4, Interesting)

BlindSpot (512363) | more than 4 years ago | (#33372804)

10 years ago when I went through University, the core of the mandatory Assembly programming course was taught on the PDP-11 architecture, then 30 and now 40 years old.

Granted it's not quite the same. We used emulators and not the real things. Also it was for different motivations. The prof felt it was simpler to teach the cleaner PDP-11 instruction set than the 80x86 or 680x0, although the course did eventually also extend to both. Also he happened to be an expert in systems like the PDP-11.

However the idea of using old systems as teaching aids is hardly new - or news IMO.

Beebs are good machines (2, Informative)

ed (79221) | more than 4 years ago | (#33372808)

I was using them at college when they were new.

My first job was writing software that controlled scientific instruments, and there was an awful lot of educational software written for them because they were designed to be used in schools. The BASIC was more structured, and it could use microcassettes or 5 1/4 inch floppies with its own DOS.

In short, if you are going to use a dinosaur, it is the best dinosaur to choose

For Sale (1, Funny)

Anonymous Coward | more than 4 years ago | (#33372978)

Timex Sinclair 1000 w/ 16KB Expansion Module. Manual included. Original boxes & styrofoam.

Code by Charles Petzold (1)

chocolatetrumpet (73058) | more than 4 years ago | (#33373060)

If this opportunity sounds interesting to you but you grew up with high level languages, definitely check out the book Code [charlespetzold.com] by Charles Petzold. I could really only keep up with the first 500 pages or so, but it is still incredibly insightful and interesting if you want to know how a computer really works. You could basically use it to learn how to build a computer from scratch and program it.

Um.. Microcontroller courses? (0)

Anonymous Coward | more than 4 years ago | (#33373062)

Am I missing something here? What's the difference between programming a 30-year-old PC and programming an 8-bit or 16-bit microcontroller? Build your own OS as a while(1) loop... Understand volatile variables and interrupts... It's much more applicable today too, since these systems sell in orders of magnitude larger volumes than modern Intel PCs... They literally exist in everything from your laptop to your wrist watch to your car or your microwave.

You can run one of these things at 1kHz or 100MHz.

Any embedded architecture course in a good ECE program will give you a healthy respect for memory utilization, interleaving processes on a processor, and deterministic processes to boot.

Cornell has a great one that I TA'd while I was a grad student
http://courses.cit.cornell.edu/ee476/

That page has tons of projects from previous students. All focused in the Atmel AVR 8-bit RISC architecture typically running at 16MHz max. One clock cycle per instruction for the most part. Projects range from video games to robots or RFID readers.

I'd say that that's a much better use of department resources. I came from embedded systems programming to PC based programming and I feel like I have an extra healthy respect for resource utilization, etc. When you can just insert assembly in line and take over absolutely everything the processor is doing, while handling peripheral interrupts, you can do some really awesome stuff.
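The "OS as a while(1) loop" above is the classic superloop. A minimal model (sketched in Python with an ordinary function standing in for the hardware timer ISR; on an AVR the flag would be a volatile variable set inside a real interrupt handler):

```python
tick_pending = False  # on a microcontroller: a volatile flag set by the ISR

def timer_isr():
    """Stand-in for a hardware timer interrupt: do almost nothing,
    just raise a flag for the main loop to notice."""
    global tick_pending
    tick_pending = True

def superloop(iterations):
    """The whole 'operating system': poll flags forever and service
    whatever is pending. iterations bounds the loop so it terminates."""
    global tick_pending
    ticks_serviced = 0
    for _ in range(iterations):
        if tick_pending:
            tick_pending = False   # acknowledge the interrupt
            ticks_serviced += 1    # real work happens here, not in the ISR
    return ticks_serviced

timer_isr()
assert superloop(1000) == 1  # one pending tick gets serviced exactly once
```

Keeping the ISR down to "set a flag" and doing the work in the loop is the standard discipline those courses drill in, since time spent inside an interrupt handler blocks everything else.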

Microcontrollers (1)

cpscotti (1032676) | more than 4 years ago | (#33373066)

I had the same kind of learning back in my day using PIC microcontrollers, where I needed to do VERY LOW LEVEL programming and I saw "instructions happening for real", flipping an LED's state or the like.
I value that experience a lot!

Brainf*ck (1)

Ukab the Great (87152) | more than 4 years ago | (#33373120)

Laugh if you will, but I find that the most effective demonstration of low-level "how computers really work" programming (short of flipping manual switches PDP style) is Brainfuck [wikipedia.org] .
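For anyone who hasn't seen it, the whole language fits in an interpreter of a few dozen lines. A minimal sketch (the `,` input command is omitted and the tape is the conventional 30,000 cells; details vary between descriptions):

```python
def brainfuck(program, tape_size=30000):
    """Minimal Brainfuck interpreter: returns output as a string.
    Supports > < + - . [ ]  (the , input command is omitted)."""
    tape = [0] * tape_size
    out, ptr, pc = [], 0, 0
    # Pre-match brackets so [ and ] can jump in O(1).
    jumps, stack = {}, []
    for i, ch in enumerate(program):
        if ch == "[":
            stack.append(i)
        elif ch == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(program):
        ch = program[pc]
        if ch == ">": ptr += 1
        elif ch == "<": ptr -= 1
        elif ch == "+": tape[ptr] = (tape[ptr] + 1) % 256
        elif ch == "-": tape[ptr] = (tape[ptr] - 1) % 256
        elif ch == ".": out.append(chr(tape[ptr]))
        elif ch == "[" and tape[ptr] == 0: pc = jumps[pc]
        elif ch == "]" and tape[ptr] != 0: pc = jumps[pc]
        pc += 1
    return "".join(out)

# "++++++++[>++++++++<-]>+." computes 8*8+1 = 65 and prints "A"
assert brainfuck("++++++++[>++++++++<-]>+.") == "A"
```

Everything — arithmetic, control flow, output — is built from single-character instructions that map almost one-to-one onto pointer increments and memory cell updates, which is why it works as a "how machines really think" demo.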
