
Developer Creates DIY 8-Bit CPU

Soulskill posted more than 5 years ago | from the now-that's-impressive dept.

Hardware Hacking

MaizeMan writes "Not for the easily distracted: a Belmont software developer's hand-built CPU was featured in Wired recently. Starting with a $50 wire wrap board, Steve Chamberlin built his CPU with 1253 pieces of wire, each wire wrapped by hand at both ends. Chamberlin salvaged parts from '70s and '80s era computers, and the final result is an 8-bit processor with keyboard input, a USB connection, and VGA graphical output. More details are available on the developer's blog."


I can do 2^1 better (5, Funny)

physicsphairy (720718) | more than 5 years ago | (#28148657)

I own a two-bit computer. My dad gave it to me. I know it is two bits because before he gave it to me he would often remark "I hate this ******* two bit computer."

(Yes, it is also reproductive.)

Re:I can do 2^1 better (0, Troll)

Anonymous Coward | more than 5 years ago | (#28148667)

"(Yes, it is also reproductive.)"

Which is more than you can say for the average /.er.

But does it run Vista? (2, Funny)

VampireByte (447578) | more than 5 years ago | (#28148669)

Sorry, I just had to ask.

Probably not. (1)

Philip_the_physicist (1536015) | more than 5 years ago | (#28148681)

Crysis, OTOH, is fine

Re:But does it run Vista? (2, Informative)

dunkelfalke (91624) | more than 5 years ago | (#28148685)

Not a single version of Windows was designed to run on 8-bit CPUs.

Re:But does it run Vista? (-1, Troll)

Anonymous Coward | more than 5 years ago | (#28148699)

That would explain why 3.1 was such a crock of shite. Or maybe you're a crock of shite, or just a stupid asshat kraut?

Re:But does it run Vista? (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#28148705)

Wooooosh

Re:But does it run Vista? (1)

NusEnFleur (460584) | more than 5 years ago | (#28148719)

None of them ran on 8-bit CPUs. As for the "designed" part, I am not so sure anymore.

Re:But does it run Vista? (5, Informative)

Vanders (110092) | more than 5 years ago | (#28148785)

According to the fine article it has a 24-bit address bus and an 8-bit data bus, but presents everything via a 16-bit ISA. It's a bit like an 8088.

Of course the ISA is probably nothing like x86, so it still wouldn't run [MS|PC|DR|Free]-DOS anyway. Apparently it does have a C compiler, so perhaps you could port Bochs or QEMU to it and then run DOS on that. Emulated. On a TTL CPU running at 2 MHz (almost 3 MHz slower than the original IBM PC's 4.77 MHz). Maybe not then.
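As a rough illustration of what an 8-bit data bus behind a wider programmer-visible ISA means in practice, here is a minimal C sketch (a hypothetical little-endian machine, not the actual BMOW design): every 16-bit access simply costs two bus cycles, much as it did on the 8088.

<ecode>
#include <stdint.h>

/* Hypothetical sketch, not the actual BMOW design: a CPU whose
   programmer-visible word is 16 bits but whose external data bus is
   8 bits wide needs two bus cycles per word, much like the 8088. */

static uint8_t memory[1 << 16];            /* toy byte-addressable memory */

static uint8_t bus_read(uint32_t addr)     /* one 8-bit bus cycle */
{
    return memory[addr & 0xFFFFu];
}

uint16_t read_word(uint32_t addr)          /* two bus cycles per 16-bit word */
{
    uint8_t lo = bus_read(addr);
    uint8_t hi = bus_read(addr + 1);
    return (uint16_t)(lo | ((uint16_t)hi << 8));   /* little-endian assembly */
}
</ecode>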

Re:But does it run Vista? (4, Insightful)

Krneki (1192201) | more than 5 years ago | (#28149177)

Good enough for Space Invaders.

Re:But does it run Vista? (1)

zombie_monkey (1036404) | more than 5 years ago | (#28149653)

I would love to run something like Contiki on it.

Re:But does it run Vista? (0)

Anonymous Coward | more than 5 years ago | (#28149241)

Not a single version of Windows was designed to run on 8-bit CPUs.

Because four instances of Windows can run in parallel on an 8-bit CPU.

Re:But does it run Vista? (3, Funny)

Zero__Kelvin (151819) | more than 5 years ago | (#28149537)

"not a single version of windows was designed to run on 8 bit cpus."

Which is only logical, given that no version of Windows was developed to run on 32 bit or 64 bit CPUs either. Which versions of Windows were designed to crash on various CPUs is another matter ;-)

(Lighten up over-sensitive Windows Weeinie's with mod points. It is a joke)

Re:But does it run Vista? (0)

Anonymous Coward | more than 5 years ago | (#28149655)

(Lighten up, over-sensitive Windows Weenies with mod points. It is a joke.)

Just not a very good one...

Re:But does it run Vista? (5, Insightful)

Anonymous Coward | more than 5 years ago | (#28148725)

Why does this question keep coming up? Of course it does. It's Turing-complete, so it's just a matter of writing the software. Not for impatient users, obviously.

Re:But does it run Vista? (-1, Offtopic)

lobiusmoop (305328) | more than 5 years ago | (#28148809)

Um, just to be pedantic, doesn't a Turing Machine have infinite storage capacity (i.e. infinitely long tape)? The question then really is whether or not this system can have extra storage connected (e.g. a USB hard drive plus the driver software fitted into the limited RAM) in order to do this.

Re:But does it run Vista? (3, Informative)

uglyduckling (103926) | more than 5 years ago | (#28149285)

He didn't say it's a Turing Machine, he said it's Turing-complete, which means that (in theory) it can run any conceivable program, with the obvious limits of RAM/disk.

LBA-complete (2, Informative)

tepples (727027) | more than 5 years ago | (#28149383)

He didn't say it's a Turing Machine, he said it's Turing-complete, which means that (in theory) it can run any conceivable program, with the obvious limits of RAM/disk.

The limits of memory make the proper term "LBA-complete" [wikipedia.org], where the required memory varies linearly with the size of the input, rather than "Turing-complete". But in practice, "Turing-complete" means LBA-complete.
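For the curious, a minimal formal statement of the distinction, assuming only the textbook definitions (nothing specific to this machine): a Turing machine's tape is unbounded, while a linear bounded automaton may use only space linear in the input length n.

<ecode>
% Hedged sketch, standard complexity-theory definitions:
\[
  \text{Turing machine: } \mathrm{space}(n)\ \text{unbounded},
  \qquad
  \text{LBA: } \mathrm{space}(n) \le c\,n + d \ \text{for some constants } c, d.
\]
</ecode>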

Re:LBA-complete (1)

TheLink (130905) | more than 5 years ago | (#28149577)

But what if it's networked? Then humans and other stuff can send it all sorts of stuff.

The same input today could produce a very different output tomorrow.

Re:LBA-complete (1)

tepples (727027) | more than 5 years ago | (#28150087)

But what if it's networked?

In computational complexity theory, a "program" is most often taken to mean a subroutine that doesn't perform input or output. If you have input or output, such as network packets, the program ends, and it starts again after receiving the next packet. Two programs communicating over a network link are coroutines, but I'm not aware of any generalizations of complexity theory to cover coroutines.

Re:But does it run Vista? (4, Interesting)

morgan_greywolf (835522) | more than 5 years ago | (#28148825)

Well, your response applies slightly more to Linux (one would just have to implement a Linux kernel on the 8-bit CPU, which isn't likely to happen anytime soon) and doesn't really apply to Vista at all. Microsoft would have to implement Vista, and unless there is sufficient market demand for this 8-bit CPU, they'll never do it, since the incentive for them to write an 8-bit Vista is approximately zero.

While it may be possible to write a Linux kernel for an 8-bit processor, this, too, is not likely, at least not a complete Linux kernel. Linux was pretty much designed and written from the ground up on a 32-bit processor with built-in low-level support for multitasking.

So, IOW, while you are theoretically correct, from a practical standpoint implementing Vista or Linux or any other modern OS, with the exception of FreeDOS, is virtually impossible. Hence, the GP's joke retains its original humor.

Re:But does it run Vista? (0)

Anonymous Coward | more than 5 years ago | (#28148839)

QEMU is open source software.

Re:But does it run Vista? (3, Insightful)

Anonymous Coward | more than 5 years ago | (#28148935)

Any Turing machine can emulate any other. An 8-bit CPU can do a 128-bit calculation; it just has to do it in small steps. You would probably need a lot of disk space.

For example, you store on disk the input and output of every transistor in a Core 2 Duo. Then you iterate through each transistor and set the output according to the input. It may take a billion clock cycles to emulate one Core 2 Duo clock cycle, but it's possible.
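To make the "small steps" point concrete, here is a minimal C sketch (purely illustrative, nothing to do with the article's design) of a 128-bit addition done one byte at a time, the way an 8-bit ALU would, with the carry propagated explicitly:

<ecode>
#include <stdint.h>

/* Illustrative only: a 128-bit addition performed as sixteen 8-bit
   additions with an explicit carry, which is how an 8-bit CPU does
   wide arithmetic (a real part would use add-with-carry). */

#define WORDS 16   /* 128 bits = 16 bytes, least significant byte first */

void add128(const uint8_t a[WORDS], const uint8_t b[WORDS], uint8_t sum[WORDS])
{
    unsigned carry = 0;
    for (int i = 0; i < WORDS; i++) {
        unsigned t = (unsigned)a[i] + b[i] + carry;  /* at most 0x1FF */
        sum[i] = (uint8_t)t;                         /* keep the low 8 bits */
        carry  = t >> 8;                             /* carry into the next byte */
    }
}
</ecode>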

Re:But does it run Vista? (1)

morgan_greywolf (835522) | more than 5 years ago | (#28148979)

Okay, so you could write an emulator to make it run Vista. Very, very, very slowly. So it would take, what? Two weeks to boot?

Re:But does it run Vista? (1, Insightful)

Anonymous Coward | more than 5 years ago | (#28148987)

I think 2 weeks is optimistic :-)

Re:But does it run Vista? (2, Funny)

wisty (1335733) | more than 5 years ago | (#28149543)

Are we still talking about 8 bit machines?

Re:But does it run Vista? (-1, Flamebait)

rubycodez (864176) | more than 5 years ago | (#28149969)

But this machine is not a Turing machine; it doesn't have unbounded storage. It doesn't even have sufficient storage to run Vista by any means. Ivory tower bullshit often gets smacked down by reality.

Re:But does it run Vista? (1)

astralpancakes (1164701) | more than 5 years ago | (#28149723)

Hence, the GP's joke retains its original humor.

Too bad it still isn't funny.

Re:But does it run Vista? (1)

CarpetShark (865376) | more than 5 years ago | (#28149063)

Well, that and another lifetime or so hand-crafting 1GB+ of memory chips, yeah.

Re:But does it run Vista? (1)

the person standing (1134789) | more than 5 years ago | (#28149297)

Emulating virtual address space would eat up most of the time. Besides the patience, the user's lifespan might be the limiting factor.

Re:But does it run Vista? (1)

thethibs (882667) | more than 5 years ago | (#28149569)

Turing-complete is not enough if you have a user interface.

A lenient definition of "make" (-1)

Anonymous Coward | more than 5 years ago | (#28148703)

The functional components are all off-the-shelf TTL ICs. Of course that's still a big feat, but I was a bit disappointed because I had expected either a more discrete approach or a more integrated "make your own chip" approach.

Re:A lenient definition of "make" (4, Funny)

jcr (53032) | more than 5 years ago | (#28148721)

The functional components are all off-the-shelf TTL ICs.

Well, he is only one person, after all. Even if he was studly enough to build it from vacuum tubes, he probably wouldn't be winding his own filaments.

-jcr

Re:A lenient definition of "make" (5, Interesting)

grolschie (610666) | more than 5 years ago | (#28148749)

...Even if he was studly enough to build it from vacuum tubes, he probably wouldn't be winding his own filaments.

Uh, maybe not [virgilio.it]. :-)

Re:A lenient definition of "make" (0)

Anonymous Coward | more than 5 years ago | (#28148855)

What I always liked best about that video is the way he uses common, everyday tools that anyone would have lying around the house.

Re:A lenient definition of "make" (2, Funny)

hattig (47930) | more than 5 years ago | (#28150021)

Damn right, the lazy fucker should have dug the iron ore up himself, then dug up the coal to fire his home-built kiln so that he could create his own iron. Of course he would have had to hand-build bellows to create steel, and maybe he should have home built EVERYTHING ELSE in the production pipeline, including hunting his own food every night from his homebuilt house. Luckily he invented his own language to communicate with other people so that he could coordinate things!

Re:A lenient definition of "make" (3, Funny)

morgan_greywolf (835522) | more than 5 years ago | (#28148909)

I'm gonna get off his lawn for sure!

Re:A lenient definition of "make" (0)

Anonymous Coward | more than 5 years ago | (#28148949)

He's like the Chuck Norris of nerddom.

Re:A lenient definition of "make" (2, Interesting)

pHus10n (1443071) | more than 5 years ago | (#28149125)

Thank you for linking this video. Wow! This guy is the REAL Doc Brown lol. I watched every moment of it in awe. So that's how ol' school electronics was done :) Really, thanks!

Re:A lenient definition of "make" (1)

moonbender (547943) | more than 5 years ago | (#28149407)

Fantastic video, thanks for the link.

Re:A lenient definition of "make" (1)

goombah99 (560566) | more than 5 years ago | (#28149779)

Wow! Thank you for that link. Made my month.

Re:A lenient definition of "make" (1)

nurb432 (527695) | more than 5 years ago | (#28149089)

Though you were being funny, one really could build this out of discrete semiconductor diodes quite easily.

People have been known to make their own tubes too ya know -> http://www.eetimes.eu/consumer/205801104 [eetimes.eu]

Re:A lenient definition of "make" (4, Insightful)

Anonymous Coward | more than 5 years ago | (#28148743)

That's the way it used to be done: I wouldn't claim that DEC engineers did not "make" the PDP-1 because it was constructed from wire-wrap discrete components. The only other sensible option for someone who wants to "make" their own CPU at home would be to program an FPGA, which is certainly less interesting than what this guy has done.

Re:A lenient definition of "make" (0)

Anonymous Coward | more than 5 years ago | (#28148775)

Is it? The difference is a lot of menial work and big mess of wires, but nothing fundamental: Buy a couple of logic gates, connect them to form a CPU. All the "magic" happens inside the chips either way. Neither approach will produce a working CPU when one morning you wake up on a deserted island.

Re:A lenient definition of "make" (1)

Vanders (110092) | more than 5 years ago | (#28148803)

The difference is a lot of menial work and big mess of wires, but nothing fundamental: Buy a couple of logic gates, connect them to form a CPU.

The "connect them to form a CPU" is sort of the trick there though isn't it? I could buy a bunch of TTL chips and "connect them up" but it sure as hell wouldn't do anything other than let the smoke out. It certainly wouldn't be a CPU.

Neither approach will produce a working CPU when one morning you wake up on a deserted island.

My chances of waking up on a deserted island are probably slim, but let's say I did: unless that desert island happens to have a small power generating facility, a couple of hundred working computers, a dozen CPU designers and a fully working ASIC fab complete with a fully trained work force, I don't think anyone is going to be making their own CPUs, no matter how you might try to do it.

Re:A lenient definition of "make" (0)

Anonymous Coward | more than 5 years ago | (#28149021)

it sure as hell wouldn't do anything other than let the smoke out.

See, that's what I mean. You treat TTL chips like there's some magic to them. There isn't. It's digital logic. Basically you connect them like you would connect them in a diagram on paper. If you want to add numbers, print out this page from Wikipedia [wikipedia.org], look up which chips have the gates you need and wire them up. It's that easy. A lot of work, but not complicated at all. An FPGA is the same thing, just that the connections are "software-defined" instead of actual wires.

a small power generating facility, a couple of hundred working computers, a dozen CPU designers and a fully working ASIC fab complete with a fully trained work force

We're lucky that the aliens sent us all that stuff or we wouldn't have computers today.
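The adder point above is easy to make concrete. Here is a minimal C simulation (an illustration only; 7408/7432/7486 are simply the common TTL packages for these gates, not parts from this project) of a standard 1-bit full adder built from the same AND, OR and XOR functions you would wire up in TTL:

<ecode>
#include <stdio.h>

/* Gate-level sketch of a 1-bit full adder: the same functions found in
   7408 (AND), 7432 (OR) and 7486 (XOR) TTL packages, wired exactly as
   on the usual adder schematic. */

static unsigned AND(unsigned a, unsigned b) { return a & b & 1u; }
static unsigned OR (unsigned a, unsigned b) { return (a | b) & 1u; }
static unsigned XOR(unsigned a, unsigned b) { return (a ^ b) & 1u; }

static void full_adder(unsigned a, unsigned b, unsigned cin,
                       unsigned *sum, unsigned *cout)
{
    unsigned s1 = XOR(a, b);
    *sum  = XOR(s1, cin);                 /* sum  = a xor b xor cin   */
    *cout = OR(AND(a, b), AND(s1, cin));  /* cout = ab + cin(a xor b) */
}

int main(void)
{
    /* exhaustive truth table: all eight input combinations */
    for (unsigned a = 0; a <= 1; a++)
        for (unsigned b = 0; b <= 1; b++)
            for (unsigned cin = 0; cin <= 1; cin++) {
                unsigned s, c;
                full_adder(a, b, cin, &s, &c);
                printf("%u + %u + %u = carry %u, sum %u\n", a, b, cin, c, s);
            }
    return 0;
}
</ecode>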

Re:A lenient definition of "make" (0)

Anonymous Coward | more than 5 years ago | (#28149103)

You treat TTL chips like there's some magic to them.

Um, no. I'm saying doing it with wire-wrap TTL is more interesting than doing it by programming an FPGA. It's more interesting and requires more skill. If I was suitably motivated I could learn enough Verilog to build a simple 8-bit CPU on an FPGA, but it would take me significantly longer to learn the EE required to do it with TTL.

We're lucky that the aliens sent us all that stuff or we wouldn't have computers today.

Starting from a desert island it would take you a couple of hundred years (guess you'll have to figure out asexual reproduction along the way) as you'd have to re-invent the entire industrial revolution and re-discover all of the electrical, chemical and engineering skills required to build the first transistor. If you set the bar a little lower you'd need maybe 10 years less to build a triode, although you'd better hope your desert island has significant deposits of the various metals you'd require, especially copper and tungsten.

To put that another way: what the hell does that have to do with whether it's cool to build your own CPU with TTL components?

Re:A lenient definition of "make" (0)

Anonymous Coward | more than 5 years ago | (#28149181)

it would take me significantly longer to learn the EE required to do it with TTL

If you say so... Most people grasp the EE aspects of digital logic quite quickly. There practically aren't any, at least not at the level where you can get away with wires crisscrossing in multiple layers on a wire-wrap board.

Re:A lenient definition of "make" (1)

m50d (797211) | more than 5 years ago | (#28149239)

The only other sensible option for someone who wants to "make" their own CPU at home would be to program an FPGA, which is certainly less interesting than what this guy has done.

I'm not at all convinced. Sure, it'd be less likely to put your name on Slashdot, but the part that doing it on an FPGA removes - connecting all the wires together - isn't actually interesting; by and large, it's tedious manual labour. It's certainly worth knowing how to do, but honestly once you've wired ten chips up you've pretty much "done them all". To my mind using an FPGA makes it easier to concentrate on the actually interesting part, the processor design, rather than getting caught up in the endless "wire A into hole B".

Re:A lenient definition of "make" (1)

confused one (671304) | more than 5 years ago | (#28149457)

The "make your own chip" approach is usually done in FPGAs these days. You can still buy a 6502, as a verilog program for an FPGA, if you really want it. As to actually processing your own silicon wafer, that can be done in your garage workshop, but is much harder.

Re:A lenient definition of "make" (1)

digitalunity (19107) | more than 5 years ago | (#28149627)

Good luck with that.

Nothing about your garage is similar to a clean room, and god forbid you actually meet the requirements for a class 10 clean room - you won't have enough room left for any equipment!

Re:A lenient definition of "make" (1)

Jesus_666 (702802) | more than 5 years ago | (#28150049)

A clean room isn't that important when you use an 80 mm process.

Why? (0, Flamebait)

derspankster (1081309) | more than 5 years ago | (#28148715)

Another semi-creative way to waste your time. Lovely.

Re:Why? (1)

scotch (102596) | more than 5 years ago | (#28149093)

To vex morons like yourself.

Re:Why? (0)

Anonymous Coward | more than 5 years ago | (#28149361)

Wow, your comment reminds me of those keychains that say, "How do you keep an idiot amused for hours? Turn over..." Guess what is written on the other side.

But does it run Minix 1? (1)

Celeste R (1002377) | more than 5 years ago | (#28148731)

After all, it has to be better than MS-DOS!

It must be early.... (1, Offtopic)

Bentov (993323) | more than 5 years ago | (#28148733)

Not one "but does it run linux" ? I'm sooooo disappointed...

Re:It must be early.... (2, Interesting)

David Gerard (12369) | more than 5 years ago | (#28148765)

You might, however, be able to adapt LUnix [wikipedia.org] to it!

I would love to do that. (1)

jcr (53032) | more than 5 years ago | (#28148737)

If only I had a time machine and a highly extended lifespan, so I could spare a couple thousand hours on it...

-jcr

Correct! Six thousand cores (5, Funny)

Dachannien (617929) | more than 5 years ago | (#28148761)

built his CPU with 1253 pieces of wire

Farnsworth: Let me show you around. That's my lab table, and this is my work stool, and over there is my intergalactic spaceship. And here is where I keep assorted lengths of wire.
Fry: Whoa! A real live space ship!
Farnsworth: I designed it myself. Let me show you some of the different lengths of wire I used.

That's how we used to make all hobby computers (5, Insightful)

thomasdz (178114) | more than 5 years ago | (#28148769)

Yeah, baby...
Back before the days of the 4004, 8008, and 8080, when we built computers, we REALLY built computers.
None of this: take a pre-built-motherboard, add a pre-built-power-supply, add a pre-built graphics card...

oblig: get off my lawn

Re:That's how we used to make all hobby computers (5, Interesting)

hughbar (579555) | more than 5 years ago | (#28148805)

Exactly, I used to burn little holes in my shirts with the ferric chloride http://en.wikipedia.org/wiki/Iron(III)_chloride [wikipedia.org] that we were using to etch circuit boards (they were masked with wax or aluminium paint). My mother was not pleased with this.

Then, as a private project, I built a half adder with washing machine relays but it never worked properly because of power supply problems (which was the 12v transformer from my train set).

As parent already said (but they're so PESKY and disrespectful): Hey you kids, GET OFF MY LAWN

Re:That's how we used to make all hobby computers (3, Funny)

frdmfghtr (603968) | more than 5 years ago | (#28148919)

Then, as a private project, I built a half adder with washing machine relays but it never worked properly because of power supply problems (which was the 12v transformer from my train set).

Did you try using the "permanent press" cycle?

Re:That's how we used to make all hobby computers (3, Interesting)

earthforce_1 (454968) | more than 5 years ago | (#28149375)

The ROM in the computer on the moon lander used ferrite rope memory that was hand-strung from bins of cores pre-programmed as '1's and '0's. The assemblers literally had a bin full of 1's and 0's.

Those were the days when assembly programming really meant just that.

http://en.wikipedia.org/wiki/Apollo_Guidance_Computer [wikipedia.org]

Re:That's how we used to make all hobby computers (2, Funny)

thomasdz (178114) | more than 5 years ago | (#28149555)

Ummmm... whoever modded me funny: I was being serious. (except for the "get off my lawn" part)
I cringe whenever I hear a young whippersnapper say "I built my own computer over the weekend" because "building a computer" to me doesn't mean slapping together pre-built parts.
(yeah, yeah, and I'm sure someone will respond to my thread saying when they were young, they built their own transistors from the sand in their backyard and smelted their own copper wires and I should get off THEIR lawn. :-)

Re:That's how we used to make all hobby computers (0)

Anonymous Coward | more than 5 years ago | (#28149955)

Yeah, yeah. And you had to run uphill both ways with woolen slippers for days to make static to charge up a bank of capacitors enough to run it for only a few minutes.

Watchout Intel (5, Funny)

mgblst (80109) | more than 5 years ago | (#28148795)

It is about time that Intel has some competition.

College Project (1)

Murdoch5 (1563847) | more than 5 years ago | (#28148799)

I've done something like this for a college project; not as complex, but we built a 68020 using wire wrap. My only question: why didn't he send away for a PCB after the design was done?

Re:College Project (1)

jacquesm (154384) | more than 5 years ago | (#28148897)

errmmm. pics or ... ;)

A 68020!! That's a pretty impressive feat. What kind of chips did you build it from? Regular-series TTL or CMOS, or something at a higher level of integration? Did it work?

Given that the chip was originally named (the 68000) for its transistor count, even if you were using fairly large 'blocks' it would likely have spanned many boards.

What clock speed did you get it to run at?

Why did he do all this? (-1, Troll)

syousef (465911) | more than 5 years ago | (#28148829)

From the article:

"Why did I do all this?" he says. "I don't know. But it has been a lot of fun."

If he'd said almost anything else, I'd have been more impressed. To learn perhaps? To see if I could? Nup. Dunno.

Well, in that case, according to the article, you just spent $1000 and 18 months for nothing? Come on, if you can do this, you can do better at explaining it than that!!!

I can just imagine what my wife would think if I launched into doing something like this and couldn't even express why I wanted to do it. (She'd let me, she's great that way, but there'd be lots of eye rolling and poking fun at me.)

Er... WTF? (5, Insightful)

brunes69 (86786) | more than 5 years ago | (#28148883)

To have fun is not a good enough reason to do something?

I guess you would be happier if he was just another fat slob who "has fun" by watching American Idol?

This is quite possibly the most asinine comment I have seen on Slashdot in a long time.

Re:Er... WTF? (-1, Troll)

syousef (465911) | more than 5 years ago | (#28148933)

To have fun is not a good enough reason to do something?

He didn't say he did it to have fun. He just said it was a lot of fun. Not that he set out to do it.

I guess you would be happier if he was just another fat slob who "has fun" by watching American Idol?

No I'd be happier if his people skills came close to matching his geek skills.

This is quite possibly the most asinine comment I have seen on Slashdot in a long time.

Like I said: People skills and geeks.

By the way do you routinely reply to the most asinine comments you see???

This place gives me the shits lately. Things get modded up and down like a yoyo. If you want to be modded up you have to say something popular with the crowd. Kinda like being on American Idol. (Fuck knows that show isn't about singing ability).

Re:Er... WTF? (1)

Vanders (110092) | more than 5 years ago | (#28148967)

I'd be happier if his people skills came close to matching his geek skills.

What? All he said was "I don't know". It's an honest answer. You are the one who's being an ass about it.

Re:Er... WTF? (-1, Flamebait)

syousef (465911) | more than 5 years ago | (#28149303)

What? All he said was "I don't know". It's an honest answer. You are the one who's being an ass about it.

Go fuck yourself. That applies to all the morons who mod something down just because they don't like it.

Re:Er... WTF? (1)

Vanders (110092) | more than 5 years ago | (#28149455)

Go fuck yourself.

I'm busy ironing at the moment but I'll be sure to fuck myself later.

Here's an idea, maybe you wouldn't get modded down so often if you didn't act like an ass?

Re:Er... WTF? (-1, Flamebait)

syousef (465911) | more than 5 years ago | (#28149575)

Here's an idea, maybe you wouldn't get modded down so often if you didn't act like an ass?

Well maybe you wouldn't be told to go fuck yourself so often if you didn't go around calling people an ass?

Go look up hypocrisy in the dictionary. All I originally said was that the guy should be more articulate. You're the one who goes around telling people they're being an ass then wondering why they tell you to go fuck yourself and accusing them of being antisocial. Fucking idiot.

Re:Er... WTF? (0, Offtopic)

Vanders (110092) | more than 5 years ago | (#28149699)

But you are being an ass. Telling people to go fuck themselves just reinforces that perception.

All I originally said was that the guy should be more articulate.

There was nothing inarticulate about his answer at all. He said he didn't know why he did it. You're getting your panties in a bunch because you didn't like his answer, but that's just tough nuts to you. Get over yourself already.

Re:Why did he do all this? (0)

Anonymous Coward | more than 5 years ago | (#28148995)

Well, in that case, according to the article, you just spent $1000 and 18 months for nothing? Come on, if you can do this, you can do better at explaining it than that!!!

He did it for fun. 18 months is 78 weeks. $1000 for 78 weeks of a fun hobby works out to about $13 per week. That's pretty reasonable for a hobby. A cinema visit with popcorn once per week would probably cost more.

.. so some slashdot halfwit could criticise him (2, Insightful)

Anonymous Coward | more than 5 years ago | (#28149077)

slashdot is really sucking hard these days.

the guy makes an off-the-cuff comment about his motivation, and some braindead moron on slashdot criticizes him for it.

HE MADE A FUCKING COMPUTER, YOU MOUTH-BREATHING DIPSHIT.

of course he had motivation! how could you sum up the motivation in just a couple of words?

What a worthless piece of shit you are.

Re:Why did he do all this? (1)

Capt. Skinny (969540) | more than 5 years ago | (#28149209)

I can just imagine what my wife would think... (She'd let me, she's great that way...).

She'd let you? And that's a virtue? Damn, man.

Re:Why did he do all this? (1)

syousef (465911) | more than 5 years ago | (#28149345)

She'd let you? And that's a virtue? Damn, man.

When you are time poor and share financial responsibility for a family, and you're the bread-winner, decisions like that are joint decisions. I know that's a difficult concept for some to grasp and you think that just means I'm under the thumb, but you know what, I couldn't do what she does for our child. We only have one child; he's still an infant and he doesn't sleep much. She's had it very rough, gets very little sleep etc. So, yes, she's entitled to have some input into how I spend my time and our money. And yes, it is a virtue that she'll let me spend that time and money doing something she doesn't agree with or sees as pointless.

So you know what, if you want to belittle a guy for not being a selfish piece of fuck when he's got a family, go fuck yourself.

Re:Why did he do all this? (0)

Anonymous Coward | more than 5 years ago | (#28149429)

you're resorting to telling an awful lot of people to go fuck themselves over this article, and the guy from the OP is the one who needs better people skills? is that the best you can come up with? you don't know how else to express yourself? i'd expect someone who's not claiming any geek skills to have MUCH MUCH better people skills than that.

Re:Why did he do all this? (-1, Offtopic)

syousef (465911) | more than 5 years ago | (#28149597)

you're resorting to telling an awful lot of people to go fuck themselves over this article, and the guy from the OP is the one who needs better people skills?

I started off saying the guy in the article should be more articulate. For my trouble I got called an ass and got modded into oblivion. So yeah, I'm telling an awful lot of wankers to go fuck themselves and that does not in any way conflict with my message that these people need to get some people skills. I wasn't the one who started throwing around insults.

is that the best you can come up with? you don't know how else to express yourself? i'd expect someone who's not claiming any geek skills to have MUCH MUCH better people skills than that.

Says the fucker that posts as a/c. What the fuck are you saying here that you require anonymity??? Listening to you lecture me about people skills is like trying to listen to a monkey give a lecture on nuclear physics.

Re:Why did he do all this? (1)

Zero__Kelvin (151819) | more than 5 years ago | (#28149609)

I see your posts all of the time, and it baffles me that more people don't respond to you and tell you to go fsck yourself the way the parent poster did! You are constantly posting ridiculous diatribes that often contradict each other. You are the scourge of Slashdot, and I really hope they cancel your account!

Re:Why did he do all this? (-1, Offtopic)

syousef (465911) | more than 5 years ago | (#28150051)

I see your posts all of the time, and it baffles me that more people don't respond to you and tell you to go fsck yourself the way the parent poster did! You are constantly posting ridiculous diatribes that often contradict each other. You are the scourge of Slashdot, and I really hope they cancel your account!

That's a pretty strong claim to make. Care to back it? Find me a handful of examples of where I contradict myself. You say I constantly do it, so it should be easy to find, say, 5 examples? Clear examples mind you, not some bogus misinterpreted set of statements taken out of context. 5 clear contradictions. Bet you don't do it though.

I'm constantly amazed at the ridiculously anti-social reaction some slashdotters have to a point of view they don't like or agree with. What childish stupidity could have you post such a vile rant is beyond me. Do you REALLY have nothing better to do? PATHETIC!!!

Re:Why did he do all this? (1)

Capt. Skinny (969540) | more than 5 years ago | (#28149435)

touche

Eek! Wire wrapping! (3, Insightful)

Sponge Bath (413667) | more than 5 years ago | (#28148847)

All the board-level products I designed in the early 90s had to be prototyped with wire wrapping. Even if you are careful, by the time you do hundreds of connections it is almost inevitable that there is some flaw. You might miscount a row of pins and attach to the wrong pin. The process of layering multiple connections onto a single pin might damage a wire at the bottom. Wires might break or make a shaky connection that comes and goes.

I would not ever want to go back to that, but it did two useful things. The plodding physical process of "I'm now connecting this to that" forced a slow, comprehensive walk-through of your design, which could reveal design mistakes. The other was honing debugging skills for intermittent problems: "Is this a design flaw or a wire making poor contact?"

Re:Eek! Wire wrapping! (1)

Animats (122034) | more than 5 years ago | (#28149847)

Wire wrapping isn't that big a deal. I still have a wire wrap gun, although I haven't used it in years. You buy precut wire in different standard lengths, and it comes with the right amount of insulation pre-stripped at each end. You have a wire list (A-15 to C-42, etc.), and you just follow the wire list. It's usual to sort the wire list by length, so that you do the longest wires first. It's time-consuming, but not difficult. Far easier than ordinary hand-wiring.

Fully automatic wire wrap machines go back to the 1960s. That's how IBM mainframes were built. "Semi-automatic" wire wrap machines are still around; the human does all the work, and the machine checks it. If the wire wrap gun is in the wrong place, it won't wrap.
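The wire-list workflow described above maps naturally onto a tiny bit of code. A minimal C sketch (the post names and lengths are invented for illustration, not taken from the article or any real netlist format) of sorting a wire list longest-first before wrapping:

<ecode>
#include <stdio.h>
#include <stdlib.h>

/* Each entry names two wrap posts and a precut wire length; the list is
   sorted longest-first so the longest wires are wrapped before shorter
   ones pile on top of them. */

struct wire {
    char from[8];
    char to[8];
    int  length_mm;   /* precut, pre-stripped standard length */
};

static int by_length_desc(const void *a, const void *b)
{
    const struct wire *wa = a, *wb = b;
    return wb->length_mm - wa->length_mm;   /* longest wires first */
}

int main(void)
{
    struct wire list[] = {
        { "A-15", "C-42", 150 },
        { "B-03", "B-07",  50 },
        { "A-01", "D-22", 200 },
    };
    size_t n = sizeof list / sizeof list[0];

    qsort(list, n, sizeof list[0], by_length_desc);

    for (size_t i = 0; i < n; i++)
        printf("wrap %s -> %s (%d mm)\n", list[i].from, list[i].to, list[i].length_mm);
    return 0;
}
</ecode>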

trusted computing (0)

Anonymous Coward | more than 5 years ago | (#28148863)

This is why "trusted" treacherous computing is doomed. It takes determination to build a simple computer, but once you have one, you can "bootstrap" with it - use it to design a more powerful computer.

Only by hunting down and killing everyone who is even vaguely smart enough to begin building computers could someone hope to make treacherous computing "work". I wouldn't put it past them to try, of course.

Re:trusted computing (0, Offtopic)

wisty (1335733) | more than 5 years ago | (#28149585)

Or they could teach them all .net, and make them read web forums. Then they are no longer smart enough.

Will take some time... (0)

Anonymous Coward | more than 5 years ago | (#28148875)

... to wire a Beowulf cluster of those...

Done before... in 16-bits (5, Interesting)

Elledan (582730) | more than 5 years ago | (#28148889)

Magic-1 [homebrewcpu.com], a 16-bit TTL-based, wire-wrap PCB computer.

Slashdot posted an article on Magic-1 when it was completed years ago as well.

Real dedication! (1)

JohnMurtari (829882) | more than 5 years ago | (#28148905)

I remember a grad class that went into circuit design and register transfer language. I was amazed at how opcodes could be made to twiddle bits. We just had a software simulator to try our designs -- I admire this guy for building it. Cool!

wire wrap vs. bread board (0)

Anonymous Coward | more than 5 years ago | (#28148923)

Never understood why breadboards replaced wire wrap.

Fantastic! (5, Interesting)

adosch (1397357) | more than 5 years ago | (#28148953)

There needs to be more Steve Chamberlins in the world. Personal (or enterprise, for that matter) computing hardware has hit a mass-exploitation mark; computers today have such an abundance of resources, storage and processing power that any developer I've had to work with in the last half decade sees the computer, much like Steve mentioned in TFA, as "...like black boxes... and understand what they do, but not how they do it," which leads to blatant disregard for anything, really sloppy ways of coding and development, zero ideology or best practice on how to truly harness and control resources efficiently. I don't expect anyone to have a physics background or be some die-hard electrical engineer, but there's definitely something to be said for growing up and working with early computer models where you had to give two shakes about that stuff. This is very cool, indeed.

Re:Fantastic! (3, Insightful)

Vellmont (569020) | more than 5 years ago | (#28149481)


which leads to blatant disregard for anything, really sloppy ways of coding and development, zero ideology or best practice on how to truly harness and control resources efficiently.

I see lots of sloppy coding practices, but they have about nothing to do with hardware efficiency. (Think bugginess and maintainability.) Unless you're writing for specialized environments, the days of worrying about a few CPU cycles or a few kilobytes of memory are over (and good riddance).

The world has changed and the challenges have changed. It's great to design your own CPU, and I'm sure he learned an enormous amount. But to pretend that we should all be thinking like it's still 1979 is absurd.

newsworthy? (3, Informative)

SolusSD (680489) | more than 5 years ago | (#28149141)

This sounds like the kind of project any computer engineering undergrad curriculum would cover. Myself, I have had to design/build 4 different processors of varying complexity (basic MIPS, pipelined, superscalar, etc.) during my years as an undergrad. It's cool nonetheless, and by no means "easy".

WEll (0)

Anonymous Coward | more than 5 years ago | (#28149145)

This is old news for anyone who takes a look at Hackaday once in a while.

Love that it is on wired (2, Funny)

WindBourne (631190) | more than 5 years ago | (#28149191)

Sounds like the right mag for this project, though Make is the more appropriate one.

Not to disparage the accomplishment, but... (0)

nuckfuts (690967) | more than 5 years ago | (#28150031)

The article mentions "Z-80" among the parts used. The Z-80 [wikipedia.org] itself is an 8-bit CPU.