
From a NAND Gate To Tetris

Unknown Lamer posted about a year and a half ago | from the artisanal-programming dept.

Education 103

mikejuk writes "Long before the current crop of MOOCs (Massive Open Online Courses) there was a course that taught you all you needed to know about computers by starting from the NAND gate and working its way up through the logic circuits needed for a computer, on to an assembler, a compiler, an operating system, and finally Tetris. Recently one of the creators of the course, Shimon Schocken, gave a TED talk explaining how it all happened and why it is still relevant today. Once you have seen what is on offer at http://www.nand2tetris.org/ you will probably decide that it is not only still relevant but the only way to really understand what computers are all about."


103 comments

May I point out... (1)

Anonymous Coward | about a year and a half ago | (#41666495)

There's a circuit-building platform relevant to this in the works:

http://www.circuits.io/

Re:May I point out... (1)

lindi (634828) | about a year and a half ago | (#41666961)

The point of nand2tetris is that you can understand all parts of the system. With circuits.io you can't even see the source code, let alone have time to read through all of it. Last time I checked, they did not let you export your designs either (Gerber export does not count; it's not something you'd want to modify).

Re:May I point out... (1)

skelly33 (891182) | about a year and a half ago | (#41672193)

I thought nand2tetris sounded pretty interesting. My hopes were deflated when I found that half of the reading material does not exist (chapters 7-12), and that the entire "computer" project is simulated virtual hardware. It's a bit ironic for the author(s) to so emphatically profess a true understanding of computer hardware and then implement the entire concept in software. I was really hoping for some TTL breadboard / wire-wrap / soldering... *something* tangible. It looks like interesting reading anyway, though there are already published books that cover most-to-all of this.

NAND Gate (4, Funny)

zippo01 (688802) | about a year and a half ago | (#41666505)

I watched this video and it really does seem like it would be a fun course! I'm not really sure about the whole God giving man NAND. Though that is prolly why my belief in God is 11. Hah, what a crappy NAND joke.

Re:NAND Gate (0)

Anonymous Coward | about a year and a half ago | (#41677479)

For the Skeptics/Atheists/Agnostics, for all those who doubt the divine origin of the NAND gate, we have the humanist counter-argument, the facts:

http://www.gutenberg.org/ebooks/15114
(George Boole, "An Investigation of the Laws of Thought")

Logic is Logic (5, Interesting)

thejuggler (610249) | about a year and a half ago | (#41666519)

Somewhere between learning to write my first "Hello World" program on the Apple IIe (and the TI99/4A) and making a career out of programming years later, I went to schools for Computer Repair and Bio-Medical Electronics. I still have a pile of 7400 series [wikipedia.org] IC chips and my breadboards amongst other electronic components. I learned analog and digital circuit design in the late '80s. The logic learned in those classes still applies to everyday programming today. No matter what I did in those previous careers, the training I did then still applies today. AND, OR, NOT, NAND, NOR, XOR and XNOR are still the 7 basic logic elements that make up all digital electronics and programming. From there Truth Tables are built, and Boolean algebra is applied to create any and all circuits and code today. In my humble opinion these are still essential to training people new to various IT fields. It's like having to learn nous, verbs, adverbs and adjectives in order to write understandable thoughts. If you lack this basic understanding, learning the more advanced concepts is difficult at best. It's good to see these are still being taught somewhere.

Re:Logic is Logic (0)

Anonymous Coward | about a year and a half ago | (#41666669)

I am glad I went to school and learned what a nous is. http://www.thefreedictionary.com/nous

Re:Logic is Logic (1)

Anonymous Coward | about a year and a half ago | (#41666733)

As a n00b, I just wondered: Are those 7 all the possible variations of the truth tables? (Except for the obvious TRUE and FALSE "elements" that always return true or false.)

Re:Logic is Logic (4, Informative)

sFurbo (1361249) | about a year and a half ago | (#41666909)

You have 2 inputs that can each be 1 or 0, so there are 4 different input combinations. For each of those, you have 2 possible outputs, so there are 2^4=16 different truth tables.

Of these, 8 are symmetrical in A and B (they give the same output for inputs (1,0) and (0,1)). These are AND, OR, NAND, NOR, XOR, XNOR, TRUE and FALSE.

The remaining 8 are 4 sets of duplicates (if you switch A and B, we call the gate by the same name). These are A, NOT-A, A AND NOT-B, and its negation, NOT-A OR B. The last two do not seem to be standard gates, so no, there are, in fact, two more non-trivial truth tables for two inputs.
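
To make the counting concrete, here is a brute-force sketch in C that enumerates all 16 two-input truth tables and counts the symmetric ones (purely illustrative):

    #include <stdio.h>

    int main(void) {
        /* A two-input truth table is 4 output bits, one per input pair
           (A,B), taken here in the order (0,0), (0,1), (1,0), (1,1). */
        int symmetric = 0;
        for (int t = 0; t < 16; t++) {
            int out01 = (t >> 1) & 1;   /* output for (A,B) = (0,1) */
            int out10 = (t >> 2) & 1;   /* output for (A,B) = (1,0) */
            if (out01 == out10)         /* unchanged when A and B swap */
                symmetric++;
        }
        printf("16 tables, %d symmetric in A and B\n", symmetric);
        return 0;
    }

Running it prints "16 tables, 8 symmetric in A and B", matching the count above.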

Re:Logic is Logic (0)

Anonymous Coward | about a year and a half ago | (#41667895)

NOT-A OR B is logical implication (i.e. "if A then B"). I've never seen it used as a logic gate, but it is a binary operator in some programming languages (e.g. Visual Basic).

Re:Logic is Logic (0)

Anonymous Coward | about a year and a half ago | (#41668347)

In programming, IF-THEN is different from the logical IF-THEN:

In logic: IF A THEN B is equivalent to NOT A OR B.
In (imperative) programming languages: IF A is TRUE THEN make B become TRUE (and give an error if that is impossible).
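
A minimal sketch of that contrast in C (the variable names are just for illustration):

    #include <stdbool.h>
    #include <stdio.h>

    int main(void) {
        bool a = true, b = false;

        /* Logic: IF A THEN B is a *value*, equivalent to NOT A OR B. */
        bool implication = !a || b;   /* false here: A holds but B doesn't */

        /* Imperative programming: IF A THEN ... is an *action*. */
        if (a)
            b = true;                 /* makes B become true */

        printf("implication=%d, b after the if: %d\n", implication, b);
        return 0;
    }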

Re:Logic is Logic (1)

Teancum (67324) | about a year and a half ago | (#41668823)

More correctly and accurately, the "logical if-then" really is modus ponens [wikipedia.org] .

Interestingly, in formal logic discussions (aka "philosophical logic") this is one of the most used "functions" as following the syllogistic pattern through modus ponens gives you valid arguments, with the exception of "A" being true and "B" being false.... which is the logical definition of a fallacy.

I had the good fortune in my academic career of landing in a formal logic class taught by a defrocked Jesuit priest. He knew next to nothing about computers or electronics, but he understood syllogisms and formal logic from a classical viewpoint and gave me insights into computer programming and electronics that I wouldn't have had otherwise. It also gave me clinching proof that computer geeks do need to take at least some classes in the humanities to learn about their craft.

I thought I was an expert in logic going into the class and certainly I could make truth tables in my sleep and perform all sorts of mathematical operations on boolean variables. I discovered I was just an infant intellectually on the topic after just a couple of days of listening to this professor.

Re:Logic is Logic (1)

travisco_nabisco (817002) | about a year and a half ago | (#41670029)

I was also fortunate enough to take a formal logic class, I believe it was called "Introduction to Logical Thought".

I enjoyed the class so much that I would advocate requiring some minimal level of formal logic be taught at the high school level. It would be a lot more beneficial to make every student learn this than to make them all take pre-calc or calculus.

Note: I am not advocating limiting the math courses available to high school students; I just think there are more applicable things that can be taught as required material.

Re:Logic is Logic (0)

Anonymous Coward | about a year and a half ago | (#41668961)

By De Morgan, NOT-A OR B is the negation of A AND NOT-B. The latter is a very useful circuit which enables you to clear bits in register A where there are 1s in register B, and leave the rest alone. The ARM has it in the form of the BIC instruction.
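
That bit-clear operation is a one-liner in C; a small sketch (the values are arbitrary):

    #include <stdio.h>

    int main(void) {
        unsigned a = 0xFF;   /* "register A" */
        unsigned b = 0x0F;   /* "register B": its 1 bits mark what to clear */

        a = a & ~b;          /* A AND NOT-B, which is what ARM's BIC does */

        printf("0x%02X\n", a);   /* prints 0xF0: the low four bits cleared */
        return 0;
    }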

Re:Logic is Logic (5, Informative)

famebait (450028) | about a year and a half ago | (#41666927)

No, but it sums up all the useful/practical ones.
If you only have two inputs, there are only 4 rows in the table:
| A | B |
| 0 | 0 |
| 0 | 1 |
| 1 | 0 |
| 1 | 1 |

This yields only 16 possible output columns:
0000 - does not vary with input
0001 - AND
0010 - not commutative
0011 - reacts only to A
0100 - not commutative
0101 - reacts only to B
0110 - XOR
0111 - OR
1000 - NOR
1001 - XNOR
1010 - reacts only to B
1011 - not commutative
1100 - reacts only to A
1101 - not commutative
1110 - NAND
1111 - does not react to input

That makes 6 potentially desirable operations. The seventh is NOT, which takes only one input.
The non-commutative ones could conceivably be put to useful work, but in physical designs the asymmetry is impractical, and you can trivially construct them from other gates if need be. In fact some of the useful ones are also usually constructed from combinations of the others, and all of them *can* be constructed from combinations of NAND gates.
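
That last claim is easy to check. Here is a sketch in C that builds the named gates out of nothing but a NAND function (the helper names are just for illustration):

    #include <stdio.h>

    /* The only primitive; everything below is composed from it. */
    int nand(int a, int b)  { return !(a && b); }

    int not_(int a)         { return nand(a, a); }
    int and_(int a, int b)  { return not_(nand(a, b)); }
    int or_(int a, int b)   { return nand(not_(a), not_(b)); }
    int nor_(int a, int b)  { return not_(or_(a, b)); }
    int xor_(int a, int b)  { return and_(or_(a, b), nand(a, b)); }
    int xnor_(int a, int b) { return not_(xor_(a, b)); }

    int main(void) {
        printf("A B  AND OR NOR XOR XNOR\n");
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                printf("%d %d   %d   %d   %d   %d   %d\n", a, b,
                       and_(a, b), or_(a, b), nor_(a, b),
                       xor_(a, b), xnor_(a, b));
        return 0;
    }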

Re:Logic is Logic (0)

Anonymous Coward | about a year and a half ago | (#41668317)

Gawd - that takes me all the way back to 1982 with TTL and +5vcc - those were the fun days! When you could build something so trivial, but it was current and could be used for something useful! :-)

Re:Logic is Logic (1)

Dishwasha (125561) | about a year and a half ago | (#41668459)

There's actually a really fantastic chart in Chapter 1 pg. 10 [nand2tetris.org] that I magnified and printed out as a quick reference. I also drew the gate pictures at the top so I could more easily interpret the schematics.

Re:Logic is Logic (2)

Canazza (1428553) | about a year and a half ago | (#41666745)

I found when I was at school (oh god, that was over 10 years ago) that there was *a lot* of overlap between my classes.

We learned basic circuitry in Physics, we learned basic Programming and Physics in Tech Studies (that mainly taught us Electronics, through Breadboards, PICMicros and low level programming), and we learned about 6502 Assembler in Computing Studies (although we were told we were the *last* class to learn assembly in CS that year)

And in each of those three we learned about Combinational Logic to varying degrees. Tech Studies (what you focus on when you want to do electrical engineering) taught us the most. CS didn't teach it until a year after Tech Studies did, but was pretty thorough and introduced it in the context of computing (and actually taught us binary mathematics, which Tech Studies didn't). Physics kind of half-arsedly taught us binary maths and spent about two weeks on plugging pre-made circuits together to make a light come on.

So I feel you don't really *need* to know circuit design to get a handle on Combinational Logic, but it can help seeing it in a different context to the one you'll be focusing on.

I was lucky I got exposed to it in the way I did, at school, from age 13 to 18, with Tech Studies doing it first I might add, but dicking around with breadboards is - perhaps - slightly overkill to suggest *everyone* who wants to go into IT be required to do it.

Re:Logic is Logic (4, Informative)

dkf (304284) | about a year and a half ago | (#41666771)

AND, OR, NOT, NAND, NOR, XOR and XNOR are still the 7 basic logic elements that make up all digital electronics and programming.

Actually, real digital circuit design uses rather more elements than that, some of which can't be derived from those ideal elements either. Even excluding the clock generator (a thoroughly analog component at its core), there are still some really strange things you can usefully do with transistors that just won't model as anything simpler. My favorite is the arbitrator: it determines which of two signals rose (or fell) first, and it's used to connect together parts of a chip that share a common power supply but have unsynchronized clocks. Simplistic digital theory says it can never work, but in reality it's very effective (and it depends on the fact that transistors are analog devices, with some quantum mechanical behavior for disambiguating the tricky cases. Mad, fun, mad fun!)

Re:Logic is Logic (0)

Anonymous Coward | about a year and a half ago | (#41669035)

You can model the arbitrator with gates. You just can't build one from that model. Simplistic digital theory says they work fine; it's the analog domain that screws things up, so the design must be fixed in the analog domain.

Re:Logic is Logic (1)

WarrenLong (540264) | about a year and a half ago | (#41672467)

Hmmm... I think I just wire the output of an inverter back to its input and I get an oscillator... Add more in a chain if I want a bigger delay, or a counter/divider... No analog circuitry required, just the fact that there is a propagation delay. :-)
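
A toy discrete-time model of that ring oscillator in C, where each time step stands for one propagation delay (the ring length and seed state are arbitrary):

    #include <stdio.h>

    int main(void) {
        /* Three inverters in a ring: stage i is driven by stage (i+2) % 3.
           Seed one stage differently so a transition travels around. */
        int out[3] = {0, 0, 1};

        for (int t = 0; t < 12; t++) {
            int next[3];
            for (int i = 0; i < 3; i++)
                next[i] = !out[(i + 2) % 3];   /* invert the previous stage */
            for (int i = 0; i < 3; i++)
                out[i] = next[i];
            printf("t=%2d  %d %d %d\n", t, out[0], out[1], out[2]);
        }
        return 0;
    }

Each node toggles every three delays, so the ring oscillates with a period of six propagation delays: no analog circuitry required, exactly as described above.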

Re:Logic is Logic (1)

mcgrew (92797) | about a year and a half ago | (#41668573)

Agreed completely. For years they seem to have completely removed the concepts of the hardware itself from programming. I've always thought this was a big mistake. Out of all the programming books I've read, the book that helped the most wasn't a book on programming, but the TTL Cookbook. I don't know if it's still in print, probably not, but it was excellent.

Re:Logic is Logic (1)

skids (119237) | about a year and a half ago | (#41669059)

I've always thought this was a big mistake.

It's a huge mistake. It's why we have programmers that don't understand the computational expense of memory/cache, bounce buffers, and parasitic intermediate representations. Without a critical mass of such programmers, there is not enough demand pressure for toolkits that work efficiently, and the bells and whistles are chased after instead, no matter how slow said bells and whistles make the system run. Worse yet, it's why useful paradigms become a hammer seeking a nail, as CS students don't think critically, when they learn such subjects, about where and when these techniques have serious performance drawbacks, so they mis-apply them where they do not fit.

The few who do master the art are the only people capable of writing competent compilers, but they mostly go on to do other stuff, because there's little mass demand for efficient compilers among their peers. We are very fortunate to have the ones that feel compelled to work on gcc/egcs/llvm/etc in their spare time.

Re:Logic is Logic (1)

Cinder6 (894572) | about a year and a half ago | (#41673137)

Well, take heart that at some places it's still taught. A class on logic circuits was required for a CS degree where I went. They called it "Computer Hardware", admittedly a rather vague name. But we started at the gate level and built (well, simulated) a computer using circuits we designed. At the end of the semester, it had to be able to run arbitrary instructions, and we had to be able to hand-assemble code for it (I got tired of doing this and wrote an assembler). It wasn't as detailed as nand2tetris, but at least it was taught.

Sure, working with actual, physical circuits may have been better/cooler, but I felt like the class was very useful and instructive. I may have been the only one, though, judging by how many people had a very hard time of it.

Re:Logic is Logic (3, Informative)

tlhIngan (30335) | about a year and a half ago | (#41669607)

AND, OR, NOT, NAND, NOR, XOR and XNOR are still the 7 basic logic elements that make up all digital electronics and programming

Of which, NAND and NOR are the primitives -- you can construct any gate (and thus any truth table result) you want out of purely NAND or purely NOR gates.

Why you pick one over the other comes down to limitations of CMOS: PMOS transistors have to be much larger than NMOS ones to be as fast. NAND puts the fast NMOS transistors in series, giving you much faster switching than if the PMOS transistors were in series (as they would be in a NOR gate).

Re:Logic is Logic (1)

Anonymous Coward | about a year and a half ago | (#41669789)

I also learned on the 7400 series, and still have massive piles of them around, even newer, faster SMD ones. While that provides a good educational foundation, I wonder how practical it is directly, though. I'll throw together a circuit that, when made with SMD, is the size of a credit card, and someone will ask why I didn't just use an FPGA or MCU... I then realize I could have made the circuit the size of a postage stamp, and have it be flexible enough to change the logic after assembly. As much appeal as there is for chips of simple logic gates, turning circuits into essentially playing with Legos, I've found it much better in just about every way to use a programmable chip of some sort for 99% of digital logic circuits for some years now, despite the feeling that it is "overkill" for something that could be done with a 555... or a few dozen 555s...

Re:At what level to teach? (1)

tonyt3 (1014391) | about a year and a half ago | (#41669875)

Sometimes we need to step back a few paces to ask exactly what it is that we're trying to accomplish. When a teen-ager is ready to learn to drive, it is not essential for him/her to be able to rebuild the engine or put in new piston rings. After a person has an introductory-level understanding of what an automobile does, then he/she may want to explore the details in greater depth. Some people may actually turn out to be more interested in painting the car in different colors, or using it to transport sick people to the hospital. We need to be careful, I suspect, not to overwhelm the new computer student with a lot more than the student has to know _at that stage in his development_ ... there needs to be a few (well supervised) joy rides in there to keep up interest. To be sure, the new student does not know enough to know " what he needs to know." But the level of study should be an outgrowth of the person's real interests.

I don't know who your god is... (1)

Anonymous Coward | about a year and a half ago | (#41666559)

... but my god gave me nor gates and an endless world of blocks.

Re:I don't know who your god is... (-1)

Anonymous Coward | about a year and a half ago | (#41666699)

Only idiots have "gods". They need them as an excuse to be ignorant about the parts of reality they are too stupid to comprehend.

Re:I don't know who your god is... (1)

Anonymous Coward | about a year and a half ago | (#41667033)

Have you *ever* looked at the Bukkit plugin code?

Ignorance is bliss my friend

Re:I don't know who your god is... (2)

expatriot (903070) | about a year and a half ago | (#41667361)

All Boolean logic can be reduced to multiple NAND gates (or NOT-AND operations on two inputs): NOR, OR, AND, XOR, NOT and XNOR can all be created from NAND gates. Of course, classic Boolean logic does not consider three-state or time-bound signals, so it is incomplete regarding actual circuitry.

Re:I don't know who your god is... (0)

Anonymous Coward | about a year and a half ago | (#41667457)

Whoosh.

NORs work just as well as NANDs.

Re:I don't know who your god is... (0)

Anonymous Coward | about a year and a half ago | (#41668381)

Whoosh.

NORs work just as well as NANDs.

The reason for that is that NOR and NAND are dual [wikipedia.org] .

So many great courses around now (3, Informative)

wdef (1050680) | about a year and a half ago | (#41666583)

Looks great, much like I imagined studying Comp Sci ought to be. OK, one can get the book and use the materials for self-learning, but is there a list of institutions using the course for credit?

So many great courses and great teachers around now. Pity they didn't get all this together way back in my day. I've just been working my way through http://michaelnielsen.org/blog/quantum-computing-for-the-determined/ [michaelnielsen.org] and am astonished at the simplicity and lucidity of Nielsen's teaching.

Re:So many great courses around now (3, Interesting)

donscarletti (569232) | about a year and a half ago | (#41667053)

When I studied Comp Sci in the early 00s, we had a compulsory course on digital circuits, ground-up sort of stuff, NAND gates, Verilog, that sort of thing. If you didn't have a course like that, it is regrettable.

My proudest moment was when my 80-something year old grandfather, who's own father had built radios for a living and who's brother is a retired electrical engineer, saw my textbook and grilled me about solid state switching. He said he did not understand how a signal could be selected based on another signal without the use of electromechanical relays. He knew roughly how a transistor works, and I explained how they could be combined into AND, OR and NOT gates. From there, I drew a circuit diagram of a multiplexer, and to him it was like some great realization that there was no perversion of God's laws going on inside a CPU (joke).

He bought his first PC and Digital Camera within a month.
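
For the curious, that multiplexer really is just three gates' worth of logic. A sketch in C (the function name is illustrative):

    #include <stdio.h>

    /* 2-to-1 multiplexer from basic gates:
       out = (a AND NOT sel) OR (b AND sel) */
    int mux(int a, int b, int sel) {
        return (a && !sel) || (b && sel);
    }

    int main(void) {
        for (int sel = 0; sel <= 1; sel++)
            for (int a = 0; a <= 1; a++)
                for (int b = 0; b <= 1; b++)
                    printf("sel=%d a=%d b=%d -> %d\n",
                           sel, a, b, mux(a, b, sel));
        return 0;
    }

When sel is 0 the output follows a; when sel is 1 it follows b -- signal selection with no relays anywhere.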

Re:So many great courses around now (2)

mcgrew (92797) | about a year and a half ago | (#41669337)

He said he did not understand how a signal could be selected based on another signal without the use of electromechanical relays. He knew roughly how a transistor works and I explained how they could be combined into AND, OR and NOT gates.

Something sounds funny there. Transistors play the same role as vacuum tubes, which were around before your grandpa was born. The first electronic computers used tubes. His brother was an electrical engineer; was your grandfather one as well, or just a hobbyist? How good was he?

BTW, it's whose, not who's. Who's is a contraction for "who is".

Re:So many great courses around now (1)

donscarletti (569232) | about a year and a half ago | (#41677683)

No, the first computational devices were electromechanical punchcard databases; they used relays throughout, as did the automatic switching systems in the phone network. Tubes were only used as amplifiers when my grandfather was a young man; computers like Colossus and ENIAC that did in fact use vacuum tubes for switching were largely either obscure or classified until my grandfather was in his mid 30s and too busy with his legal practice, family and music to much care.

Re:So many great courses around now (1)

ShakaUVM (157947) | about a year and a half ago | (#41673815)

>Looks great, much like I imagined studying Comp Sci ought to be.

This was actually what I got out of 4 years of Computer Science at my university (UC San Diego) - an understanding of everything that was going on from the moment I compiled my code and ran it, down to the transistor and logic gate level. We had to write our own compilers, create our own CPUs (including our own assembly language for it), and so forth. Very valuable stuff - at a certain point you realize all the grey areas about what is going on behind the scenes are being filled in.

The really amazing thing, to me, though, is that it all works.

Today... (0)

Anonymous Coward | about a year and a half ago | (#41666599)

Well, the thing is, even today any decent university will require you to take at least two digital system design courses for a CS degree.

Re:Today... (1)

jythie (914043) | about a year and a half ago | (#41667953)

It varies... not only with the quality of the university but with time. There is a constant back and forth, with schools trying to teach fundamentals vs. 'industry ready' coursework, depending on what the people in charge feel that the market, at any given time, wants.

Love this course! (2)

euroq (1818100) | about a year and a half ago | (#41666657)

I love this! I am a hardcore developer who's done everything from assembly to Java. I have many non-technical friends who ask, "how does a computer work?" The short answer is that two electrical pulses, which we call either 0 or 1, go through something (a gate like NAND) to get an output of 0 or 1, and you combine that into a massive logic puzzle to get a computer. This course describes everything in detail. Love it. Well, not "everything," but certainly everything non-educated but technical people want to know.
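
The first step of that "massive logic puzzle" is already arithmetic. A sketch of a half adder in C, the piece that turns gates into addition (names are illustrative):

    #include <stdio.h>

    /* Half adder: adds two bits. sum = A XOR B, carry = A AND B.
       Chain these (via full adders) and you get multi-bit addition. */
    void half_add(int a, int b, int *sum, int *carry) {
        *sum   = a ^ b;
        *carry = a & b;
    }

    int main(void) {
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++) {
                int s, c;
                half_add(a, b, &s, &c);
                printf("%d + %d = carry %d, sum %d\n", a, b, c, s);
            }
        return 0;
    }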

Starting at logic gates seems arbitrary. (1)

Brannon (221550) | about a year and a half ago | (#41669319)

I think it makes more sense to start with the "transistor is a switch" level of abstraction. Make some logic gates out of transistors and then go from there. Most people have heard that "computers are made out of transistors" and have probably always wondered where exactly they fit in.

One could try to start even lower, and introduce some basic semiconductor physics -- but I'm not sure of a clean way of introducing those concepts without lying a lot.

Not so sure this will work... (2, Funny)

Foske (144771) | about a year and a half ago | (#41666665)

Quite tough to align multiple NAND gates without open spaces between them. With one circular and three flat faces it doesn't fit well. Oww, and that annoying circle for the inverter... Happy that I live in Europe: though I dislike our (square) logic gate symbols, they are great for Tetris: http://en.wikipedia.org/wiki/Logic_gate#Symbols [wikipedia.org] .

See: we Europeans beat the USA even at logic-gate Tetris!

Bottom Up Approach (1, Interesting)

Anonymous Coward | about a year and a half ago | (#41666689)

I agree with the bottom up approach to learning programming & CS.

I started off learning BASIC, but it was very slow going. As soon as I got into 8086 Assembly everything clicked into place. After making a few simple games, I built my own Boot loaders and toy OSs, Languages, etc. without any teaching or courses needed. I remember thinking, "I wonder how you make a bootable disk?", Turns out everything I needed to know was in the BIOS documentation: Interrupt 19 [ctyme.com]

I agree that knowing about the circuitry was neat, and necessary for completeness, but it's by no means essential for CS, IMO. Learning the electronics didn't really help me much because at the instruction level even things like CALL are abstractions for more logic... Maybe if I had been learning on RISC hardware, or had been content to stay in some VM Sandbox I'd have a different view?

The difference between the NAND to Tetris course and taking a crash course on (x86 | ARM) Assembly + RTFM for your BIOS and CPU is that, in the end, my programs actually worked on Real hardware. That makes my efforts far more useful immediately and far more relevant. For instance, I built disk partition tools and undelete utilities and actually sold them on a BBS I built myself, and I still use them to this day. I've got a .ISO with my OS and disk tools, and some games, that I can put on Floppies, HDs, USB drives, CDs / DVDs, etc. and boot on any x86 or x64 system.

I understand the course is for Universities, and so contains a certain level of depth, but IMO, after you study how NANDs can perform computations they should switch to ARM or x86 assembly using Real hardware -- A Raspberry PI costs less than their damn physical book!

Furthermore, NANDs alone do not a computer make. That's just WRONG. You need something to pulse the logic gates, like a Capacitor. If you want it to be actually useful then it'll need I/O: You'll need some buttons and a display. If you're only going to teach CS, then why focus on the irrelevant underlying circuitry to such a large degree? When I was 18 I actually wired transistors, capacitors, resistors, LEDs and button switches together to "Program" an actual working Tetris game -- it didn't require a VM to run, and my friends were far more impressed than they would have been with some program inside an existing computer.

My point is: Don't preach building computers from scratch unless you're actually going to enable folks to do just that in real life. If it's a CS course then utilize an actual chipset. IMHO, Save time & download (QEMU | VMWare) + (GNU Assembler | NASM) then start building your own bootstrap loader that will run on real hardware. [osdev.org]

Re:Bottom Up Approach (4, Insightful)

Anonymous Coward | about a year and a half ago | (#41666837)

Your reply just made Donald Knuth cry.

Sometimes you need to learn a generic and simplified technology before you can comprehend the incredibly complex and optimized real world examples. And sometimes real world examples are so narrowly designed that you would lose out on a general understanding of computing by focusing on that one design. Finally, sometimes real world examples carry the baggage of the past which can waste valuable time.

Re:Bottom Up Approach (1)

Rockoon (1252108) | about a year and a half ago | (#41667071)

...you are talking about valuable time, but seem to be suggesting that he learn MMIX? Really?

Re:Bottom Up Approach (1)

vlm (69642) | about a year and a half ago | (#41667341)

...you are talking about valuable time, but seem to be suggesting that he learn MMIX? Really?

TAOCP is not a MMIX textbook... Wait till you learn what else is in the books.... Learning MMIX was about 0.001% of the way....

Not necessarily bottom up (1)

betterunixthanunix (980855) | about a year and a half ago | (#41669045)

I agree with the bottom up approach to learning programming & CS.

This is not necessarily a "bottom up" approach. Digital logic winds up being pretty important to know for writing very high level programs -- secure multiparty computation, model checking and formal verification systems, automated theorem proving, and so forth. This is not a stack, it is a cycle.

Computer science is magic (1)

martyros (588782) | about a year and a half ago | (#41666713)

I haven't taken this particular course, but the "Introduction to Computer Design" course at my university, where we started with AND and OR gates and ended by building a simple microprocessor, was definitely one of my favorites. It definitely had the feeling of magic: you figure out what you want to do, put together a bunch of random bits of logic, draw a box around it, and suddenly you've got an adder or an instruction decoder. I still feel that way whenever I write a really new bit of functionality.

Cue the "real programmers' jokes (3, Insightful)

Artifakt (700173) | about a year and a half ago | (#41666725)

I'm not saying it's a good idea to develop an elitist attitude towards the people that use them, but this explains why there's some rational basis for looking down on scripting languages. It's not that they are inherently bad or that the people who use them lack the ability to do 'real programming'. But they are basically all about not having to know anything at all about how the other layers of abstraction work, and a consequence is they also don't give the programmer any real connection to how the hardware layer works and how you get from it to what they know.
            For example, if you know how an OS is generally compiled in a language such as C or C++, then the next step is understanding that the compiler is itself running 'on a level above' assembly language. Understand that, and it's a straightforward conclusion that a program can always be written in assembly that bypasses ANY controls the OS has about accessing different parts of memory, doing file copying, assigning user and admin permissions, and similar things. That program may be much less portable than something written in Perl, but it's inherently very powerful at what it does. It's not that people who program in assembly are necessarily any smarter or better at it than people who write Python. That's certainly debatable. The thing that isn't debatable is that the closer a programmer gets to machine language, the more they can do that nothing higher in the hierarchy can stop, position itself against, or even detect. At some point, that means trying to secure scripted code, or compiled code, or anything above assembly is like trying to defend a point with what may be a perfectly good machine gun, but the other side is the only one with stealthed, antimatter-pumped, orbital X-ray laser arrays. They can have sloppy aim, lack elegance and inspiration, and still win.
            Nowadays, there are plenty of people working with a modern OS, even one that is still all compiled at just one level above assembly (if there are any real systems that you want to count as modern that still fit that, what with Silverlight, .NET, Flash and so on on just about every machine out there), who don't understand the hierarchical nature of coding worth a damn. It seems to get worse as you get to people writing applications for the various OSes. Some of these people are very good coders (or scripters, or whatever), but they really just can't write secure apps, because they don't really understand the difference between a script kiddie attacker and a threat whole governments wish they could get on their side.
          That's just one of the things this course and others like it are supposed to fix. A lot of us need this. Hell, I've known this stuff for 35-40 years, and this reminds me I should get out the old books and do a little refresher. If you've read things about coding becoming as professional as aerospace engineering or similar, and found yourself agreeing with any of them, this is where it starts.

Re:Cue the "real programmers' jokes (-1)

Anonymous Coward | about a year and a half ago | (#41666783)

I just fucking shit my fuck!

Ba-dum-tsh!

Re:Cue the "real programmers' jokes (1)

Anonymous Coward | about a year and a half ago | (#41666883)

it's a straightforward conclusion that a program can always be written in assembly that bypasses ANY controls the OS has about accessing different parts of memory, doing file copying, assigning user and admin permissions, and similar things.

I think you fail to understand how the OS controls these permissions. The operating system sets the level of protection in the hardware, and if a program violates the protection without the appropriate hardware privilege level, the microprocessor will trigger a fault which transfers control back to the OS, which will then terminate the program (GPF). The only way to change the privilege level is through an operating system call.

Re:Cue the "real programmers' jokes (2)

mcgrew (92797) | about a year and a half ago | (#41670127)

I wrote a program twenty years ago that would reboot your computer. The program was four bytes long; its name took more disk space than its code. Of course, I didn't use an assembler, just DOS Debug.

The closer you get to the bare wires, the more damage you can do.

Re:Cue the "real programmers' jokes (0)

Anonymous Coward | about a year and a half ago | (#41676353)

I did the same by typing "from hells heart I stab at thee" into Notepad. After saving it over KERNEL32.DLL, I rebooted and the system was right hacked off.

Re:Cue the "real programmers' jokes (0)

Anonymous Coward | about a year and a half ago | (#41671361)

The only way to change the privilege level is through an operating system call.

Unless you can find a hardware error (rare as rocking horse shit), or a virtual machine error (hen's teeth, but still easier), or an OS error (happens all the goddamned time).

Re:Cue the "real programmers' jokes (4, Insightful)

Rockoon (1252108) | about a year and a half ago | (#41667097)

I'm not saying it's a good idea to develop an elitist attitude towards the people that use them, but this explains why there's some rational basis for looking down on scripting languages. It's not that they are inherently bad or that the people who use them lack the ability to do 'real programming'. But they are basically all about not having to know anything at all about how the other layers of abstraction work, and a consequence is they also don't give the programmer any real connection to how the hardware layer works and how you get from it to what they know.

The same argument could be used against C++, or C, and not just scripting languages like you claim. I know that most C programmers think they are doing low level programming, but they aren't.

For example, if you know how an OS is generally compiled in a language such as C or C++, then the next step is understanding that the compiler is itself running 'on a level above' assembly language. Understand that, and it's a straightforward conclusion that a program can always be written in assembly that bypasses ANY controls the OS has about accessing different parts of memory, doing file copying, assigning user and admin permissions, and similar things.

Umm, no. Just no. I have a great idea.. when you don't know what you are talking about, don't fucking talk. We both know that you don't know what you are talking about, which leads to the conclusion that you like to pretend to know what you are talking about... in short, you are a dishonest fuck.

Re:Cue the "real programmers' jokes (1)

Arker (91948) | about a year and a half ago | (#41668985)

The same argument could be used against C++, or C, and not just scripting languages like you claim. I know that most C programmers think they are doing low level programming, but they aren't.

I'm a little confused by your +5 mod; I usually get flambaited into oblivion when I say so, but I was taught that C, along with Pascal, pretty much embodied 'high level programming language', and anything more abstract isn't programming, it is scripting. Assembler is still quite abstract. Low level programming means hexadecimal.

Re:Cue the "real programmers' jokes (0)

Anonymous Coward | about a year and a half ago | (#41669613)

Hex? Binary - flipping toggles on a 12-bit microprocessor (DEC PDP-8e). Kids these days. Get off of my lawn!
(Or for the pansies, the PDP-8e's 12-bit word could be represented by 4 OCTAL digits.)

Re:Cue the "real programmers' jokes (1)

mk1004 (2488060) | about a year and a half ago | (#41671275)

The newbies, of course, will have to check wikipedia to find out about octal. Dang kids!

Re:Cue the "real programmers' jokes (1)

Arker (91948) | about a year and a half ago | (#41672177)

Hex and octal are easily converted by hand into binary and back, so I don't see too much difference there. It was the macro in the macro assemblers that first justified the 'high level' moniker.

Re:Cue the "real programmers' jokes (1)

Rockoon (1252108) | about a year and a half ago | (#41673989)

I'm a little confused by your +5 mod; I usually get flambaited into oblivion when I say so, but I was taught that C, along with Pascal, pretty much embodied 'high level programming language', and anything more abstract isn't programming, it is scripting.

I too was taught that C was a high level programming language. However, over the years, the C zealots seem to have been successful at redefining what a low level programming language is to include their pet favorite language. I was not taught that other languages were scripting. I was taught that there were assemblers, compilers, and interpreters.

C isn't exclusive to compilers. There are plenty of C interpreters.
Java's byte code, on the right architecture, is a full-fledged assembly language.
JavaSCRIPT, of all things, is now most often compiled before execution.

This idea that 'that's scripting' and 'that's not scripting' is distracting you from the underlying reality.

Assembler is still quite abstract. Low level programming means hexadecimal.

The differences between assembler and microcode are only in the syntax and the utility. Assembler is abstract, but it's not based on an abstract machine. C is always (so far) based on an entirely abstract machine.

Re:Cue the "real programmers' jokes (1)

Arker (91948) | about a year and a half ago | (#41675791)

The difference has nothing to do with compiled versus interpreted, that is a common misconception because early scripting languages were more often than not interpreted, but it's never been a hard and fast rule. Scripting is stringing together pre-existing tools with i/o and control logic. You can probably script in any language if you want to but some languages are written specifically for it. It's not a bad thing, it's very often the quickest most efficient way to deal with things.

Re:Cue the "real programmers' jokes (1)

wdef (1050680) | about a year and a half ago | (#41678533)

Most famously, Perl people can get their knickers in a knot about calling a Perl program a "script", which is felt to be diminishing (rationally or not), since Perl is compiled to bytecode (depending on how it runs) and thus is not an interpreted or so-called "scripting" language. It can be autoconverted to C (not sure how good that is) and it can be compiled to binaries.

Re:Cue the "real programmers' jokes (0)

Anonymous Coward | about a year and a half ago | (#41669665)

I know that most C programmers think they are doing low level programming, but they aren't.

As someone who has done a lot of assembly and embedded work on systems going back to days of yore, I still view C as pretty low level. I can look at most lines of C and, short of abuse of macros, have a good idea what machine instructions would result. With C++, for a lot of bits and pieces I can see what machine instructions would come out, but especially when you get to heavy use of templates and metaprogramming, that disappears. By the time you get to Python, I have little sense of connection to the machine instructions, even though I've worked with bits of Python's back end before and have some idea of what parts of it are doing.

C doesn't require you to know everything the CPU is doing, but that depends on what you are using it for. In some cases you still end up with functions that are really just a way of throwing some machine instruction into code directly, and can get yourself into trouble if you use a machine with a different representation of signed integers...

Re:Cue the "real programmers' jokes (1)

Rockoon (1252108) | about a year and a half ago | (#41674181)

As someone who has done a lot of assembly and embedded work on systems going back to days of yore, I still view C as pretty low level. I can look at most lines of C and, short of abuse of macros, have a good idea what machine instructions would result.

I can look at VB6 code and have a good idea what machine instructions will result. This line of reasoning is obviously not a testament to how low level a language is. It's instead a testament to your familiarity with assembly language.

My test is far more pragmatic: I can of course coerce the compiler (without ASM blocks) to emit some specific instructions on demand, but are some of the instructions that I cannot coerce out of the compiler generally useful? Let's consider your favorite x86 C compiler: can you coerce it to emit rotate-through-carry instructions (RCL and RCR)?

The flags register is a fundamental part of the x86 (and many other) architectures, yet the C language has no concept at all of a flags register, to the point where you cannot even coerce many useful flag-related instructions. Not low level.
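
For contrast, plain rotates (not through the carry) sit on the other side of that line: mainstream x86 compilers typically recognize the following C idiom and emit a single ROL, while no comparable idiom reaches RCL. A sketch (the rotate amount is arbitrary):

    #include <stdio.h>
    #include <stdint.h>

    /* Rotate left by a fixed 8 bits. Most x86 compilers compile this
       pattern to one ROL instruction. There is no analogous C pattern
       for RCL, because C has no notion of a carry flag. */
    static uint32_t rotl8(uint32_t x) {
        return (x << 8) | (x >> 24);
    }

    int main(void) {
        printf("%08X\n", (unsigned)rotl8(0xDEADBEEF));  /* ADBEEFDE */
        return 0;
    }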

Re:Cue the "real programmers' jokes (0)

Anonymous Coward | about a year and a half ago | (#41675669)

You can get C code to use rotate through carry instructions, whether hidden behind some function or just done with inline assembly. Can you intermix assembly and VB6 code though?

Someone else mentions they think assembly is high level, that low level means using machine code. To me, there is a difference between an abstraction and a feature that is a convenience. Simple assembly code is identical to programming in machine language, just with easier-to-read symbols. On top of that, you can easily add a macro system to save yourself some of the tedium of writing out simple patterns. Add type checking, and you are at about the simple use of C code. C offers some abstractions, many of which are just hidden behind function calls that can then use system-specific implementations. And if you ignore portability and make all the assumptions about a particular system, you end up with something that is not much more than an easier-to-read version of the assembly instructions. I haven't had any problems doing so, especially on embedded systems and microcontrollers, where there is no OS to handle the system stuff, and you need access to every possible instruction in the set.

Re:Cue the "real programmers' jokes (0)

Anonymous Coward | about a year and a half ago | (#41671487)

Umm, no. Just no. I have a great idea.. when you don't know what you are talking about, don't fucking talk. We both know that you don't know what you are talking about, which leads to the conclusion that you like to pretend to know what you are talking about... in short, you are a dishonest fuck.

I wouldn't go into this with the assumption that everybody is a dishonest fuck when ignorance is a vastly simpler explanation. At a very crude approximation, the notion that there are layers of abstraction within a computer language is really the concept that matters, and at the very least it provides a foundation to correct mistakes with.

We could also continue to educate by vitriol; I'll give it a (Score:5, Inciteful)

Re:Cue the "real programmers' jokes (1)

Rockoon (1252108) | about a year and a half ago | (#41674211)

I wouldn't go into this with the assumption that everybody is a dishonest fuck when ignorance is a vastly simpler explanation.

I didn't call him a dishonest fuck because he was ignorant. I called him a dishonest fuck because there is no way that he could not have known that he was ignorant.

Re:Cue the "real programmers' jokes (0)

Anonymous Coward | about a year and a half ago | (#41667655)

This is one heck of a post. Ten years ago, I probably would have written an angry point-by-point counterargument and called you names. Five years ago, I probably would have assumed that you were unaware of why your post made people so angry and explained why in a long and kindly worded response.

Today, I think I will just take a break to appreciate the clarity with which this post expresses a certain archetypal view of programming. I guess that means I think you're trolling. It is spectacular, whatever your intent.

Re:Cue the "real programmers' jokes (1)

dkleinsc (563838) | about a year and a half ago | (#41667945)

Two objections:
1. The programmers who use scripting languages extensively often understand lower-level code, even down to the machine code, but choose not to use it because it creates a whole bunch of unnecessary headaches.
2. Well-written scripting languages ensure that the lower-level layer is not vulnerable to the most likely forms of attack, like buffer overflows. That means that the lower-level attack doesn't work, so in your scenario you might have a good machine gun and a roof to protect your position from orbital X-ray laser arrays.

Re:Cue the "real programmers' jokes (1)

Artifakt (700173) | about a year and a half ago | (#41669989)

On 1. it's not about choosing to use a certain language. Yes there are plenty of valid reasons for choosing a scripting language for certain tasks. I tend to disagree that anyone who has only learned scripting languages is all that likely to have learned enough about the progression from machine language to whatever interprets their script. I don't think it happens enough to use that word 'often' the way you are using it. I don't even think that the people programming in classical compiled languages such as C tend to be exposed to hardware and machine language concepts enough to use 'often' analogously. Maybe I'm wrong in thinking the problem gets worse with people who don't know a compiled language, but even if someone who only works in Javascript is equally likely to know what an assembler does as is someone coding C#, well, not enough people coding C# or C++ or good ole C know enough about assemblers either.

On 2. I'm not clear just how you mean that. Are you arguing that the application code itself can be written to block the most likely lower level attacks, or that it's normal for scripts written up to standard to actually succeed at blocking them, or that the various interpreters themselves are written to block at least the most likely lower level attacks?
I'm also unsure if you're including some of the dynamic languages that frequently extend beyond just scripting modes to get your "well written scripting languages". Are we talking TCL, Perl, and Ruby here, or Javascript, PHP, and Microsoft ASP? How much does a Python writer need to know about the interpreter or the OS his or her script or executable will run on? Is being able to use Python as a scripting language, but not properly create executables, embed Python in C or C++, or write new modules for Python in C, C++ or Cython enough to get good protection? It looks to me like protection from even the most common forms of attack depends largely on the language having dynamic elements that go beyond scripting, and it's the people who know those elements, and so simply have to know something of compiled languages and assembly, that make that good protection possible.
        I won't go as far as to claim that the more a language is 'purely' for scripting, the more it's been prone to security flaws. I don't think we have 100% correlation there - but I do think we have some.

Re:Cue the "real programmers' jokes (1)

dkleinsc (563838) | about a year and a half ago | (#41672309)

My argument on point 1 is that it's hardly unusual for a 4-year computer science program to expose students to low-level code (and demand they write some low-level code). You show a developer who mostly writes in Python some C or assembler and there's a good chance they'll understand what they're looking at.

Regarding point 2, the specific phenomenon that I'm going to focus on is classic buffer overflows, because those still are some of the most common forms of attack (see CWE-120 [mitre.org] ). In C, a programmer has to be very careful to manage their mallocs and C arrays (particularly stack-allocated arrays) and pointers in general to ensure that they can't walk off the end of a data structure, or point to their own running code, or stuff 1100 bytes into a 1000-byte spot in memory. In, say, Python, because the interpreter has a lot of checking for this, and because the structures in C that allow these kinds of vulnerabilities aren't available directly, they're a non-issue. The scripting language interpreter/compiler solves that problem once for you, and then you don't have to solve it again. And the effect is that higher-level applications tend to be more vulnerable to application-layer mistakes like SQL injection than they are to buffer overflows.
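
A minimal sketch in C of the footgun being described, with the bounded alternative next to it (the buffer size and input string are illustrative):

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char buf[16];
        const char *input = "definitely more than sixteen bytes";

        /* Unsafe: strcpy trusts that the destination is big enough.
           With this input it would write past the end of buf. */
        /* strcpy(buf, input); */

        /* Safer: bound the copy by the destination's size. */
        snprintf(buf, sizeof buf, "%s", input);

        printf("%s\n", buf);
        return 0;
    }

In Python, the equivalent slip simply raises an error or grows the object; the language never hands you a raw fixed-size buffer to overrun.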

Re:Cue the "real programmers' jokes (1)

DMUTPeregrine (612791) | about a year and a half ago | (#41674967)

The only programming language I know is Python.

But I also know Verilog, and use it a good bit more than the Python.

If I could have an HDL that acted like Python, I'd love it. Languages (HDL and Programming) are tools, and the high-level ones tend to be better tools. The low-level ones tend to be good for learning, but there's a lot to be said for the effort saved by using a high-level language.

Re:Cue the "real programmers' jokes (1)

ByteSlicer (735276) | about a year and a half ago | (#41669351)

a program can always be written in assembly that bypasses ANY controls the OS has about accessing different parts of memory, doing file copying, assigning user and admin permissions, and similar things.

You should read up on things like protected mode (30 years old), sandboxing, hypervisors... All used to make sure your application doesn't just read/write whatever memory/ports it wants.
And just because a compiler translates C to assembly doesn't mean you have full control over the generated assembly by tweaking the C code (that's why there is inline assembly).

The real problem is that the average Joe user doesn't want to be inconvenienced by all this security, he just wants to be able to run that dancing monkey screen saver, and doesn't care that running it with his access rights gives the thing access to his whole disk and the internet. And he will happily click Allow when asked to install a trojan dancing naked girl.

I am very proud today... (2)

jkrise (535370) | about a year and a half ago | (#41666737)

This speech at TED was featured at my local Linux User Group, based in South India, a few weeks back. I am planning on conducting this course for my college students in the CS and IT branches.

The video also features Pramode CE, who runs a consultancy for Embedded Systems and Software in nearby Trichur in Kerala State.

Very nice course, one I fully recommend for all ages.

Wha? (1)

rikkards (98006) | about a year and a half ago | (#41666989)

Anyone else first read that as NAND2Tardis?
Need coffee

Re:Wha? (1)

Half-pint HAL (718102) | about a year and a half ago | (#41667111)

Any ful kno the TARDIS is analogue....

Re:Wha? (1)

pugugly (152978) | about a year and a half ago | (#41667377)

No, the TARDIS actually does the low-level operations in digitally manipulated Planck-time units.

The analog interface does have a discrepancy of +- 10^102nd, but this has no practical effect on the operations and can be ignored by the end user.

I worked through this entire book (0)

Anonymous Coward | about a year and a half ago | (#41667203)

I did all the projects carefully, and the book blew my mind. It is today my favorite textbook in any subject, and I don't even focus on computer science. It was so enlightening and so fun. I know how to make a computer now.

One of the best courses I've ever taken (2, Informative)

Anonymous Coward | about a year and a half ago | (#41667261)

This was given as a course in my university (the Hebrew University of Jerusalem) eight years ago, in which the lecturer was the other creator (Prof. Noam Nisan).
I think the beauty of it is the fact that you manage to understand the principles of each level of the architecture without going into great depth on each one, and so it manages to remain interesting throughout the course.
At the end of the course, we ended up writing Xonix (which involved writing a non-recursive flood-fill algorithm).
Some team even wrote Sokoban.
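
That non-recursive flood fill is a nice little exercise in itself: replace the recursion with an explicit stack. A sketch in C (grid size, wall, and seed point are illustrative):

    #include <stdio.h>

    #define W 8
    #define H 6

    /* Iterative 4-way flood fill: repaint the 'from'-colored region
       containing (sx, sy) with 'to', using an explicit stack. */
    void flood_fill(char g[H][W], int sx, int sy, char from, char to) {
        int stack[4 * W * H + 1][2], top = 0;
        if (from == to || g[sy][sx] != from) return;
        stack[top][0] = sx; stack[top][1] = sy; top++;
        while (top > 0) {
            top--;
            int x = stack[top][0], y = stack[top][1];
            if (x < 0 || x >= W || y < 0 || y >= H || g[y][x] != from)
                continue;
            g[y][x] = to;
            stack[top][0] = x + 1; stack[top][1] = y;     top++;
            stack[top][0] = x - 1; stack[top][1] = y;     top++;
            stack[top][0] = x;     stack[top][1] = y + 1; top++;
            stack[top][0] = x;     stack[top][1] = y - 1; top++;
        }
    }

    int main(void) {
        char g[H][W];
        for (int y = 0; y < H; y++)
            for (int x = 0; x < W; x++)
                g[y][x] = (x == 4) ? '#' : '.';   /* a wall splits the grid */
        flood_fill(g, 1, 1, '.', '*');            /* fills the left side only */
        for (int y = 0; y < H; y++) {
            for (int x = 0; x < W; x++)
                putchar(g[y][x]);
            putchar('\n');
        }
        return 0;
    }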

MIT has courses like this (0)

Anonymous Coward | about a year and a half ago | (#41667295)

Although MIT started at the transistor level when I took it, and it was two courses. The only big drawback was their insistence on spending time on LISP, which was very exciting to computer scientists and taught *horrible* habits of excessive recursion and drawing "levels of abstraction" related to modern "object oriented" practices which continue to cripple computer performance worldwide.

Where can I Teach this course? (1)

johnjaydk (584895) | about a year and a half ago | (#41667467)

I'm heavily into teaching and computer science so I'm always looking into this kind of stuff. I came across this book and worked my way through all the exercises.

I love this book/course so much that I'm looking for a place that'll allow me to teach this course. Salary not required. But it's all dot-net and stuff around here. Lamers.

The more things change (0)

Anonymous Coward | about a year and a half ago | (#41667583)

After 50 years messing with computers, including being there when the Illiac sang Daisy..., I am of the view that some problems that were present day one are still with us -- despite thousands of name changes, redefinitions and similar marketing hype. And too many repetitions of 'oh, that's obsolete... you really should be using...'. And some salesfolks will tell your boss that about you when you are not cooperating with their personal profit plan. Anyhow, since day one through until today, everything has to be stored someplace, and all sharing schemes are really just sleight of hand -- computers don't share anything (but have gotten very good at pretending). Nice to see timesharing coming back -- although I believe that the term 'cloud' probably best applies to the marketing around it. Besides, it's pretty meaningless if local connectivity is poor. And personally, I am not so sure that the thousands of programming 'languages' have done much good beyond pushing people out of the industry and making it easier for HR to hire the one with the .... Has certainly made it harder to learn from experience. But it is nice to see someone talking about the importance of understanding how it all comes together -- a pity it seems to be a minority view.

You don't start with the PNP junction? (1)

Anonymous Coward | about a year and a half ago | (#41667685)

The NAND gate is a pretty high level abstraction of the quantum mechanics taking place within the semiconductor junctions. Maybe start with vacuum tubes, since those are a much simpler implementation of NAND...

Re:You don't start with the PNP junction? (1)

Teancum (67324) | about a year and a half ago | (#41671601)

Why start with vacuum tubes when you can go all the way back to Leyden jar [wikipedia.org] batteries and spinning copper wires, using hand-made tools and raw ore you dug out of the ground? I suppose you also blow your own silica-based glass too?

Re:You don't start with the PNP junction? (1)

Hatta (162192) | about a year and a half ago | (#41675567)

NAND is more fundamental than transistors. You can build a computer without transistors, as you note with vacuum tubes. You cannot build a computer without using NAND. As he notes in the video, it's a CS course, not EE or physics.

Andrew Tanenbaum (1)

Coop (9778) | about a year and a half ago | (#41667767)

Tanenbaum's [wikipedia.org] Structured Computer Organization takes a similar approach, going from the Boolean logic of a transistor gate up to the OS and application level. I took a class with the first edition of Tanenbaum's book as the text in 1983, and I learned more about computers from it than from any other class before or since.

Great course (2)

Dishwasha (125561) | about a year and a half ago | (#41668349)

Coincidentally, I have been running through this course in my spare time, and I have to say it is the best I have found in 10 years. I've been itching to build a homebrew CPU like http://www.homebrewcpu.com/ [homebrewcpu.com] but lacked the basic skills to design a proper ALU and such. Most other courses either start way too basic and then shoot too far forward, or they gloss over the basics and go right into advanced concepts. So far I have made it through Chapter 2 and I'm proud to say that I've built all the basic components in HDL without looking anything up outside of the course material. Being able to build complex components on top of basic components I built myself is very rewarding. This is a must-take course if you want a more intimate understanding of how computers work. And if building a computer from basic gates isn't nerdy enough for you, build your own [youtube.com] transistors [youtube.com] .

Tetris v. Xio (1)

tepples (727027) | about a year and a half ago | (#41670173)

What would Alexey Pajitnov and Henk Rogers think about this? Their company successfully sued the publisher of an unlicensed Tetris clone for copyright infringement a few months ago.

Exaggerated comment is exaggerated. (1)

CodingHero (1545185) | about a year and a half ago | (#41670249)

. . .the only way to really understand what computers are all about.

Really? The ONLY way?

With a name like Schocken (0)

Anonymous Coward | about a year and a half ago | (#41671459)

With a name like Schocken, it *HAS* to be an excellent electronics course.

Nyark nyark nyark.

There's some bad physics going on here (1)

minkie (814488) | about a year and a half ago | (#41672933)

I looked at the "Getting Started With Digital Logic - Logic Gates" part. Anybody who has actually built something with TTL on a breadboard should know that 7400-series gates can sink a lot more current than they can source. Connecting a logic output to ground through an LED may not draw enough current to turn the LED on fully. The right way to do it is to connect the LED between the logic output and the Vcc rail in a pull-down configuration (with a current-limiting resistor). Of course, that gives you inverted logic (LED on means logic 0, LED off means logic 1), but you get used to that. If it bothers you, use an inverter.
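
A back-of-the-envelope check of that current-limiting resistor, as a sketch in C; the supply, LED drop, output-low voltage and target current below are typical assumed values, not figures from the comment:

    #include <stdio.h>

    int main(void) {
        /* LED wired from Vcc through a resistor into a TTL output
           that sinks current when it drives logic 0. */
        double vcc   = 5.0;    /* supply voltage */
        double v_led = 2.0;    /* assumed LED forward drop */
        double v_ol  = 0.4;    /* typical TTL output-low voltage */
        double i_led = 0.010;  /* target LED current: 10 mA */

        double r = (vcc - v_led - v_ol) / i_led;
        printf("R = %.0f ohms; round up to the next standard value\n", r);
        return 0;
    }

With these numbers R comes out to 260 ohms, so a standard 270-ohm part would do.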
