
60 Years of Hamming Codes

timothy posted more than 3 years ago | from the hamming-it-up dept.

Communications 66

swandives writes "In 1950 Bell Labs researcher Richard W. Hamming made a discovery that would lay an important foundation for the modern computing and communications industries — coming up with a method for performing computing operations on a large scale without errors. Hamming wrote about how self-checking circuits help eliminate errors in telephone central offices. He speculated the 'special codes' he proposed — which became known as Hamming codes — would only need to be applied to systems requiring unattended operation for long periods or 'extremely large and tightly integrated' systems where a single failure would incapacitate the entire installation. Hamming code was the first discovery in an immense field called coding theory. This article looks back on the history of Hamming codes, their applications, and includes interviews with Todd Moon, Professor of electrical and computer engineering at Utah State University and David MacKay, Professor of natural philosophy in the department of Physics at the University of Cambridge and chief scientific adviser to the UK Department of Energy and Climate Change. An interesting read, about a little-known but fundamental element of information theory."


66 comments


News For Nerds (4, Insightful)

DarkKnightRadick (268025) | more than 3 years ago | (#34343552)

News for nerds, stuff that matters.

This submission qualifies.

Re:News For Nerds (0)

Anonymous Coward | more than 3 years ago | (#34343764)

Never underestimate the importance of getting a favorable review. [ufl.edu]

Re:News For Nerds (3, Insightful)

starfishsystems (834319) | more than 3 years ago | (#34343898)

Hah! It's fundamental computer science, and its use is ubiquitous. It's the equivalent of Watson and Crick's discovery of the helical structure of DNA, which we learn about in elementary school.

You want to celebrate the history of ring theory, now that would qualify as nerdly.

Re:News For Nerds (3, Insightful)

DrSkwid (118965) | more than 3 years ago | (#34345726)

Did they tell you Crick was high on LSD at the time [hallucinogens.com] ?

Re:News For Nerds (1)

starfishsystems (834319) | more than 3 years ago | (#34345810)

Evidently then, highly accomplished and rational people can be in favor of hallucinogens. Not that I ever doubted it, but it's always nice to have evidence.

Re:News For Nerds (1)

starfishsystems (834319) | more than 3 years ago | (#34345860)

P.S. Alan Turing was gay, as we know now. We all learned of his enormous contributions to computer science, but when I was an undergrad it was never mentioned that the British government crushed him anyway.

Re:News For Nerds (0)

Anonymous Coward | more than 3 years ago | (#34344156)

News for nerds, stuff that matters.

This submission qualifies.

Not really news, since it's a look back at history.

Re:News For Nerds (1)

DarkKnightRadick (268025) | more than 3 years ago | (#34344780)

True, but it is stuff that matters. Sounds as if without Hamming Code (or whatever it would have been called had someone else discovered it) we wouldn't be where we are today.

Re:News For Nerds - Read his book! (4, Informative)

refactored (260886) | more than 3 years ago | (#34345002)

His book Coding and Information Theory [amazon.com] is by far the best written and most readable hard science textbook I ever had in my university career. Read it if you want to understand the subject, read it if you want to understand how to write a good textbook!

Re:News For Nerds - Read his book! (1)

e70838 (976799) | more than 3 years ago | (#34345826)

Information theory was not my option, but I was at Telecom Bretagne when turbo codes [wikipedia.org] were discovered and many theoretical aspects were demonstrated. Did turbo codes make the Hamming code obsolete, or am I missing something?

Re:News For Nerds - Read his book! (0)

Anonymous Coward | more than 3 years ago | (#34349634)

The history of cars was not my option, but I was at Volkswagen when the Golf was released. Did the Golf make Ford's Model T obsolete, or am I missing something?

Re:News For Nerds (1)

decipher_saint (72686) | more than 3 years ago | (#34344212)

Sad thing is that the story has been up for nearly 2 hours and there's less than 20 comments...

Re:News For Nerds (1)

DarkKnightRadick (268025) | more than 3 years ago | (#34344798)

I know. ):

Re:News For Nerds (2, Insightful)

Anonymous Coward | more than 3 years ago | (#34344440)

News for nerds, stuff that matters.

This submission qualifies.

And got 26 comments... while

    Ask Slashdot: Have I Lost My Gaming Mojo?

Got 300 plus...

Are nerds devalued nowadays, or has Slashdot acquired an entirely different demographic?
Maybe it's time to change the tagline?

Re:News For Nerds (1)

DarkKnightRadick (268025) | more than 3 years ago | (#34344760)

That's what I'm thinking (both: geeks are devalued, and /.'s demographics have changed).

Re:News For Nerds (1)

syousef (465911) | more than 3 years ago | (#34344778)

It's not news if it's decades old. The author admits in his article that the codes have well and truly been superseded. What's clear to me is that he's trying to publicise his book (he does link to the free PDF version, so I doubt his only motivation is money). He has reviewed his own book on Amazon twice: "Brings theory to life" and "An exciting and up-to-date text"

http://www.amazon.ca/product-reviews/0521642981/ref=cm_cr_dp_all_helpful?ie=UTF8&showViewpoints=1&sortBy=bySubmissionDateDescending [amazon.ca]

Customer reviews are not as glowing as his own. "Good book - but few arguments need revision from theorists" and "A reservoir of information - Yet few problems"


Little known? (0)

Anonymous Coward | more than 3 years ago | (#34343652)

You're kidding, right?

Re:Little known? (3, Insightful)

$RANDOMLUSER (804576) | more than 3 years ago | (#34343730)

That's right. Had Hamming's discovery been more well known, he might have won the Claude E. Shannon Award [wikipedia.org] .

DUCWIDT?

Re:Little known? (1)

wanax (46819) | more than 3 years ago | (#34347452)

Well, of course, he did win the Richard W. Hamming Medal [wikipedia.org] 16 years later. I think the issue with Hamming is that he was so productive, and at the forefront of the field for so long, that he never settled into the "grand old man" role that tends to attract awards. I have a pet theory that for most researchers, winning big awards is a signal of their decline, because the politics of award committees means that it's rare for somebody who still publishes controversial and original research to survive the nomination process. I mean, he wasn't even voted an IEEE Fellow until 18 years after he published the Hamming code! His career was so successful in so many areas that it took some time for the applied mathematics community to realize that all these really interesting little ideas in their own fields were the result of an avalanche of singular proportions.

Hamming clearly intended to work this way, and often contrasted himself with Shannon. He described his approach in a talk called "You and Your Research" [virginia.edu] delivered in 1986, which I'd highly recommend to any researcher who hasn't seen it.

Anonymous Coward (0)

Anonymous Coward | more than 3 years ago | (#34343668)

" about a little-known but fundamental element of information theory."

little-known? If you know anything about information theory, you know about hamming codes.

Re:Anonymous Coward (2, Informative)

Anonymous Coward | more than 3 years ago | (#34343884)

If you know anything about information theory, you know about hamming codes.
If you know anything about information theory, you know about hamming codes.
If you know anything about information theory, you know about hamming codes.
If you know anything about information theory, you know about hamming codes.
If you know anything about information theory, you know about hamming codes.

Re:Anonymous Coward (0)

Anonymous Coward | more than 3 years ago | (#34346246)

Yes, that's an n=5 repetition code.
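For anyone playing along: a repetition code really is decoded by majority vote over the received copies. A minimal sketch (the corrupted strings below are invented for illustration):

```python
from collections import Counter

def repetition_decode(copies):
    """Decode an n-copy repetition code: take a majority vote."""
    return Counter(copies).most_common(1)[0][0]

# n=5: three copies arrive clean, two arrive corrupted.
received = ["you know about hamming codes."] * 3 + \
           ["you kn0w about hamming c0des."] * 2
assert repetition_decode(received) == "you know about hamming codes."
```

With n copies, any minority of corrupted copies (fewer than n/2) is voted down, which is exactly why the scheme works but wastes so much bandwidth.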

Re:Anonymous Coward (0)

Anonymous Coward | more than 3 years ago | (#34348554)

I assume you're assuming it starts at 1, right?

Re:Anonymous Coward (4, Insightful)

Teancum (67324) | more than 3 years ago | (#34343950)

I'm sure a whole bunch of people know about Bill Gates and Steve Jobs, but their impact on computing and information theory is minor compared to Richard Hamming's. There are many such individuals who, if you have studied your history of computing, really ought to be much better known, but who are little talked about even in computing circles. Usually there is a theorem or algorithm which bears the name of an individual, like Dijkstra's algorithm, but who really knows much about Edsger Dijkstra, the guy who came up with it in the first place? For that matter, who even knows the names behind the LZW compression algorithm?

If you went to a group of college seniors in computer science, how many of them would have ever heard about Grace Hooper? First classmen in the Naval Academy? (I would sure hope that the U.S. Naval Academy at least would have taught their computer science cadets something about Admiral Hooper, especially if they get assigned to the USS Hooper).

There are a bunch of people you should know in the history of computing, and unless you have a very good professor (one who doesn't claim to have invented the integrated circuit and every other part of computing), you generally never learn the whys behind how most concepts in computer science were derived.

Re:Anonymous Coward (0, Informative)

Anonymous Coward | more than 3 years ago | (#34344048)

There's a good reason Grace Hooper isn't well known for her contributions to computing. Of course, the same can't be said of Grace Hopper, who among other things discovered the first bug in a computer system.

Re:Anonymous Coward (1)

ZERO1ZERO (948669) | more than 3 years ago | (#34344330)

Exactly what I was thinking.

Re:Anonymous Coward (0)

Anonymous Coward | more than 3 years ago | (#34344350)

hahahhahahahah Mr Looper!

Re:Anonymous Coward (3, Informative)

joshki (152061) | more than 3 years ago | (#34344402)

There is no Admiral Grace Hooper. There was, however, an Admiral Grace Hopper, who has a few things named after her (including Hopper Hall at Dam Neck).

Re:Anonymous Coward (1)

Jay L (74152) | more than 3 years ago | (#34350624)

There is no Admiral Grace Hooper

OK, but there was Mr. Hooper on Sesame Street, whose death was handled with grace, which was admirable. So that's almost the same thing.

Re:Anonymous Coward (2, Funny)

NoOneInParticular (221808) | more than 3 years ago | (#34344498)

Actually, Hamming is a bit overrated. He was a tireless self-promoter who named these things after himself. Really a no-no. Hamming codes and Hamming distance are fairly simple constructs, and by no means the first, the last, or the most significant in the field. The real giant in the field is of course Claude Shannon, who, in my opinion, has been more important for the field of computing as a whole than even Alan Turing himself (Shannon's master's thesis, for instance, showed how to implement arbitrary Boolean logic with relay circuits, in 1937). Hamming is just a footnote. He found a good algorithm and ran with it for fame and glory.

Re:Anonymous Coward (1)

CODiNE (27417) | more than 3 years ago | (#34344578)

If you went to a group of college seniors in computer science, how many of them would have ever heard about Grace Hooper? First classmen in the Naval Academy? (I would sure hope that the U.S. Naval Academy at least would have taught their computer science cadets something about Admiral Hooper, especially if they get assigned to the USS Hooper).

A very nice post but it made me go "Ouch!" every time you called her "Hooper".

Re:Anonymous Coward (0)

Anonymous Coward | more than 3 years ago | (#34344820)

LZW compression?? I think the names are Lempel, Ziv and W. something (sorry). But then, it's Grace Hopper, which is easy to remember if you think of a hopper full of punch cards. :)

Re:Anonymous Coward (1)

Sulphur (1548251) | more than 3 years ago | (#34347128)

Terry Welch.

Re:Correcting Grace's name (1)

geekcoach (1046108) | more than 3 years ago | (#34345048)

Not to hijack the information about important work by Hamming, but those interested in Grace will have an easier time locating information about her searching on Grace Hopper, Navy Rear Admiral, Lower Half. Her 1986 David Letterman interview: http://www.youtube.com/watch?v=57bfxsiVTd4 [youtube.com]

Re:Anonymous Coward (1)

t14m4t (205907) | more than 3 years ago | (#34354932)

Midshipmen majoring in Computer Science at the US Naval Academy (my major and alma mater, class of '00) are indeed cognizant of Admiral Hopper, though I don't think there's anything specifically that teaches about her contributions. Part of this (and here I start to hypothesize) is the relative age - ADM Hopper's contributions, though extremely important and noteworthy, are relatively recent, in comparison to the rest of what goes on at USNA - the goal is, after all, to provide highly technically-trained graduates to drive ships, not go on to academic careers. Much of the infrastructure and heritage stems from the people and events of the Revolutionary War (aka "War for American Independence") through World War II, heavily favoring the mid- to late-1800's. Operational topics before and after that (and during, to give meaning and context to the heritage) are taught in classroom settings. But though ADM Hopper's contributions to the field of computer science are important, at best it's the contributions that are taught (not the name), and definitely not in an operational context (she spent her entire career as a reservist and rarely was operational).

Weylin

See Feynman's Lectures on Computation (5, Informative)

radio4fan (304271) | more than 3 years ago | (#34343686)

Feynman's excellent book 'Lectures on Computation' has a fantastic explanation of Hamming codes and distance, error correction etc.

If you're even remotely interested in information theory you *must* read this book! No prior knowledge required.

If you're a cheap bastard I'm sure you can find a pdf, but it's well worth the asking price.

Re:See Feynman's Lectures on Computation (3, Insightful)

puto (533470) | more than 3 years ago | (#34343812)

Anything by Feynman is worth it. I am 41 years old and I remember seeing Super 8 films of him in grammar school, shown by my science teacher. Brilliant and engaging.

Anything by Feynman... (3, Interesting)

refactored (260886) | more than 3 years ago | (#34345266)

... is excessively vague and handwavy, requiring literally hours and pages of close work to fill in the gaps between equation N that he shows and the next equation N+1 that he says follows from it.

Yup, a brilliant guy I'm sure, but not the guy I want teaching me. At the end of a course, (call me greedy), _I_ want to know how to do everything in the course, not merely have a warm fuzzy WEE-WOW feeling that something exciting just went by that I can't quite reproduce.

Give me Richard Hamming's books instead of Feynman's any day. Ok, they won't make you go "WEEE WOW!!!", but on the other hand you will have an excellent understanding of the material AND be able to reproduce and USE any result in them yourself.

Re:Anything by Feynman... (1)

godunc (1836034) | more than 3 years ago | (#34347408)

At the end of a course, (call me greedy), _I_ want to know how to do everything in the course, not merely have a warm fuzzy WEE-WOW feeling that something exciting just went by that I can't quite reproduce.

Holy flamebait; you are talking about America's most famous physics teacher. You were expecting a "how to" book from a Nobel laureate?

Re:Anything by Feynman... (1)

refactored (260886) | more than 3 years ago | (#34348696)

Flamebait, nothing; a wake-up call to see past the hype... yes.

Maybe if I were Nobel Laureate material, I'd agree with you.

However, Feynman is not a suitable teacher for me, nor for that 99.9% of humanity we call "mere ordinary mortals".

My wife, bless her, is quite capable of digesting and following Feynman's books, but then her skills are "world class" (to me "goddess-like").

By far most maths / physics university graduates, let's be honest now, get a "Whee! Wow! That was exciting!" feeling when reading Feynman...

Very very few can then move on to doing anything useful with what they should have just learnt.

Learning to do stuff with that material takes either genius, or days and days of hard slogging through the math, page after page. (Often both.)

Re:Anything by Feynman... (0)

Anonymous Coward | more than 3 years ago | (#34357852)

Fuck you. I'm a new physicist - fresh out of college, well at least fresh into graduate school - and the Feynman lectures, both the books and the recorded lectures, helped give me the bigger picture of physics. And that was what I was missing from all of my physics classes - how it all ties together, from astrophysics to general relativity to quantum electrodynamics. Each class teaches you how some small part of the world works, but nobody ties anything together; it's left as an exercise to the student. Fuck that, and fuck you for ignoring how helpful one of the two most brilliant men in the history of the world can be to a serious student - what works for you may not work for somebody else.

Re:See Feynman's Lectures on Computation (1)

travisco_nabisco (817002) | more than 3 years ago | (#34343880)

I'll have to look this up. I took an undergrad engineering elective in Error Coding and found it to be one of the most fascinating subjects I have been exposed to. The mathematics behind it really are amazing.

Re:See Feynman's Lectures on Computation (3, Funny)

martin-boundary (547041) | more than 3 years ago | (#34345224)

If you're a cheap bastard I'm sure you can find a pdf, but it's well worth the asking price.

Exactly. Remember kids, the money you spend goes directly to Richard Feynman, so he can continue to write excellent books from beyond the grave.

TFA mentions Mackay's book. (1)

serviscope_minor (664417) | more than 3 years ago | (#34343688)

TFA mentions Mackay's book. It is an awesome book, and is free online. I have a dead-tree copy, too. Well worth the price.

Re:TFA mentions Mackay's book. (0)

Anonymous Coward | more than 3 years ago | (#34343862)

Any ePub/Mobi version available?

Another Mackay book. (2, Informative)

northerner (651751) | more than 3 years ago | (#34344688)

Mackay also has another book which may be interesting.

"Sustainable energy - without the hot air", available as a free PDF download.

I haven't read it yet, but I will, given the credibility he earned from the article and the other book.

http://www.withouthotair.com/ [withouthotair.com] or http://www.inference.phy.cam.ac.uk/withouthotair/ [cam.ac.uk]

Here is a podcast of a lecture he gave at Cambridge on the topic of sustainable energy.

http://mediaplayer.group.cam.ac.uk/component/option,com_mediadb/task,play/idstr,CU-CSF-Lectures_2008-12_David_MacKay/vv,-1/Itemid,42 [cam.ac.uk]

Impossible (0, Funny)

Anonymous Coward | more than 3 years ago | (#34343786)

I've been assured many times by Slashdotters that the only reason we have technology and computers is because of the '60s Space Race. I refuse to believe that people are smart by default and discover things on their own. Obviously there can only be progress when there's rockets or people floating around in free fall doing nothing. So clearly, Hamming codes were invented in space, by Mars colonists mining asteroids or something. This whole "telecom" thing and using computers for scientific purposes is a fad.

Re:Impossible (1)

$RANDOMLUSER (804576) | more than 3 years ago | (#34343948)

Not to burst your delusions, but MUCH of the Bell Labs research in this area had to do with communications (i.e. telephone) SATELLITES.

Re:Impossible (0)

Anonymous Coward | more than 3 years ago | (#34345272)

None of which has the least to do with manned space exploration, colonies or mining asteroids. I agree. It also shows that space uses technology AFTER it was invented, not the other way around which is what Space Nutters think. Thanks for backing me up, us rational sane adults need to stick together!

Re:Impossible (0)

Anonymous Coward | more than 3 years ago | (#34344470)

Slow down cowboy! Joe Biden said that every invention, every discovery, every innovation required government intervention. He says he's smarter than we are, so we should believe him.

first thought..... (0, Offtopic)

hoytak (1148181) | more than 3 years ago | (#34343892)

Hamcode, hamcode, where you been? Around the world and I'm going again...

Not the First Discovery in Coding Theory (1)

asnelt (1837090) | more than 3 years ago | (#34343914)

Hamming code was the first discovery in an immense field called coding theory

First discovery? I would say Shannon's historic paper on coding theory "A Mathematical Theory of Communication" from 1948 was earlier.

Re:Not the First Discovery in Coding Theory (0)

Anonymous Coward | more than 3 years ago | (#34344430)

the TFA makes it clear that they were contemporaneous. And, if you read the lecture by Hamming that is linked from the TFA, you'll get even more insight.

Re:Not the First Discovery in Coding Theory (1)

scatter_gather (649698) | more than 3 years ago | (#34344434)

To summarize the article that you seem not to have read: Shannon is cited as writing the seminal paper to which you refer, and in it he gave an existence proof for error-correcting codes. He did not, in his paper, actually go so far as to construct an ECC. According to TFA, Shannon is credited with creating the entire field of information theory. Not a bad accomplishment. Hamming was noted as actually creating ECCs and laying the foundation stone for coding theory. It's probably why they named the codes after him, hmm? Many codes more suited to today's computational needs have been developed since, but someone had to be first.

Re:Not the First Discovery in Coding Theory (3, Interesting)

rrohbeck (944847) | more than 3 years ago | (#34344574)

Shannon only proved that those codes exist. Hamming gave the first examples. So it's fair to say that Shannon was about information theory, not coding theory.

Re:Not the First Discovery in Coding Theory (2, Informative)

epine (68316) | more than 3 years ago | (#34345792)

Shannon's work was general over the error model. Coding theory assumes a specific error model (such as bit error rates, insertion, deletion, magnitude distortion).

It doesn't take much wit to translate Shannon's work into a rudimentary code. Constructing codes near the bounds of optimality is extremely difficult, especially if the decoder is to correct errors with better efficiency than combinatorial trial and error.

I wouldn't say Shannon's work was light on coding theory, much of which was implied pretty directly. I would say instead that his work was light on algorithmic efficiency of sophisticated codes.

By the time you're correcting 5 bit errors in a 256 bit packet, you don't want to have to search for the nearest valid codeword combinatorially.

Re:Not the First Discovery in Coding Theory (1)

serviscope_minor (664417) | more than 3 years ago | (#34346118)

By the time you're correcting 5 bit errors in a 256 bit packet, you don't want to have to search for the nearest valid codeword combinatorially.

The best codes, LDPC codes, can only be decoded optimally using combinatorial search; the question "is this the optimal decode?" is NP-complete. So they always use heuristic decoders, generally loopy BP. Fortunately, that is extraordinarily well suited to efficient hardware implementation. But it gives no bound on the error rate, nor even a guarantee of convergence.

At this point, of course, we've got far away from coding theory and into efficient and empirically good heuristics for NP-hard problems.

More info on Dr. Moon (1)

aquila.solo (1231830) | more than 3 years ago | (#34344250)

For the record, Dr. Moon is the Department Head of ECE. He gives killer lectures, but his tests will make you wish for death ;-).

I'm currently working on an MS there. Good to see him get some publicity.

Easy to understand (2, Interesting)

elbow_spur (749878) | more than 3 years ago | (#34344266)

Hamming codes are practical things, while Shannon's analyses of codes were more abstract (though still hugely useful and important).

Consider the parity (checksum) bit. It helps to catch errors, but there are two problems. First, if there is a double error (more likely when the checksum covers a longer string), the error isn't caught. Second, even if we know there is an error, we can't recover; we have to resend.

The easiest error-correcting code is to replace every bit with a triple copy of itself, so 101 becomes 111000111. This way we can recover from any single error, but the scheme is very inefficient.

Hamming's simplest code takes a 4-bit message and adds 3 very special parity bits (think partial checksums) arranged in a clever way, so that any one-bit error can be isolated and corrected.

That's the basic idea. The details are in many places, such as http://en.wikipedia.org/wiki/Hamming(7,4) [wikipedia.org]
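That clever arrangement is easy to try out. Here is a minimal Python sketch of a (7,4) encoder/decoder (the bit layout and names are my own choices, not from TFA): the parity bits sit at positions 1, 2, and 4, so the three parity checks, read back as a binary number, point directly at any single flipped bit.

```python
def encode(d):
    """Hamming(7,4): 4 data bits -> 7-bit codeword [p1 p2 d1 p3 d2 d3 d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4        # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4        # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4        # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    """Correct any single-bit error, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    err = s1 + 2 * s2 + 4 * s3   # syndrome = 1-based error position, 0 = clean
    if err:
        c[err - 1] ^= 1          # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]

msg = [1, 0, 1, 1]
cw = encode(msg)
cw[5] ^= 1                   # inject a single-bit error
assert decode(cw) == msg     # located and corrected
```

Flip any one of the seven bits and the decoder recovers the message; flip two and it miscorrects, which is why this code is single-error-correcting only.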

Re:Easy to understand (1)

elashish14 (1302231) | more than 3 years ago | (#34347238)

Not to mention that the footprint is very small: for N data bits in a transmitted message, you only need about log2(N) parity bits to retain the same error correction/detection capability. You can pretty easily balance how robustly you want to protect your data against how much extra information you want to transmit.
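That log2 figure can be sanity-checked: r parity bits give 2^r distinct syndromes, which must cover "no error" plus an error in any of the m + r transmitted positions. A quick sketch (the helper name is mine):

```python
def parity_bits(m):
    """Smallest r with 2**r >= m + r + 1 (the bound for single-error correction)."""
    r = 1
    while 2 ** r < m + r + 1:
        r += 1
    return r

assert parity_bits(4) == 3     # the classic Hamming(7,4)
assert parity_bits(11) == 4    # Hamming(15,11)
assert parity_bits(57) == 6    # Hamming(63,57)
```

So the parity overhead grows only logarithmically in the message length, while a triple-repetition code pays a fixed 200% overhead regardless of length.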

How to do research like Hamming (5, Informative)

knutert (1142047) | more than 3 years ago | (#34344832)

Want to be like Hamming? Here's how:
In summary, I claim that some of the reasons why so many people who have greatness within their grasp don't succeed are:
        * they don't work on important problems,
        * they don't become emotionally involved,
        * they don't try and change what is difficult to some other situation which is easily done but is still important,
        * and they keep giving themselves alibis why they don't.
        * They keep saying that it is a matter of luck.
I've told you how easy it is; furthermore I've told you how to reform. Therefore, go forth and become great scientists!

Source: http://paulgraham.com/hamming.html [paulgraham.com]

Re:How to do research like Hamming (1)

sahai (102) | more than 3 years ago | (#34351626)

http://alum.sharif.edu/~mynaderi/Claude%20Shannon.html [sharif.edu]

That is an online transcription of Claude Shannon's thoughts on the matter. Google pointed to this link from Sharif, but you can see a printed version in the second volume of his collected works, I believe. I ran across this during my blissful graduate student days.

I love how this talk ends with him asking his audience to come and look at this machine that he built.

Reflections on Hopper and Hamming (1)

perry64 (1324755) | more than 3 years ago | (#34352322)

In my Naval career, I was lucky enough to come across both of these titans of computing’s early age. RADM Hopper gave a lecture to every plebe class at the Naval Academy, including mine in 1984, where she would give each Midshipman a short length of wire: the distance light travels in a nanosecond. She used these to illustrate stories she told of the early days of computers that were programmed by connecting wires differently. Her speech was the first place I heard “it was easier to beg forgiveness later than get permission before.”

I wasn’t a CS/EE major, so I hadn’t previously heard of Hamming when I went to the Naval Postgraduate School in 1993 to get a master’s in CS. He was teaching there as an adjunct since he retired from Bell Labs and the entire faculty talked about him as if he were God. I really didn’t know his history, and chalked it up to parochialism.

I was lucky enough to have him as my professor for Computer Automata. It was like taking physics from someone who had been a contemporary of Newton, Copernicus, Kepler, and Einstein. His stories about working on the Manhattan Project were fascinating. Whenever we came across any of the big names in early computing theory (with the possible exception of Turing, whom I’m not sure he met – I don’t remember any stories about him), Hamming had a personal story of his interaction with them. I will never get rid of my Automata book, because the margins are filled in with some of these. For example, next to the discussion of Backus-Naur form is the note, “Hamming told Backus not to become a hippie, because if he did, he would never do good work again. Backus didn’t listen, became a hippie, and did no good work again.” It really made what can be an otherwise dry class come alive, and it drove home exactly how young a field CS actually was.

A previous poster added a link to Hamming’s talk after his retirement from Bell Labs, “You and Your Research”, which I cannot recommend highly enough (http://paulgraham.com/hamming.html). Even if you’re not a researcher, it is worth reading. My favorite line in it is “Given two people with exactly the same ability, the one person who manages day in and day out to get in one more hour of thinking will be tremendously more productive over a lifetime.”