eldavojohn writes "The Information: A History, a Theory, a Flood by James Gleick has a rather nebulous title, and the subtitle doesn't do much to clarify what this book hopes to be. The extensive citations are welcome, as the author barely scratches the surface of any theory of information. He also cherry-picks odd and interesting facets of the history of information but presents them in a chronologically challenged order. This book is, however, a flood, and as a result it could best be described as a rambling, romantic love note to Information: eloquently written and at times wondrously inspiring, but imparting very little actual knowledge or tools to the reader. If I were half my age, this book would be the perfect fit for me (just like Chaos was), but knowing all the punchlines and how the story ends ahead of time rather ruined it for me. While wandering through interesting anecdotes, Gleick shields the reader from most of the gory details.
The book starts out with an introduction to the hero of The Information: Claude Shannon. It also introduces the hero's sidekick: Alan Turing. Aside from our initial introduction to Shannon's work at Bell Labs and his monumental paper from 1948, the author drops many names, a foreshadowing of what is to come in the book: George Campbell, George Boole, Norbert Wiener, Vannevar Bush, John Archibald Wheeler, Richard Dawkins and many, many more. This sets the tone for the rest of the book as each chapter jumps around in time and grabs many quotations and excerpts to provide a gem-studded narration by Gleick.
Chapter one provided me a piece of anecdotal information I had actually never come across: the talking drums of Africa, an apparently ill-documented form of communication. Rather, I had heard of the talking drums but had never considered them in the context of information theory. They appear to be one of the earliest forms of long distance communication, predating all telegraphs. A drummer in one village would drum out the syllables and nuances of a lengthy sentence and often repeat it a few times. Drummers in distant villages would hear this and try to parse out what the drums were saying. As a result, they wouldn't just say 'moon'; they would say something like 'the shiny white face that rises in the night,' or something lengthier, to ensure the message was interpreted correctly. An ingenious method of communicating, yet the chapter oddly never mentions parity bits or error detection, two concepts I immediately equated with those redundant additional words. It does, of course, return to our hero Shannon, who would later investigate the redundancy in the English language.
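The drummers' redundant phrasing serves the same purpose as redundancy in any error-detecting code. A minimal sketch of the simplest such scheme, a single even-parity bit (my own illustration, not from the book):

```python
def add_parity(bits):
    # Even parity: append one redundant bit so the total count of 1s is even.
    return bits + [sum(bits) % 2]

def parity_ok(bits):
    # A single flipped bit makes the count of 1s odd, exposing the corruption.
    return sum(bits) % 2 == 0

message = [1, 0, 1, 1]
coded = add_parity(message)   # [1, 0, 1, 1, 1]
```

Like the drummers' extra words, the parity bit adds nothing to the message itself; it exists only so the receiver can tell when something has gone wrong.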
The next chapter concerns Walter J. Ong and his work on the persistence of information. Gleick discusses the find at Uruk and the subsequent deciphering of the cuneiform tablets. What was interesting about these tablets, however, is that they recorded inane things like bills and recipes. But when Donald Knuth saw one at a museum, he called what he read 'an algorithm.' The third chapter jumps to 1604 and the publishing of the very first dictionaries. Although amusing, this chapter merely illustrates how difficult it was for us to codify our language (and still is nigh impossible). At the end, Gleick carries this effort forward to cyberspace and its similar problems.
The next chapter introduces Charles Babbage and his difference engine. To keep it interesting, Gleick includes excerpts from Charles Dickens, Edgar Allan Poe, Oliver Wendell Holmes and Lord Byron. And oddly enough, there was a mentor relationship between Charles Babbage and Augusta Ada Byron King, Countess of Lovelace. Concerning Babbage, Gleick calls Ada 'first his acolyte and then his muse,' and for some reason this odd relationship is preserved in The Information. Lady Lovelace had many intuitions about how symbolic logic and algorithms would work in the future, but I found much of this chapter to concern relationships and excerpts from letters. To give you an example of what I'm talking about, I learned that Ada died many years before Babbage of cancer of the womb, and that she took laudanum and cannabis to ease the pain. What does this have to do with The Information? You also learn that Babbage told a friend before his death that he would gladly give up whatever time he had left if he could spend three days five centuries in the future. Only one of the many stories of foolishly optimistic hope this book sells to the reader.
The next chapter involves the evolution of the telegraph, and the bulk of it concentrates on a telegraph that was quite unknown to me: the French optical telegraph, a system of signs atop high buildings that could relay messages by signaling from village to village. Aside from it being an extrapolation of the binary signals of ages past, like fires lit on elevated land or smoke signals, I didn't really understand why the politics and problems of these devices were explored in such depth. When we finally get to the electric telegraph, we get some odd (albeit interesting) details about it instead of the theory. From the abbreviation of common sentences down to codewords, to the fight over patenting the signaling mechanism, Gleick again avoids any sort of real numerical or even technical analysis of how humans were progressing from one bandwidth level to another. Cost per letter drove some odd advancements, like acronyms and the investigation of how words could be encoded into fewer symbols. It ends with a reference to George Boole and logic, as these symbolic representations led the way for words to be replaced and turned into equations.
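The cost-per-letter pressure that produced telegraph codebooks was later formalized as variable-length coding: give frequent symbols short codewords and rare ones long codewords. A rough sketch of the classic Huffman construction (my own illustration; the book does not present this):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    # Each heap entry: [frequency, tiebreaker, {symbol: codeword-so-far}].
    heap = [[freq, i, {sym: ""}]
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        # Merge the two least frequent subtrees, prefixing their codewords.
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, [f1 + f2, tiebreak, merged])
        tiebreak += 1
    return heap[0][2]

codes = huffman_codes("telegraph traffic costs per letter")
# Frequent symbols (like 't' and the space) receive shorter codewords.
```

The resulting code is prefix-free, so a stream of codewords can be decoded without separators, exactly the property a telegraph operator paying per symbol would want.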
The book moves on to Claude Shannon and briefly touches on his work on signal noise. It jumps around to Russell and Whitehead's Principia Mathematica and Gödel's subsequent destruction of any dreams of representing everything with symbols by way of his famous Incompleteness Theorem. It goes on to talk about Weyl, Nyquist, Hartley, etc., continuing the veritable who's who while providing very little actual knowledge of their work. Who could mention Gödel without also talking about Nazis? Certainly not Gleick. The politics of the time and the references back to Lovelace and Babbage dominate this chapter, leaving very little room for any actual Information Theory. On page 201 you'll find H = n log s, though you won't find more than a paragraph of explanation or any extrapolation of that formula. This chapter did yield something interesting: a piece of paper with Shannon's estimates of data storage on a logarithmic scale. While some estimates are close, others are very far off, but he was already thinking of DNA as information storage. The anecdotes and quotations from peers of the time are impressively researched and cross-referenced, but at what cost?
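For the curious, H = n log s is Hartley's measure: a message of n symbols, each drawn from an alphabet of s possible symbols, can distinguish s^n messages, so its information content is n log s. A quick sketch (the function name is mine, not the book's):

```python
import math

def hartley_information(n, s):
    # n symbols from an alphabet of size s; log base 2 gives the result in bits.
    return n * math.log2(s)

hartley_information(8, 2)    # 8 bits: an 8-character binary string
hartley_information(3, 10)   # ~9.97 bits: three decimal digits
```

Shannon's later entropy generalizes this by weighting symbols by their probabilities; Hartley's formula is the special case where every symbol is equally likely.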
The next chapter concentrates on the enemy: Norbert Wiener from MIT. He comes across as a cigar-smoking, condescending, self-involved, snobby professor whose primary contribution is a now-defunct 'science' once called Cybernetics. He's quick to dismiss others' work as derivative of his own and is presented as the antithesis of Claude Shannon, who is portrayed as modest, cautious and well spoken. On top of that, not only is Shannon's work not defunct, it is the basis of so much of what is useful today. Gleick portrays Wiener so negatively I almost wondered if the condescending label 'wiener' was somehow related to Norbert. This chapter delves into conferences once held and the interactions between the participants. While this made for great humor in the Shannon/Wiener exchanges, I don't understand why they were relayed to the reader. Shannon's maze-solving rat and its demonstration drew interesting remarks, but I don't understand why the reader is given so much insight into these proceedings of Cybernetics when the field turned out to be little more than buzzwords. An interesting note, however, is how some of the members would let the media run away with phrases the scientists had never actually said. They would do this almost strategically, both to validate the new field and to attract interest from universities and funding sources.
This next chapter, on Maxwell's demon and entropy, was actually a little enlightening in that it provided a fairly clear discussion of entropy (physics) and entropy (information). In addition to this correlation, it discusses why the latter is often called negentropy or negative entropy. Leo Szilárd's work is discussed, as well as the concept that 'information is not free.' Although Maxwell's demon is simply an exercise in physics philosophy, this chapter begins what will be finished later: an English explanation of how information is fundamentally tied to matter and the universe.
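The correlation the chapter describes can be made concrete with Shannon's entropy formula, H = -Σ p_i log2 p_i; negentropy is then the distance below the maximum possible entropy. A small sketch (my own, assuming a hypothetical four-symbol source):

```python
import math

def entropy(probs):
    # Shannon entropy in bits; zero-probability terms contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximally uncertain: 2 bits
skewed = [0.7, 0.1, 0.1, 0.1]        # more predictable, so lower entropy

# Negentropy: how far below the maximum (log2 of the alphabet size) we sit.
negentropy = math.log2(4) - entropy(skewed)
```

The more predictable the source, the larger its negentropy, which is the sense in which order (and, later in the book, life itself) is said to feed on negative entropy.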
Gleick now reaches biological information: DNA. He spends a chapter on the origins of DNA and how contemporaries of information theory approached it upon its discovery. Of course Dawkins and Gould had interesting things to say in this chapter, but Hofstadter and Gamow had perhaps the most interesting things to add: that DNA is essentially a number, and that number represents a machine that can replicate and say things about itself. One thing this book does well is build this sort of interesting relationship between information and humans. This chapter takes a stab at establishing that we are all, at our cores, just information in the universe. As biological beings, we are feeding off of negative entropy.
The book takes a bizarre twist now into memes. That's right, chain letters and lolcats, and how they replicate and infect our brains despite being nothing more than information. I found this chapter to be obvious and boring, worthy of complete removal from the text. This interjection is entirely out of place, and I'm still scratching my head wondering what merit it had in this book. Since the book is already such an odd assortment and arrangement of the history of information, this chapter can simply be skipped by the reader.
The chapter on randomness opens with an individual I'd never heard of before: Gregory Chaitin. Gleick seems to imply that Incompleteness and Quantum Physics are somehow tied together by way of Turing's Uncomputability Proof (or so Chaitin once thought), because both relate to entropy, at least the word, and the connection between them is randomness. I didn't understand why this was included if not to mislead the reader. What follows is some of the giants' work on, and quotes about, randomness and random numbers. While mildly interesting, there's not a whole lot to be gleaned from this chapter. I did appreciate the references to Andrei Nikolaevich Kolmogorov, who did original and even parallel work on information theory behind the iron curtain. Of course, the text is rife with political situations and anecdotes (e.g. Kolmogorov's run-in with one of Stalin's favorite pseudoscientists). Oh, and what book on information would be complete without G. H. Hardy visiting Srinivasa Ramanujan and remarking on the boring number of his taxi? The oft-repeated story of the number 1,729. This anecdote feels out of place, but Gleick uses it to push the reader deeper into what randomness really means. Throw in Bach's Well-Tempered Clavier and I almost wondered if Gleick had re-read Gödel, Escher, Bach before writing this chapter.
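Chaitin's (and Kolmogorov's) definition of randomness, the length of the shortest program that produces a string, is uncomputable, but off-the-shelf compression gives a crude practical proxy: structured data compresses well, random data does not. A sketch (my own illustration, not from the book):

```python
import random
import zlib

random.seed(0)
structured = b"ab" * 500                                        # 1000 highly regular bytes
random_bytes = bytes(random.randrange(256) for _ in range(1000))  # 1000 random bytes

# The regular string has a short description; the random one essentially does not.
print(len(zlib.compress(structured)))     # tiny
print(len(zlib.compress(random_bytes)))   # roughly as long as the input
```

In Kolmogorov's terms, a string is random precisely when no description of it is meaningfully shorter than the string itself.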
The next chapter did actually touch on work that ties information to physics, in the very basic sense that information cannot be destroyed in our universe. The famous Preskill-Hawking wager is discussed, as well as the thermodynamics of computation and the resulting implications for quantum mechanics. The chapter wanders from quantum cryptography (feeling a bit out of place) to qubits to RSA.
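Since RSA comes up here, the textbook construction fits in a few lines with the classic toy primes p = 61 and q = 53 (far too small for real use; this is only to show the mechanics):

```python
# Textbook RSA with toy parameters; real keys use primes hundreds of digits long.
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, chosen coprime to phi
d = pow(e, -1, phi)        # private exponent via modular inverse (Python 3.8+): 2753

message = 65
ciphertext = pow(message, e, n)          # encrypt: 65^17 mod 3233
assert pow(ciphertext, d, n) == message  # decrypt recovers 65
```

Its security rests on the difficulty of factoring n, which is exactly what a large quantum computer running Shor's algorithm would undermine; hence the chapter's path from qubits to RSA.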
Was that chapter too technical for you? Don't worry, the text moves back to Wikipedia (shouldn't this have been addressed in the early chapters on dictionaries?) and actually talks about deletionism versus inclusionism and the Wikipedia debates on Pokemon articles. Of course, our old friends Babbage, Turing, Shannon, et al. are brought back to somehow comment on this modern encyclopedia, with quotes from Gleick like 'The universe is computing its own destiny' (for added drama, that sentence is its own paragraph on page 377). Strangely enough, there is no reference to Edward Fredkin anywhere in this book. Gleick jumps to domain name saturation on the internet and hits up 'the cloud' at the very end. I almost marvel at how many bases he can touch in one chapter. The penultimate chapter covers our inundation with news every single day of our lives, probably from now to eternity. Unsurprisingly, Gleick conjures up quotes from ages long past (almost back to the dark ages) of people complaining of the printing press or telegraph or newspaper or internet ruining their lives by assaulting them with information and news. It turns out 'Information Overload' is not a new concept. A chapter devoted to people complaining about too much information, in a book on information, seems to be too much credit for them, in my opinion.
The book really fizzles out as it tries to wrap up. Far from finalizing anything, the reader is given the concept of 'the library of babel' alongside the famous six degrees of separation, and the observation that we are now more interconnected than ever before.
Luckily this book has almost fifty pages of references to other books that contain far more complete and far more organized thoughts on information. I would not recommend this book to any of my colleagues unless they never went to college and never once picked up another book on Information. That said, I felt it was very well written and will no doubt continue to be sold en masse in bookstores. If anyone else read this book and came away with some very deep and profound understanding of the subject matter, I would love to hear it. Right now, the audience for this book is very small in my mind. It might best be given to a young engineer who has yet to go to college but has the vim and vigor to track down the real sources of The Information."