Slashdot: News for Nerds


Book Review: The Information: a History, a Theory, a Flood

samzenpus posted more than 2 years ago | from the read-all-about-it dept.

Books 44

eldavojohn writes "The Information: A History, a Theory, a Flood by James Gleick has a rather nebulous title, and the subtitle doesn't really help one understand what this book hopes to be. The extensive citations are welcome, as the author barely scratches the surface of any theory of information. It also cherry-picks odd and interesting facets of the history of information but presents them in a chronologically challenged order. This book is, however, a flood, and as a result it could best be described as a rambling, romantic love note to Information: eloquently written and at times wondrously inspiring, but imparting very little actual knowledge or tools to the reader. If I were half my age, this book would be the perfect fit for me (just as Chaos was), but knowing all the punchlines and how the story ends ahead of time rather ruined it for me. While wandering through interesting anecdotes, Gleick shields the reader from most of the gory details." Read on for the rest of eldavojohn's review.

The book starts out with an introduction to the hero of The Information: Claude Shannon. It also introduces the hero's sidekick, Alan Turing. Aside from our initial introduction to Shannon's work at Bell Labs and his monumental 1948 paper, the author drops many names, a foreshadowing of what is to come: George Campbell, George Boole, Norbert Wiener, Vannevar Bush, John Archibald Wheeler, Richard Dawkins and many, many more. This sets the tone for the rest of the book, as each chapter jumps around in time and grabs quotations and excerpts to provide a gem-studded narration by Gleick.

Chapter one provided me a piece of anecdotal information that I had actually never come across: the talking drums of Africa, an apparently ill-documented form of communication. Rather, I had heard of the talking drums but had never considered them in the context of information theory. They appear to be one of the earliest forms of long-distance communication, predating all telegraphs. A drummer in one village would drum out the syllables and nuances of a lengthy sentence and often repeat it a few times. Drummers in distant villages would hear this and try to parse out what the drums were saying. As a result, they wouldn't just say 'moon'; they would say something like 'the shiny white face that rises in the night,' or something lengthier, to ensure the message was interpreted correctly. An ingenious method of communicating, yet the chapter oddly never mentions parity bits or error detection, the two things I basically equated with the redundant additional words. It does, of course, return to our hero Shannon, who would later investigate the redundancy of the English language.
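The drummers' redundant phrasing plays the same role as the parity bits the chapter never mentions. A minimal sketch of the idea in Python (my own illustration, not from the book):

```python
def add_parity(bits):
    """Append an even-parity bit so any single-bit error is detectable."""
    return bits + [sum(bits) % 2]

def check_parity(bits):
    """True if the even-parity check passes."""
    return sum(bits) % 2 == 0

msg = [1, 0, 1, 1]
sent = add_parity(msg)   # redundancy added before "drumming" it out
assert check_parity(sent)

garbled = sent[:]
garbled[2] ^= 1          # one symbol misheard in a distant village
assert not check_parity(garbled)
```

Like the drummers' elaborate circumlocutions, the extra bit carries no new meaning; it exists only so the receiver can tell whether the message survived the channel.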

The next chapter concerns Walter J. Ong and his work on the persistence of information. Gleick discusses the find at Uruk and the subsequent deciphering of the cuneiform tablets. What was interesting about these tablets, however, is that they recorded inane things like bills and recipes. But when Donald Knuth saw one at a museum, he called what he read 'an algorithm.' The third chapter jumps to 1604 and the publishing of the very first dictionaries. Although amusing, this chapter merely illustrates how difficult it was for us to codify our language (and how it remains nigh impossible). At the end, Gleick carries this effort over to cyberspace and its similar problems.

The next chapter introduces Charles Babbage and his difference engine. To keep it interesting, Gleick includes excerpts from Charles Dickens, Edgar Allan Poe, Oliver Wendell Holmes and Lord Byron. Oddly enough, there was a mentor relationship between Charles Babbage and Augusta Ada Byron King, Countess of Lovelace; concerning Babbage, Gleick calls Ada 'first his acolyte and then his muse,' and for some reason this odd relationship is preserved in The Information. Lady Lovelace had many intuitions into how symbolic logic and algorithms would work in the future, but I found much of this chapter to be concerned with relationships and excerpts from letters. To give you an example of what I'm talking about: I learned that Ada died many years before Babbage, of cancer of the womb, and that she took laudanum and cannabis to ease the pain. What does this have to do with The Information? You also learn that Babbage told a friend before his death that he would gladly give up whatever time he had left if he could spend three days five centuries in the future, only one of the many stories of foolishly optimistic hope this book sells to the reader.

The next chapter involves the evolution of the telegraph, and the bulk of it concentrates on a telegraph that was quite unknown to me: the French optical telegraph, a system of signals sent from high buildings that could relay messages from village to village. Aside from being an extrapolation of the binary signals of ages of yore, like fires lit on elevated land or smoke signals, I didn't really understand why the politics and problems of these devices were explored in such depth. When we finally get to the electric telegraph, we get some odd (albeit interesting) details about it instead of the theory. From the abbreviation of common sentences down to codewords to the fight over patenting the signaling mechanism, Gleick again avoids any sort of real numerical or even technical analysis of how humans were progressing from one bandwidth level to another. Cost per letter drove some odd advancements, like acronyms and the investigation of how words could be encoded into fewer symbols. The chapter ends with a reference to George Boole and logic, as these symbolic representations led the way for words to be replaced and turned into equations.
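That cost-per-letter pressure is easy to demonstrate. A toy sketch of a commercial telegraph codebook in Python (the entries here are invented for illustration; real codebooks ran to thousands of phrases):

```python
# Each common phrase collapses to a short codeword, cutting the per-letter cost.
CODEBOOK = {
    "ARRIVING TOMORROW MORNING": "AMORN",
    "MARKET CLOSED LOWER": "MKLOW",
}

def encode(message, codebook):
    """Replace every known phrase with its codeword."""
    for phrase, codeword in codebook.items():
        message = message.replace(phrase, codeword)
    return message

plain = "MARKET CLOSED LOWER ARRIVING TOMORROW MORNING"
coded = encode(plain, CODEBOOK)
assert coded == "MKLOW AMORN"   # 11 symbols instead of 45
```

Shortening frequent messages at the expense of rare ones is exactly the intuition Shannon would later formalize.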

The book moves on to Claude Shannon and briefly touches on his work on signal noise. It jumps around to Russell and Whitehead's Principia Mathematica and Gödel's subsequent destruction of any dreams of representing everything with symbols, by way of his famous incompleteness theorem. It goes on to talk about Weyl, Nyquist, Hartley, etc., continuing the veritable who's who while providing very little actual knowledge of their work. Who could mention Gödel without also talking about Nazis? Certainly not Gleick. The politics of the time and the references back to Lovelace and Babbage dominate this chapter, leaving very little room for any actual information theory. On page 201 you'll find H = n log s, though you won't find more than a paragraph of explanation, nor any extrapolation of that formula. This chapter did yield something interesting: a piece of paper with Shannon's estimates of data storage on a logarithmic scale. While some estimates are close, others are very far off, but he was already thinking of DNA as information storage. The anecdotes and quotations from peers of the time are impressively researched and cross-referenced, but at what cost?
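For the curious, the extrapolation Gleick skips is simple: Hartley's H = n log s says that n symbols drawn from an alphabet of s equally likely choices carry n log s units of information (bits, when the log is base 2). A quick sketch, mine rather than the book's:

```python
import math

def hartley_information(n, s, base=2):
    """H = n log s: information carried by n independent symbols
    drawn from an alphabet of size s (in bits for base 2)."""
    return n * math.log(s, base)

# One flip of a fair coin: 1 symbol, 2 possibilities -> exactly 1 bit.
assert hartley_information(1, 2) == 1.0

# An 8-letter message over a 26-letter alphabet:
print(round(hartley_information(8, 26), 2))  # -> 37.6 bits
```

Shannon's 1948 entropy generalizes this to alphabets whose symbols are not equally likely, which is exactly where the redundancy of English enters.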

The next chapter concentrates on the enemy: Norbert Wiener of MIT. He comes across as a cigar-smoking, condescending, self-involved, snobby professor whose primary contribution is a now-defunct 'science' once called Cybernetics. He is quick to identify others' work as derivative of his own and is presented as the antithesis of Claude Shannon, who is portrayed as modest, cautious and well spoken. On top of that, not only is Shannon's work not defunct, it is the basis of so much of everything that is useful today. Gleick portrays Wiener so negatively that I almost wondered if the condescending label 'wiener' was somehow related to Norbert. This chapter delves into conferences once held and the interactions between the participants. While it made for great humor in the Shannon/Wiener exchanges, I don't understand why they were relayed to the reader. Shannon's rat and its demonstration drew interesting remarks, but I don't understand why the reader is given so much insight into these proceedings of Cybernetics when the field turned out to be little more than buzzwords. An interesting note, however, is how some of the members would let the media run away with phrases the scientists had never actually said. They would do this almost strategically, both to validate the new field and to attract interest from universities and funding sources ... but should anyone corner them and ask for clarification, they could always truthfully say that they never said it verbatim. I wonder how often this happens today?

This next chapter, on Maxwell's demon and entropy, was actually a little enlightening in that it provided a fairly clear discussion of entropy in physics versus entropy in information theory. In addition to this correlation, it discusses why the latter is often called negentropy, or negative entropy. Leo Szilárd's work is discussed, as is the concept that 'information is not free.' Although Maxwell's demon is simply an exercise in physics philosophy, this chapter begins what will be finished later: an English explanation of how information is fundamentally tied to matter and the universe.

Gleick now reaches biological information: DNA. He spends a chapter on the origins of DNA and how contemporaries of information theory approached it upon its inception. Of course Dawkins and Gould had interesting things to say in this chapter, but Hofstadter and Gamow had perhaps the most interesting things to add: that DNA is essentially a number, and that number represents a machine that can replicate and say things about itself. One thing this book does well is build an interesting relationship between information and humans. This chapter takes a stab at establishing that we are all, at our cores, just information in the universe; as biological beings we are feeding off of negative entropy.

The book takes a bizarre twist now into memes. That's right: chain letters and lolcats, and how they replicate and infect our brains despite being nothing more than information. I found this chapter to be obvious and boring, worthy of complete removal from the text. This interjection is out of place entirely, and I'm still scratching my head wondering what merit it had in this book. Since the book is already such an odd assortment and arrangement of the history of information, this chapter can simply be skipped by the reader.

The chapter on randomness opens with an individual I'd never heard of before: Gregory Chaitin. Gleick seems to imply that incompleteness and quantum physics are somehow tied together by way of Turing's uncomputability proof, or so Chaitin (once?) thought, because both were related to entropy (the word, I guess) and the connection was randomness. I didn't understand why this was in here, if not to mislead the reader. What follows is some of the giants' work and quotes about randomness and random numbers. While mildly interesting, there's not a whole lot to be gleaned from this chapter. I did appreciate the references to Andrei Nikolaevich Kolmogorov, who did original and even parallel work on information theory behind the Iron Curtain. Of course, the text is rife with political situations and anecdotes (e.g. Kolmogorov's run-in with one of Stalin's favorite pseudo-scientists). And what book on information would be complete without G. H. Hardy visiting Srinivasa Ramanujan and remarking on the boring number of his taxi? The oft-repeated story of the number 1,729. This anecdote feels out of place, but Gleick uses it to push the reader deeper into what randomness really means. Throw in Bach's Well-Tempered Clavier and I almost wondered if Gleick had re-read Gödel, Escher, Bach before writing this chapter.
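Kolmogorov's and Chaitin's answer to "what randomness really means" is descriptive: a string is random when no program much shorter than the string itself can produce it. A crude but runnable proxy, my own sketch using off-the-shelf compression rather than true Kolmogorov complexity (which is uncomputable):

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """DEFLATE-compressed length: a rough upper bound on how short
    a description of the data can be."""
    return len(zlib.compress(data, 9))

structured = b"ab" * 500                 # 1000 bytes of obvious pattern
random.seed(1729)                        # Hardy's taxicab number, for fun
noisy = bytes(random.getrandbits(8) for _ in range(1000))

assert compressed_size(structured) < 50  # collapses to a tiny description
assert compressed_size(noisy) > 900      # nearly incompressible, i.e. random-looking
```

The patterned string has a short description ("ab, 500 times"); the pseudorandom one effectively does not, which is the working definition of randomness in this framework.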

The next chapter did actually touch on work that ties information to physics, in the very basic sense that information cannot be destroyed in our universe. The famous Preskill-Hawking wager is discussed, as well as the thermodynamics of computation and the resulting implications for quantum mechanics. The chapter wanders from quantum cryptography (feeling a bit out of place) to qubits to RSA to ... well, it all comes back, as it does throughout the book, to Shannon. The chapter does end with an interesting quote from John Wheeler, who apparently advocated translating the quantum versions of string theory and Einstein's geometrodynamics 'from the language of the continuum to the language of bit.' Sounds pretty interesting, right? Too bad all you get is the quote.

Was that chapter too technical for you? Don't worry: the text moves back to Wikipedia (shouldn't this have been addressed alongside the early chapters on dictionaries?) and actually talks about deletionism versus inclusionism and the Wikipedia debates over Pokemon articles. Of course, our old friends Babbage, Turing, Shannon, et al. are brought back to somehow comment on this modern encyclopedia, with quotes from Gleick like 'The universe is computing its own destiny' (for added drama, that sentence is its own paragraph on page 377). Strangely enough, there is no reference to Edward Fredkin anywhere in this book. Gleick jumps to domain name saturation on the internet and hits 'the cloud' at the very end. I almost marvel at how many bases he can touch in one chapter. The penultimate chapter covers our inundation with news every single day of our lives, probably from now to eternity. Unsurprisingly, Gleick conjures up quotes from ages long past (almost back to the Dark Ages) of people complaining about the printing press or telegraph or newspaper or internet ruining their lives by assaulting them with information and news. It turns out 'information overload' is not a new concept. A chapter devoted to people complaining about too much information, in a book on information, seems to give them too much credit, in my opinion.

The book really fizzles out as it tries to wrap up. Far from finalizing anything, the reader is given the concept of the 'Library of Babel' alongside the famous six degrees of separation. We are now more interconnected than ever before thanks to ... information!

Luckily this book has almost fifty pages of references to other books that contain far more complete and far more organized thoughts on information. I would not recommend this book to any of my colleagues unless they never went to college and never once picked up another book on Information. That said, I felt it was very well written and will no doubt continue to be sold en masse in bookstores. If anyone else read this book and came away with some very deep and profound understanding of the subject matter, I would love to hear it. Right now, the audience for this book is very small in my mind. It might best be given to a young engineer who has yet to go to college but has the vim and vigor to track down the real sources of The Information.

You can purchase The Information: A History, a Theory, a Flood from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.


44 comments

Don't read this... it is a curse... (-1)

Anonymous Coward | more than 2 years ago | (#37744062)

In 2006, a little boy named Zach was sitting in the back seat of his mother's car whilst they were driving home from the grocery store. In his arms, he held a ventriloquist dummy named Slappy. Zach was rocking Slappy back and forth in his arms as if Slappy was a baby. Then, suddenly, he tried to throw Slappy out the window. However, Slappy latched onto the side of the car and climbed back in faster than Zach could even believe! Zach apologized profusely and Slappy eventually forgave him.

Then, when Zach finally arrived home, he somehow locked Slappy in his mom's car, ran up on his porch, and then proceeded to point at Slappy and laugh at the seemingly hopeless situation that he was in. This action proved to be a mistake, however. In an event that completely took Zach by surprise, Slappy turned into a twinkling star that couldn't have been much bigger than a few inches, flew directly through the car door, and then went right into Zach's bootyass!

Still stunned at this unexpected development, Zach heard Slappy saying, from the inside of his bootyass, "Iiiiiiit's suuuuuuuuuuuppeeeer tiiiiiiiiiiiiiiime!" Then, before Zach could even react to those malice-filled words, Slappy began saying, "rarara!" Slappy was consuming all of the feces in Zach's bootyass, inflicting major tickle upon it! Zach, in a desperate attempt to make the tickle stop, ran into his mom's garden and began rubbing dirt in his face. The tickle, however, only worsened...

Now that you have read this (even a single word of it), Slappy will turn into a twinkling star, fly into your bootyass, and then consume all of the feces inside it (inflicting major tickle upon your bootyass)! To prevent this from happening, copy and paste this entire comment and then repost it as a comment three times.

All comments like this should be rejected.. (0)

Anonymous Coward | more than 2 years ago | (#37744626)

Reject comments with this line:

.. copy and paste this [entire] comment and then repost it [as a comment] X times.

Creepy Book (1)

pntkl (2187764) | more than 2 years ago | (#37744122)

I got a copy of this for about $8, the other month, when Borders was closing down. Added bonus, it came with a flattened fiddler spider on the dedication page. Information is deadly. /-o-\ I've only made it through the first chapter, which wasn't a bad read. I'll post back here, if I ever find time to really read it.

Re:Creepy Book (1)

physburn (1095481) | more than 2 years ago | (#37746524)

Don't know how you can creep Claude Shannon's theory, are you confusing it with the Mass Destruction song by Faithless?


Re:Creepy Book (1)

pntkl (2187764) | more than 2 years ago | (#37746730)

I'd post a picture, if that were allowed. When I bought the book, I found a spider pressed into the dedication page. It's approximately an inch above the 'T' in 'CYNTHIA'. That's why I posted 'Creepy Book'. /-o-\

Dept. of Red. Dept. (1)

blair1q (305137) | more than 2 years ago | (#37744192)

"the redundancy in the English language. "

Yeah, that's as far as I got before being induced to TLDR and post.

The redundancy in the English language, while possibly a form of self-correcting code, often is a source of error itself.

Hence the massive proportion of internet bandwidth given up to grammar flames.

When your error detection system is capable of rat-holing your entire discussion, maybe it's better to rely on improving the S/N ratio of your lower layers and forgo sending the redundant bits...

Re:Dept. of Red. Dept. (1)

pev (2186) | more than 2 years ago | (#37748938)

Does English really have that much redundancy? Most of the language has evolved by assimilating new words as required, so I would have thought that this naturally eschews redundancy. A lot of what could be considered redundant could be genuinely (subtly) different meanings? I'm sure that some people consider the differences small enough to be redundant whilst others consider them significant; this could perhaps be one of the most succinct ways to define the difference between computer programmers and poets...? :-D

FYI, I'd definitely categorise myself as the former, not the latter, but I do love the quirkiness of English. I'm sure (but with no citations!) that the process of learning such a nuts language as a child helps you develop in much more interesting ways than a logical and simple language would...

Re:Dept. of Red. Dept. (1)

Anonymous Coward | more than 2 years ago | (#37750362)

No, there are error correction structures built into the language itself. Think in terms of how verbs and pronouns agree. Also, read 'Godel, Escher, Bach', you're probably missing a lot of context to this discussion. The reviewed book really wants to be a mix of GEB and 'A Short History of Nearly Everything'.

Like many physicists, missed the point... (1)

Anonymous Coward | more than 2 years ago | (#37744236)

The book entertains at some depth (as do physicists today) how entropy was properly the domain of thermodynamics before Shannon gave it a newfangled interpretation: a measure of the *quantity* of information (provided the symbol stream was generated as a stationary stochastic source). Proofs of this have now been reduced to a few lines and a convex inequality.

The far more important result of Shannon's paper was the channel coding theorem, which was counterintuitive (and hence remarkable) at the time, and which has yet to meet with a simple proof, even many decades later. (The result establishes the theoretical maximal rate of reliable communication between point A and point B, known as the channel capacity, and describes how error correction coding can, in principle, allow one to approach this capacity.) This is the result that turned the communications field on its head, and it took nearly four decades to find error correction codes that approached the channel capacity promised by Shannon's work.
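The capacity described above has a closed form in the simplest case. For a binary symmetric channel that flips each bit with probability p, Shannon's result gives C = 1 - H(p), where H is the binary entropy function. A sketch of this standard textbook formula (not something Gleick works through):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1 - binary_entropy(p)

assert bsc_capacity(0.0) == 1.0      # noiseless channel: one full bit per use
assert bsc_capacity(0.5) == 0.0      # coin-flip noise: nothing gets through
print(round(bsc_capacity(0.11), 3))  # about half a bit per channel use
```

The counterintuitive part, and the reason coding theory took four decades to catch up, is that rates arbitrarily close to C are achievable with arbitrarily low error probability.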

The result gets only scant mention in Gleick's book, and although I'm no expert in thermodynamics, I am not aware that any result from that field can properly be compared to the channel coding theorem without some sophisticated twists.

Shannon, Babbage, Lovelace (3, Insightful)

iliketrash (624051) | more than 2 years ago | (#37744248)

Nonetheless, Gleick's treatment of the likes of Shannon, Babbage, and Ada Lovelace is fair and fairly detailed, and will vastly enlighten the non-technical reader, which is, after all, the intended audience for this book.

Re:Shannon, Babbage, Lovelace (1)

mattack2 (1165421) | more than 2 years ago | (#37745448)

Hmm, one of the highly regarded reviews on Amazon:
http://www.amazon.com/review/R1JI5VZRW1ILWU/ref=cm_cr_dp_perm?ie=UTF8&ASIN=0375423729&nodeID=283155&tag=&linkCode= [amazon.com]

thinks it's not a general audience book:

Some of the narrative may seem pretty heavy- going. For readers who are not versed in the subject, it may seem to be almost impenetrable. After a bit, one realizes this book is not written for the general reader.

Christ, it's been ages! Where's my Aussie story? (0)

Anonymous Coward | more than 2 years ago | (#37744270)

Slashdot's Aussie-lurvin' "editors" promised me they'd post one Australian story every hour, at the minimum.

Fuck these interesting stories about stuff that matters: WHERE IS MY WORTHLESS, IRRELEVANT AUSSIE TRIVA?!

I demand to see my country mentioned in every article!

How will the World discover just how great we Aussies are?

Now excuse me, I have to go practise my American accent.

Re:Christ, it's been ages! Where's my Aussie story (0)

Anonymous Coward | more than 2 years ago | (#37744448)

Toy me kengeroo dern, spurt,
Toy me kengeroo durn!

Australians have such lovely accents. When you speak, it's like fingers down a blackboard. Australian chicks can etch glass with their voices. They sound like a hyena that has got its nuts caught on electrified barbwire.

Re:Christ, it's been ages! Where's my Aussie story (1)

tehcyder (746570) | more than 2 years ago | (#37749034)

Toy me kengeroo dern, spurt, Toy me kengeroo durn!

Australians have such lovely accents. When you speak, it's like fingers down a blackboard. Australian chicks can etch glass with their voices. They sound like a hyena that has got its nuts caught on electrified barbwire.

I believe you're thinking of the girls from not-really-neighbours New Zealand. They make Australian chicks sound like Eartha Kitt.

There's a reason sheep-shagging's so popular there.

I found this to be a deeply profound book. (1)

Anonymous Coward | more than 2 years ago | (#37744292)

Not only is the history of information theory interesting, but Gleick touches on truly fundamental relationships between classical information theory and the evolution of all self replicating patterns.

The most interesting quote in the book for me was this:

'It sometimes seems as if curbing entropy is our quixotic purpose in this universe.' (p. 282, Pantheon, Kindle Edition)

While the book doesn't come to any startling conclusions, and has to deal with the historical confusion around the word 'entropy' itself, the ties that are drawn between the formation and transmission of order and its perpetual conflict with increasing disorder are fascinating to say the least. What if we look at ourselves as information bearing systems? What if that information is becoming denser and more complex over time? What is the logical conclusion to these trends?

After finishing this book I felt like I had been given the briefest glimpse of a profound truth, and then no more. In fact the last few chapters felt rather mundane, but I would still recommend this book to any and all.

P.S. Around the same time I also read "What Information Wants" and this book is much much better.

Lectures on Information History (2)

ideonexus (1257332) | more than 2 years ago | (#37744314)

If the content of this book intrigues you, I highly recommend this Lecture Series [freevideolectures.com] from UC Berkeley (online course, Spring 2011, Prof. Geoffrey D. Nunberg). I listened to the lectures a few years ago when the quality was terrible, but they were still fascinating, and they have since been re-recorded in much better quality and with slides. The course starts way back with the spoken word, moves to the written word and signs, and continues on up through history. Great series.

There are problems; lack of equations isn't one (0)

vykor (700819) | more than 2 years ago | (#37744374)

Uh, maybe this book isn't for you, if all you're interested in is more theorems. The theory of information doesn't boil down to signal/noise equations. If you don't even get the Borges reference (which, honestly, no book on information can do without), that's a pretty good sign.

Geoff Nunberg's NYT review of this book from March summarizes the problems of Gleick's rah-rah information romance nicely, not the least of which is the separation of information from any and all social context.

Re:There are problems; lack of equations isn't one (1)

Anonymous Coward | more than 2 years ago | (#37744666)

The theory of information doesn't boil down to signal/noise equations.

That's exactly what it boils down to.

That and other equations.

Re:There are problems; lack of equations isn't one (2)

Daniel Dvorkin (106857) | more than 2 years ago | (#37745242)

The theory of information doesn't boil down to signal/noise equations.

If you're talking about what's generally referred to as "information theory," then yes, it does.

Not Meant to be a Textbook (0)

Anonymous Coward | more than 2 years ago | (#37744466)

Short version of the review for tldr's: My e-peen is bigger than Gleick's.

Equations or love life? (1)

vlm (69642) | more than 2 years ago | (#37744496)

I'm still confused. Is this the kind of book that has at least some equations and algorithms (I get that its not exclusively this) or is it the kind of book that mostly rampages on about Turing's love life and how the crude savages of the era screwed him over? I'm just trying to figure out how soft -n- fluffy it is.

Re:Equations or love life? (5, Informative)

robotkid (681905) | more than 2 years ago | (#37744998)

I'm still confused. Is this the kind of book that has at least some equations and algorithms (I get that its not exclusively this) or is it the kind of book that mostly rampages on about Turing's love life and how the crude savages of the era screwed him over? I'm just trying to figure out how soft -n- fluffy it is.

Neither, and therein lies the weakness of this book. This review is spot on. The beginning chapters are all these interesting historical anecdotes that do a pretty good job of contextualizing the disjointed and awkward methods of transmitting and thinking about information in the pre-Shannon era. As a series of lesser-known historical anecdotes, it's quite fascinating to learn that Babbage liked to crack codes as a hobby and that Shannon and Turing directly influenced each other's work through regular lunchtime discussions at Bell Labs. That interesting thread is what got me to read the book; it felt like a great set-up to a really interesting and accessible primer on information theory.

But then, once Shannon is introduced, the author seems at a loss to explain what information theory is actually used for, other than a vague sentiment that it's "useful everywhere, like in the internets and satellites and stuff". In fact, the narrative falls into the same trap it describes, wherein a bunch of non-mathematically-inclined "visionaries", from psychologists to linguists to architects, all jump on an ill-fated "information theory can explain everything" bandwagon without really understanding what it is that information theory can and can't do, leading to quasi-celebrity status for a (very bewildered) Shannon. This then devolves into an extended discussion of memes, from the early work of Dawkins, which met a similar fate (the Journal of Memetics was short-lived due to a complete inability of its founders to agree on exactly what belonged in it). The treatment of biological information is amazingly scant, beyond some rehashing of Dawkins and Gould, given how fundamental information theory is to the modern field of bioinformatics and the like. It then wraps up with the obligatory creation stories of Wikipedia and Google and discussions of information glut, the likes of which a Slashdot audience would already know by heart and therefore find unenlightening.

The actual information theory examples explained in the book do not go beyond the toy examples from Shannon's paper, which is itself very well written and eminently accessible if you have a little statistics and math background. So if that is what you are looking for, go straight to the source instead of reading this book. If you are looking for some neat historical anecdotes about what people used to do to save money on telegraph messages, and the dreams Ada Lovelace had of seeing a new world in her head where algorithm developers would someday rule, by all means enjoy the first five chapters; the remainder is quite forgettable, I'm afraid.

Link to Shannon's 1948 paper. http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf [bell-labs.com]

Re:Equations or love life? (1)

bmacs27 (1314285) | more than 2 years ago | (#37750154)

I'm a little confused by your comment about the ill-fated information theory. It still does dominate many fields. I know that psychophysics has benefitted greatly from it, and the people doing it are plenty "mathematically inclined."

Re:Equations or love life? (1)

robotkid (681905) | more than 2 years ago | (#37753342)

I'm a little confused by your comment about the ill-fated information theory. It still does dominate many fields. I know that psychophysics has benefitted greatly from it, and the people doing it are plenty "mathematically inclined."

I did not mean that information theory was ill-fated, but right after its publication there was an irrational jubilation that all of science was going to be solved in an "information theory" framework, which led to failed journals, societies, and hundreds of poorly thought out papers all titled "An information theory approach to ____ (insert longstanding scientific problem here)". Generally these papers took the log of some important measurement, calculated an "effective bandwidth", and maintained that this was somehow a more profound way to understand the problem (the significance of which was left to the reader's imagination).

Shannon himself complained in a 1956 editorial that "Information theory has, in the last few years, become something of a scientific bandwagon. . . . A few first rate research papers are preferable to a large number that are poorly conceived or half-finished. The latter are no credit to their writers and a waste of time to their readers."

We are way past the "information theory" bubble nowadays and the practitioners are quite mature by comparison, so it's hard to imagine the time described by these historical anecdotes. My best mental approximation would be if Craig Venter, Stephen Hawking, and Bill Gates simultaneously held a press conference announcing that some "new kind of science" (*cough cough Wolfram*) was going to unify gene therapy research, cosmology, and social welfare in sub-Saharan Africa all at once. That's how ridiculous it got.

Perhaps that is the most useful thing I got from this book: history shows us that scientific fads can be based on real, ultimately transformative breakthroughs, yet they will still look and feel like fads when people, especially out-of-field scientists, are irrationally exuberant about them.

Re:Equations or love life? (1)

bmacs27 (1314285) | more than 2 years ago | (#37754088)

Okay, but around that time there was some important work tying information theory to perception that was relatively groundbreaking work. It's still cited today, and modeling of visual cortex as "noisy channels" is still fairly widespread practice. However, maybe that makes sense because most of the common tools used in psychophysics historically came from Signal Detection Theory, and other radio operator related math.
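(For anyone unfamiliar with the jargon: a "noisy channel" in Shannon's sense has a capacity that shrinks as noise grows. A minimal sketch of my own, using the textbook binary symmetric channel rather than anything specific to visual cortex:)

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel that flips each bit with probability p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # 1.0  (noiseless: one bit per use)
print(bsc_capacity(0.5))   # 0.0  (pure noise: nothing gets through)
```

Modeling cortex that way amounts to asking what p is for a given neural pathway.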

Re:Equations or love life? (1)

robotkid (681905) | more than 2 years ago | (#37754602)

Okay, but around that time there was some important work tying information theory to perception that was relatively groundbreaking work. It's still cited today, and modeling of visual cortex as "noisy channels" is still fairly widespread practice. However, maybe that makes sense because most of the common tools used in psychophysics historically came from Signal Detection Theory, and other radio operator related math.

Agreed. IANASH (I am not a science historian), but the impression I get from the book is that it was this initial success that spawned the "me too" moments in all the other fields, which led to the bubble. So although I'm pretty sure the book does mention this as an event that happened, the science itself was certainly not presented with the same clarity and poignancy with which the author details early developments in the mathematics of logarithms and wacky precursors to the telegraph. If you can recommend an accessible history of psychophysics, I'd be all ears :-)

Is this a first? (1)

Jeng (926980) | more than 2 years ago | (#37744520)

Is this the first time that a book review got something other than an 8?

Re:Is this a first? (1)

ThorGod (456163) | more than 2 years ago | (#37744590)

Seems like it. Unless the reviewer has a book on information theory out there, I'm more inclined to believe the criticism (after all the awkwardly positive book reviews I've read on /.)

Re:Is this a first? (0)

Anonymous Coward | more than 2 years ago | (#37750674)

Thing is, three of the ten possible points depend on the book being published by Packt.

This book was just bad. (1)

Anonymous Coward | more than 2 years ago | (#37744580)

I mean, it takes a lot of talent to take something as interesting as information theory and write about it in such a bad, muddled way that I had a hard time reading it. What the book needs is a good editor to cut some of the crap and make the author rewrite some bits.

I loved this book (0)

Anonymous Coward | more than 2 years ago | (#37744686)

There were definitely a couple of sections that I totally skipped over, but in general, this book has changed the way I think about the entire world. I recommend it, but it's definitely not for everyone.

I love this book (0)

Anonymous Coward | more than 2 years ago | (#37744748)

I read "The Information" from cover to cover and am reading it again. It misses the point to say that the book could be more technical. It does present some technical detail and does present the basic quantitative ideas.

It also misses the point to say that the book is chronology challenged. The beginning is a Prologue that does quote Claude Shannon and establish his 1948 articles as central to the field. But then Gleick goes back in time to the evolution of language, written language, dictionaries. In a nutshell, the world was not ready for Shannon's information theory until it had gone through some evolution of thought and established that a message could be encoded, converted from one form to another, but still be the same message. Gleick quotes some author to the effect that in pre-literate societies, people have a vocabulary no larger than 1000 words. Writing helps to establish words as separate things, and gives them permanence. Dictionaries go a little farther in establishing words as things. Finally codes such as Morse Code, or cryptographic "secret codes" confirm that a message can be put in a different form, but remain the same message.

After telegraphy was established, and Shannon himself worked on codes during World War II, then Shannon was able to state information theory and have some audience that could appreciate what he was saying. For all that, the engineers of the phone system must have been primarily electrical engineers. They were trained in voltage, current, resistance, etc. At best it was a consciousness-raising experience when Shannon stated a theory of information. One can imagine that some engineers balked. Information was abstract and beyond their training.

We learn about Galileo and the Pope and all that as a very distant example in which science was about improved awareness. Recall that pretty good calendars existed in Roman times. It was not a shock to the Pope that some almanac calculations could be done. It was a shock when the details of Galileo's observations rose to the level that they gave a new awareness. Sometimes science is about raising awareness and a new understanding of familiar things.

But Galileo was not unique. I was 4 years old when Shannon published information theory. He raised awareness of this topic in the recent past, the middle of the 20th century. (The original articles are on the web. You could start with Wikipedia or just a search.)

I have a serious interest in applications of vision and color, and in particular the application of science to LIGHTING. So-called illuminating engineers still talk about lighting as a matter of energy flows. They totally balk at the idea that lighting exists to deliver information. The book talks at some length about things that are really about information, not energy flow. Lighting is not mentioned, so it is still up to me to draw that connection.

I recommend this book to everybody. It is a tremendous effort to raise understanding of science and the idea that sometimes science is about self-awareness.

Gleick has done a lot (1)

epine (68316) | more than 2 years ago | (#37744932)

Gleick has done some highly regarded work. I waded through some material on his web site many years ago, and felt a strong respect.

From my old notes, here's an audio interview about a previous book. A Miracle Made Lyrical: Jim Gleick's Isaac Newton [harvard.edu]

Also high praise for Chaos from I Missed the Complexity Revolution [edge.org]

I don't understand how this reviewer has never heard of Chaitin, yet finds this book vastly too elementary. Oddly, I mentioned Chaitin in an earlier post this very day. Perhaps the reviewer should tear a page out of the Roger Ebert school of criticism:

When you ask a friend if Hellboy is any good, you're not asking if it's any good compared to Mystic River, you're asking if it's any good compared to The Punisher. And my answer would be, on a scale of one to four, if Superman is four, then Hellboy is three and The Punisher is two. In the same way, if American Beauty gets four stars, then The United States of Leland clocks in at about two.

That's a lot of words to pour out without defining expectations or genre. And there are many sub-genres within science writing.

My Shorter Review of the Book (2)

tphb (181551) | more than 2 years ago | (#37745046)

A well written review from the poster. My shorter one: If you were to drop this book into a black hole, the information content of the universe would not change.

Re:My Shorter Review of the Book (0)

Anonymous Coward | more than 2 years ago | (#37750620)

You must not have read enough of the book to know how inane that statement is.

Okay, I have to say it... (1)

TheSHAD0W (258774) | more than 2 years ago | (#37745856)

We want... Information. Information! INFORMATION!!

I read the book over lunch breaks at work.. (1)

Lordfly (590616) | more than 2 years ago | (#37746006)

As someone who classifies himself as a "geek", albeit one terribly bad at math and logic, I thought it was a pretty good read. I do wish I had some more "hello, this is information theory presented in an engaging manner" books, though.

Richard Cox... (1)

rgbatduke (1231380) | more than 2 years ago | (#37746532)

One wonders if the book points out that Richard Cox derived what amounts to information theory several years prior to Shannon...

rgb

Suggested alternatives? (0)

Anonymous Coward | more than 2 years ago | (#37747226)

Okay, so this book doesn't quite hit the mark. I'm interested in information theory, but I don't have a whole lot of time to read. What would you suggest as a good alternative?

My review of the reviewer... (2)

BenSnyder (253224) | more than 2 years ago | (#37747388)

I'm reading this book and am about 60% through it... up to the part about entropy.

I get that the reviewer was looking for equations. But I found the history of everything to be wonderfully helpful in understanding the general concepts. I'm confused as to why he's confused that Gleick is giving a history of information theory and not a discourse on it.

I give the book 4 out of 5 and the reviewer 2 out of 5.

Re:My review of the reviewer... (0)

Anonymous Coward | more than 2 years ago | (#37747572)

I agree. He gives the book a bad rap. The first third is excellent, and the rest of it is decent. Of course there are better books on information theory itself, but it's about more than that. The way this book is laid out really teaches you to wear information goggles in a way I hadn't before. It's survey material that gives people an idea of what to look for if they want to go deeper, and what areas of research are available.

5/10 is incredibly harsh, and doesn't even match the tone of the writing of the review.

The reviewer says that it is not the ideal book for those with other books on information or who are not in college. Well in my mind that is a pretty large audience. If we want the general public to understand these things, which are so fundamental to the modern age, there should be loads of similar books aimed at them. But there really aren't. Gleick's book fills an important need in my opinion.

Maybe the real problem the reviewer had with the book is the title?

Re:My review of the reviewer... (1)

thermochap1 (2488050) | more than 2 years ago | (#37756292)

The title and cover page are pretty bad. And I agree that the book has an audience in spite of the primary reviewer's reservations. After all, we do need to raise the level of awareness in as many ways as possible, and most seem to agree that the book has nice stories and was pleasantly aware of Walter Ong's work. There are also other things that Gleick is clueless about. On the history side, Gleick did not have a clear picture of either Gibbs' or Jaynes' contributions to Bayesian inference as applicable beyond thermal physics. As far as the present is concerned, these omissions underpin an ignorance of the non-local nature of information as a multi-moment correlation measure (KL-divergence). This ignorance is shared by many from all walks of life, in spite of its relevance, e.g., to complex systems and to quantum computing. Sharon McGrayne's "The Theory That Would Not Die" provides some other pieces of the still incomplete puzzle for the lay reader. Look for more pieces in the days ahead.

shannon versus wiener (0)

Anonymous Coward | more than 2 years ago | (#37750796)

i disagree that it's a wonderful world full of shannon, and that cybernetics is "defunct". information theory as the basis of TRANSMISSION OF KNOWN MESSAGES is great. but the consequence is that information technology is wielded mostly as an agent for industrial-age thinking: break it into small pieces to make it manageable and efficient. as a result it loses context, and also becomes so cheap, almost free, so we are drowning in too many messages. this makes the cost of productive conversations go up; it's too "expensive" --- in terms of time and attention, at a minimum --- to converse. this isn't shannon's fault --- he understood the limits of his own theory, unlike his co-author, weaver, who claimed semantic extensions to shannon's calculations of redundancy versus reliability. wrong.

meantime, cybernetic loops are the basis for intelligent action in biological systems, medicine, TQM, design, etc. extensions of cybernetic theory into conversation afford pragmatic models for designers. gleick is living in a previous century, maybe even back more than 1.
