James Gleick’s compact history of information is bigger on the inside

[The Information: A History, a Theory, a Flood was published in 2011, but I read it only recently. It impressed me enough that I’ve decided to share here, with slight revisions, the comments—not intended as a full review—that I posted on Goodreads.]

James Gleick doesn’t address the beginning of human language in this book, but he seems to cover all the major developments in human communication since then. The origin and impact of writing, the challenge of conveying messages across a distance (exemplified by the talking drums of Africa as well as by the varieties of telegraph), the invention of printing, the development of information theory, the rise of computing, aspects of the history of dictionaries and encyclopedias, including the growth of Wikipedia: all that is here, along with discussions of mathematical logic, codes and code-breaking, quantum theory, cybernetics, genetics, memetics, and info-glut (a current complaint with a surprisingly long history).

Does that sound like a morass of abstraction? Quite likely. But the book is an orderly thing—quite fitting, given that order and its opposite number, entropy, weigh heavily here; it’s clear, elegant, and in many ways quite down-to-earth. It has a story to tell, which progresses more or less chronologically from prehistory to the present. It even has a hero: Claude Shannon, a mathematician employed by Bell Telephone Laboratories who, in the middle of the 20th century, founded the entire field of information theory. You may not know his name, but you will have heard one of his ideas; in a groundbreaking paper published in 1948, the same year Bell Labs announced the transistor, Shannon proposed that units of information “may be called binary digits, or more briefly, bits.” (A tantalizing endnote reports that Shannon got the term from a Bell Labs researcher named John W. Tukey.)

As Gleick unfolds his tale, one begins to grasp what he’s getting at in his oddly worded title. Information is history, in that history is essentially a set of recorded statements, but it’s also something that has a history, and it’s a theory as well, and it’s another name for the sense impressions, including written words, with which the world increasingly inundates us. About halfway through, Gleick gives us Shannon’s central equation[1] (carefully explaining the summation sign and the rest of the right side)—

H = -\sum_i p_i \log_2 p_i

—and goes on to say that H, the quantity it defines, is “conventionally called the entropy of a message, or the Shannon entropy, or, simply, the information.” Hence the first two words of the title. The world is a message; this book is Gleick’s way of coming to grips with what that means.
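
Not from the book, but to make the formula concrete: here is a minimal Python sketch that estimates H for a message, treating each symbol’s relative frequency as its probability (the function name is mine).

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits per symbol."""
    counts = Counter(message)
    total = len(message)
    # Each symbol's relative frequency stands in for its probability p_i.
    return -sum((n / total) * log2(n / total) for n in counts.values())

print(shannon_entropy("0101010101"))  # two equally likely symbols: 1.0 bit per symbol
print(shannon_entropy("aaaaaaaaaa"))  # no variation: -0.0, i.e. zero bits
```

A string of fair coin flips carries the maximum one bit per symbol; a string that never varies carries none, which is the formula’s way of saying that a message with no surprises tells you nothing.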

Here and there, one might wish for slight differences in the treatment. Gleick emphasizes the evanescence of oral communication and the persistence of writing, but both could stand to be qualified. Oral expressions didn’t always vanish on the wind, else Homer’s work wouldn’t have survived to be recorded, nor would cultural memories of a great flood. Likewise, the written word can be impermanent; Gleick does make this point, by reference to the burning of the library at Alexandria, but he does so late in the book, a good ways after his discussion of the transition from orality to writing.

Though Gregory Bateson makes a brief appearance, Gleick doesn’t include Bateson’s memorable suggestion that a piece of information is any difference that makes a difference. (There’s some dispute over exactly what Bateson said and exactly what it meant, but its import seems solid enough to me.) The concept Bateson was trying to elucidate nonetheless comes through clearly; it’s implicit in telegraphic transmissions, for instance, which employ a difference between sound and silence and, where there is sound, a difference in duration between a dot and a dash. And Gleick does quote—among many other choice gleanings—the version of the Second Law of Thermodynamics that Tom Stoppard conceived for a character in Arcadia, “You cannot stir things apart,” which ranks as one of the best plain-language statements of a scientific principle yet devised.

Gleick has accomplished something important here, drawing together and distilling far-flung, seemingly unrelated elements, casting all our recent talk of an information age in a far broader context. Who would expect to find in one volume Robert Burton, Charles Babbage, Jorge Luis Borges, and Francis Crick? (Very few women appear, among them Ada Lovelace.) In a way, the book does what science itself does: it reveals an unseen order. I found it wondrous even though I already knew much of this in outline. The Information, with 426 pages of text, feels enormous (but isn’t—like Doctor Who’s vehicle, it’s bigger on the inside) and complete, yet it may leave you wanting more. Readers who do can pursue Gleick’s sources in the notes and the bibliography, which together occupy 74 further pages.

[1] I’ve reproduced the equation as Gleick gives it. It sometimes appears in slightly different forms, such as here.
