The Information

A History, a Theory, a Flood

James Gleick

Brief summary

The Information reveals how thinkers from Claude Shannon to Alan Turing redefined reality by treating information not as meaning, but as a measurable physical quantity that underpins everything from DNA to the universe itself.

Who it's for

This book is for anyone curious about the intellectual history of the digital age and the scientific concepts that define our world.

Claude Shannon and the Mathematical Theory of Information

The fundamental challenge of communication is reproducing at one point, with accuracy, a message created at another. For most of history, this process was viewed through the lens of human meaning: the words, sounds, and ideas being exchanged. In 1948, however, Bell Labs researcher Claude Shannon transformed this perspective by treating information as a physical quantity that could be measured once stripped of its subjective meaning. While the world celebrated the invention of the transistor that same year, Shannon's mathematical theory of communication provided the intellectual blueprint for the digital age. He introduced the "bit" as a universal unit of measurement, placing it alongside the inch and the pound as a fundamental way to quantify the world.

Before this shift, communication was tied to specific mediums like the telephone's electrical waveforms or the telegraph's dots and dashes. Shannon, who possessed a unique background in both electrical engineering and symbolic logic, saw a deeper unity. He realized that whether a message was a television signal, a spoken word, or a secret code, it could be reduced to a series of binary choices. This abstraction allowed him to bridge the gap between uncertainty and order. By focusing on the efficiency of transmission rather than the content of the message, he developed methods to compress data and eliminate the noise that traditionally distorted long-distance communication.
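Shannon's central measure, entropy, captures this idea of binary choices directly: the average number of yes-no questions (bits) needed per symbol of a message. As a minimal illustrative sketch (the function name and sample strings here are our own, not from the book), the formula H = −Σ p·log₂(p) can be computed like this:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits (Shannon's H)."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A fair coin-flip sequence carries exactly 1 bit per symbol.
print(shannon_entropy("HTHTHTHT"))  # → 1.0

# A heavily biased stream is more predictable, so it carries
# less information per symbol and can be compressed further.
print(shannon_entropy("AAAAAAAB"))  # ≈ 0.544
```

The second result hints at why compression works: the less surprising a stream is, the fewer bits are truly needed to transmit it.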

This new understanding of information quickly traveled beyond telephone laboratories to redefine other branches of science. Biology became an information science as researchers recognized DNA as a sophisticated code: a set of instructions for building and operating a living organism. In this view, the body is an information processor and the gene is a vehicle for data. Even physics underwent a radical change, with some scientists proposing that the universe itself is composed of information at its most basic level, suggesting that physical reality arises from the posing of yes-no questions. From the behavior of subatomic particles to the vast movements of financial markets, the flow and storage of bits became a primary lens through which we understand existence.

The evolution of human culture is essentially the story of information becoming aware of itself. Each technological leap, from the alphabet and printing press to the computer, has expanded our capacity to store and manipulate knowledge. We have moved from gathering food to gathering data, recognizing that information is not just a tool we use, but the vital principle that runs the world.

About the author

James Gleick

James Gleick is an American author, journalist, and historian of science whose work chronicles the cultural impact of modern technology. A former reporter and editor for *The New York Times*, he is renowned for his ability to explain complex subjects through narrative nonfiction, with several of his books earning nominations for the Pulitzer Prize and National Book Award. His influential works explore topics ranging from chaos theory to the history of information, have been translated into more than thirty languages, and have made scientific concepts like the "butterfly effect" widely known.
