How Information Became Measurable
For most of history, people thought about communication in terms of meaning. A message mattered because of what it said, who sent it, and how it was understood. Then, in 1948, Claude Shannon's paper "A Mathematical Theory of Communication" changed the picture. He showed that information could be treated as something measurable, separate from meaning, and this turned communication into a scientific problem.
Shannon asked a simple but powerful question: how can a message produced at one point be reproduced, exactly or approximately, at another? To answer it, he ignored emotion, intention, and interpretation. He focused instead on signals, choices, and uncertainty. This allowed him to define the bit, the basic unit of information, as the amount of information carried by a single choice between two equally likely alternatives.
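To make that definition concrete, here is a minimal Python sketch of Shannon's entropy formula, H = -Σ p·log₂(p), which measures information in bits. The function name and the example distributions are illustrative, not from the original text.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip -- one choice between two equally likely alternatives --
# carries exactly one bit of information.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each flip carries less information.
print(entropy([0.9, 0.1]))   # about 0.469

# Eight equally likely alternatives take three binary choices to pin down.
print(entropy([1/8] * 8))    # 3.0
```

The numbers capture the core intuition: the less predictable a source is, the more information each message from it carries.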
That idea made many different systems suddenly look alike. A telephone call, a telegraph message, a radio signal, and a coded military transmission all became versions of the same thing. They were all ways of turning messages into signals, moving them through a channel, and protecting them from noise. Once information could be counted, engineers could think clearly about compression, error correction, and efficiency.
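As a toy illustration of error correction (a simple repetition code, not Shannon's own construction), the sketch below sends each bit three times; a majority vote at the receiver survives any single flipped symbol per triple, at the cost of tripling the channel use. That trade between reliability and efficiency is exactly what Shannon's theory lets engineers quantify.

```python
def encode_repetition(bits, n=3):
    """Protect each bit against noise by repeating it n times."""
    return [b for bit in bits for b in [bit] * n]

def decode_repetition(received, n=3):
    """Recover each original bit by majority vote over its n copies."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

message = [1, 0, 1, 1]
sent = encode_repetition(message)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[1] ^= 1                               # noise flips one channel symbol
assert decode_repetition(sent) == message  # the flip is voted away
```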
This new way of thinking spread far beyond engineering. Scientists began to describe genes as coded instructions, brains as processors of signals, and even the physical universe as something deeply tied to information. The modern world did not just gain more messages. It gained a new belief that information itself is one of the basic features of reality.