which Shannon’s theory is pointedly not about.
    John von Neumann of Princeton’s Institute for Advanced Study advised Shannon to use the word entropy . Entropy is a physics term loosely described as a measure of randomness, disorder, or uncertainty. The concept of entropy grew out of the study of steam engines. It was learned that it is impossible to convert all the random energy of heat into useful work. A steam engine requires a temperature difference to run (hot steam pushing a piston against cooler air). With time, temperature differences tend to even out, and the steam engine grinds to a halt. Physicists describe this as an increase in entropy. The famous second law of thermodynamics says that the entropy of the universe is always increasing. Things run down, fall apart, get used up.
    Use “entropy” and you can never lose a debate, von Neumann told Shannon—because no one really knows what “entropy” means. Von Neumann’s suggestion was not entirely flippant. The equation for entropy in physics takes the same form as the equation for information in Shannon’s theory. (Both are logarithms of a probability measure.)
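    The correspondence is worth seeing side by side. The equations below do not appear in the text; they are the standard textbook forms, written in LaTeX as a gloss on the parenthetical above. For a message source whose symbols occur with probabilities p_i, Shannon's measure is

        H = -\sum_i p_i \log_2 p_i

    while the Gibbs entropy of a physical system whose microstates have probabilities p_i is

        S = -k_B \sum_i p_i \ln p_i

    The two differ only in the base of the logarithm and in the constant k_B (Boltzmann's constant), which is the formal parallel the text refers to.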
    Shannon accepted von Neumann’s suggestion. He used both the word “entropy” and its usual algebraic symbol, H . Shannon later christened his Massachusetts home “Entropy House”—a name whose appropriateness was apparent to all who set eyes on its interior.
    “I didn’t like the term ‘information theory,’” Robert Fano said. “Claude didn’t like it either.” But the familiar word “information” proved too appealing. It is this term that has stuck, both for Shannon’s theory and for its measure of message content.



The Bandwagon
     
    SHANNON WENT FAR BEYOND the work of his precursors. He came up with results that surprised everyone. They seemed almost magical then. They still do.
    One of these findings is that it is possible, through the encoding of messages, to use virtually the entire capacity of a communication channel. This was surprising because no one had come anywhere close to that in practice. No conventional code (Morse code, ASCII, “plain English”) is anywhere near as efficient as the theory said it could be.
    It’s as if you were packing bowling balls into an orange crate. You’re going to find that there’s a lot of unused space no matter how you arrange the bowling balls, right? Imagine packing bowling balls so tightly that there’s no empty space at all—the crate is filled 100 percent with bowling balls. You can’t do this with bowling balls and crates, but Shannon said you can do it with messages and communications channels.
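    One way to get a feel for the gap the text describes is to measure it. The short sketch below is my own illustration, not anything from the book: it computes the Shannon entropy of a snippet of text from its letter frequencies and compares that with the fixed 8 bits per character a plain byte encoding spends. (Shannon's own later estimates, which also account for context, put English closer to one bit per letter.)

from collections import Counter
from math import log2

def entropy_bits_per_symbol(text):
    # Shannon entropy H = -sum(p * log2(p)) over the letter frequencies in `text`.
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * log2(c / total) for c in counts.values())

message = "pick up shampoo"
h = entropy_bits_per_symbol(message)
print(f"{h:.2f} bits/char from letter frequencies vs. 8 bits/char for plain bytes")

    Even this crude, single-letter count comes in at well under half the byte encoding's cost, and cleverer codes close more of the gap. That is the sense in which ordinary codes leave the crate mostly empty.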
    Another unexpected finding involves noise. Prior to Shannon, the understanding was that noise could be minimized by using up more bandwidth. To give a simple example, you might take the precaution of sending the same message three times (Pick up shampoo—Pick up shampoo—Pick up shampoo). Maybe the other person receives Pick up shampoo—Pick up Shamu—Pick up shampoo. By comparing the three versions, the recipient can identify and correct most noise errors. The drawback is that this eats up three times the bandwidth.
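    The send-it-three-times scheme is simple enough to write down. The sketch below is an illustration of the idea in the paragraph above (the function names are mine): every bit is transmitted three times, and the receiver takes a majority vote, so any single corrupted copy is repaired, at the price of tripling the bandwidth.

def encode(bits):
    # Repeat every bit three times before transmission.
    return [b for b in bits for _ in range(3)]

def decode(received):
    # Majority vote over each group of three copies.
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)           # 12 bits on the wire for a 4-bit message
sent[4] ^= 1                     # noise corrupts one copy ("Shamu" for "shampoo")
assert decode(sent) == message   # the vote still recovers the original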
    Shannon proved that you can have your cake and eat it too. It is possible to encode a message so that the chance of noise errors is as small as desired—no matter how noisy the channel—and do this without using any additional bandwidth. This defied the common sense of generations of engineers. Robert Fano remarked,
    To make the chance of error as small as you wish? Nobody had ever thought of that. How he got that insight, how he even came to believe such a thing, I don’t know. But almost all modern communication engineering is based on that work.
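    Shannon's proof was non-constructive, and the codes it promises are not easy to exhibit, but the flavor of getting more protection for less bandwidth than repetition can be seen in a small example. The sketch below is a Hamming(7,4) code, an early Bell Labs invention rather than anything described in the book: it protects four data bits with three check bits, seven transmitted bits instead of the twelve that triple repetition needs, and still corrects any single flipped bit.

def hamming_encode(data):
    # Hamming(7,4): four data bits, three parity bits, laid out as positions 1-7.
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(received):
    r = list(received)
    p1, p2, d1, p3, d2, d3, d4 = r
    # Re-check each parity equation; the three results spell out the position
    # (1-7) of a single flipped bit, or 0 if the word arrived clean.
    s1 = p1 ^ d1 ^ d2 ^ d4
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    error_pos = s1 + 2 * s2 + 4 * s3
    if error_pos:
        r[error_pos - 1] ^= 1
    return [r[2], r[4], r[5], r[6]]

data = [1, 0, 1, 1]
word = hamming_encode(data)
word[5] ^= 1                     # noise flips one of the seven bits
assert hamming_decode(word) == data

    Seven bits for four is already a better bargain than twelve for four; Shannon's theorem says that, with long enough code words and a rate kept below the channel's capacity, the chance of error can be driven as low as you please.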
     
    Initially it was hard to imagine how Shannon’s results would be used. No one in the 1940s pictured a day when people would navigate supermarket aisles with a mobile phone pressed to the side of their face. Bell Labs’ John Pierce had his doubts about
