1949 full book edition

| Author | Claude E. Shannon |
|---|---|
| Language | English |
| Subject | Communication theory |
| Publication date | 1948 |
| Publication place | United States |
"A Mathematical Theory of Communication" is an article bymathematicianClaude Shannon published inBell System Technical Journal in 1948.[1][2][3][4] It was renamedThe Mathematical Theory of Communication in the 1949 book of the same name,[5] a small but significant title change after realizing the generality of this work. It has tens of thousands of citations, being one of the most influential and cited scientific papers of all time,[6] as it gave rise to the field ofinformation theory, withScientific American referring to the paper as the "Magna Carta of theInformation Age",[7] while the electrical engineerRobert G. Gallager called the paper a "blueprint for the digital era".[8] HistorianJames Gleick rated the paper as the most important development of 1948, placing thetransistor second in the same time period, with Gleick emphasizing that the paper by Shannon was "even more profound and more fundamental" than the transistor.[9]
It is also noted that "as did relativity and quantum theory, information theory radically changed the way scientists look at the universe".[10] The paper also formally introduced the term "bit" and serves as its theoretical foundation.[11]
The article was the founding work of the field of information theory. It was later published in 1949 as a book titled The Mathematical Theory of Communication (ISBN 0-252-72546-8), which was published as a paperback in 1963 (ISBN 0-252-72548-4). The book contains an additional article by Warren Weaver, providing an overview of the theory for a more general audience.[12]

This work is known for introducing the concepts of channel capacity as well as the noisy channel coding theorem.
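Channel capacity can be made concrete with the closed-form expression for a band-limited channel with additive white Gaussian noise, the Shannon–Hartley theorem, C = B·log₂(1 + S/N). The sketch below is illustrative only; the function name and the example figures (a 3 kHz telephone-grade channel at 30 dB SNR) are assumptions, not taken from the paper.

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity in bits per second for an AWGN channel.

    Illustrative sketch (not code from the source): C = B * log2(1 + S/N),
    where snr_linear is the signal-to-noise power ratio (not in dB).
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: a 3 kHz channel with 30 dB SNR (S/N = 1000)
# supports roughly 29,900 bits per second.
capacity = channel_capacity(3000, 1000)
print(round(capacity))
```

Doubling the bandwidth doubles the capacity, while doubling the signal power only adds a fixed increment, which is why the logarithm appears.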
Shannon's article laid out the basic elements of communication:

- An information source, which produces a message
- A transmitter, which operates on the message to create a signal that can be sent through a channel
- A channel, the medium over which the signal is sent
- A receiver, which transforms the signal back into the message
- A destination, which can be a person or a machine, for whom or which the message is intended
It also developed the concepts of information entropy, redundancy and the source coding theorem, and introduced the term bit (which Shannon credited to John Tukey) as a unit of information. It was also in this paper that the Shannon–Fano coding technique was proposed, a technique developed in conjunction with Robert Fano.
The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey.
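The role of the logarithmic base can be shown with Shannon's entropy formula, H = −Σ p·log₂ p, where base 2 makes the unit the bit. This is a minimal illustrative sketch (the function name and examples are mine, not from the paper), computing the entropy of the empirical symbol distribution of a string:

```python
import math
from collections import Counter

def shannon_entropy_bits(message: str) -> float:
    """Entropy of a message's empirical symbol distribution, in bits.

    Illustrative sketch: H = -sum(p * log2(p)) over symbol
    probabilities p; using log base 2 makes the unit the bit.
    """
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Two equally likely symbols carry exactly 1 bit each:
print(shannon_entropy_bits("HT"))      # 1.0
# A constant message carries no information:
print(shannon_entropy_bits("AAAA"))    # 0.0 (up to sign of zero)
```

Choosing `math.log10` instead would give the answer in decimal digits (hartleys), and the natural logarithm would give nats, which is exactly the base-to-unit correspondence the quoted passage describes.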