A Mathematical Theory of Communication

"A Mathematical Theory of Communication" is an article by
mathematician A mathematician is someone who uses an extensive knowledge of mathematics in their work, typically to solve mathematical problems. Mathematicians are concerned with numbers, data, quantity, mathematical structure, structure, space, Mathematica ...
Claude E. Shannon published in '' Bell System Technical Journal'' in 1948. It was renamed ''The Mathematical Theory of Communication'' in the 1949 book of the same name, a small but significant title change after realizing the generality of this work. It has tens of thousands of citations, being one of the most influential and cited scientific papers of all time, as it gave rise to the field of
information theory Information theory is the mathematical study of the quantification (science), quantification, Data storage, storage, and telecommunications, communication of information. The field was established and formalized by Claude Shannon in the 1940s, ...
, with ''
Scientific American ''Scientific American'', informally abbreviated ''SciAm'' or sometimes ''SA'', is an American popular science magazine. Many scientists, including Albert Einstein and Nikola Tesla, have contributed articles to it, with more than 150 Nobel Pri ...
'' referring to the paper as the "
Magna Carta (Medieval Latin for "Great Charter"), sometimes spelled Magna Charta, is a royal charter of rights agreed to by King John of England at Runnymede, near Windsor, on 15 June 1215. First drafted by the Archbishop of Canterbury, Cardin ...
of the
Information Age The Information Age is a historical period that began in the mid-20th century. It is characterized by a rapid shift from traditional industries, as established during the Industrial Revolution, to an economy centered on information technology ...
", while the electrical engineer Robert G. Gallager called the paper a "blueprint for the digital era". Historian James Gleick rated the paper as the most important development of 1948, placing the
transistor A transistor is a semiconductor device used to Electronic amplifier, amplify or electronic switch, switch electrical signals and electric power, power. It is one of the basic building blocks of modern electronics. It is composed of semicondu ...
second in the same time period, with Gleick emphasizing that the paper by Shannon was "even more profound and more fundamental" than the transistor. It is also noted that "as did relativity and quantum theory, information theory radically changed the way scientists look at the universe". The paper also formally introduced the term " bit" and serves as its theoretical foundation.


Publication

The article was the founding work of the field of information theory. It was later published in 1949 as a book titled ''The Mathematical Theory of Communication'', which was issued as a paperback in 1963. The book contains an additional article by Warren Weaver, providing an overview of the theory for a more general audience.


Contents

This work is known for introducing the concepts of channel capacity and the noisy-channel coding theorem. Shannon's article laid out the basic elements of communication:
*An information source that produces a message
*A transmitter that operates on the message to create a signal which can be sent through a channel
*A channel, which is the medium over which the signal, carrying the information that composes the message, is sent
*A receiver, which transforms the signal back into the message intended for delivery
*A destination, which can be a person or a machine, for whom or which the message is intended
It also developed the concepts of information entropy, redundancy and the source coding theorem, and introduced the term bit (which Shannon credited to John Tukey) as a unit of information. The paper also proposed the Shannon–Fano coding technique, developed in conjunction with Robert Fano.
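
As a brief illustration of two of the concepts above, the following Python sketch computes the entropy of a hypothetical four-symbol source and builds a Shannon–Fano code for it. The symbol set and probabilities are invented for the example and are not taken from the paper; the average codeword length it prints (1.9 bits per symbol) stays above the source entropy (about 1.85 bits per symbol), as the source coding theorem requires.

import math

def entropy(probs):
    # Shannon entropy H = -sum(p * log2 p), in bits per symbol.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def shannon_fano(symbols):
    # symbols: list of (symbol, probability) pairs sorted by descending probability.
    # Recursively split the list where the two halves' total probabilities are
    # closest, prefixing the left half with "0" and the right half with "1".
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    running, best_split, best_diff = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):
        running += symbols[i - 1][1]
        diff = abs(total - 2 * running)
        if diff < best_diff:
            best_split, best_diff = i, diff
    codes = {}
    for sym, code in shannon_fano(symbols[:best_split]).items():
        codes[sym] = "0" + code
    for sym, code in shannon_fano(symbols[best_split:]).items():
        codes[sym] = "1" + code
    return codes

source = [("a", 0.4), ("b", 0.3), ("c", 0.2), ("d", 0.1)]   # hypothetical source
codes = shannon_fano(source)
avg_len = sum(p * len(codes[s]) for s, p in source)
print("entropy:", round(entropy(p for _, p in source), 3))  # 1.846 bits/symbol
print("codes:  ", codes)                                    # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print("average codeword length:", round(avg_len, 3))        # 1.9 bits/symbol

The split rule used here (choose the point where the two halves' total probabilities are closest) is one common textbook formulation of the Shannon–Fano procedure, given only as a sketch of the idea rather than a reproduction of the paper's own treatment.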



External links


(PDF) "A Mathematical Theory of Communication" by C. E. Shannon
(reprint with corrections) hosted by th
Harvard Mathematics Department
at
Harvard University Harvard University is a Private university, private Ivy League research university in Cambridge, Massachusetts, United States. Founded in 1636 and named for its first benefactor, the History of the Puritans in North America, Puritan clergyma ...
** Original publications: ,
Khan Academy video about "A Mathematical Theory of Communication"