timeline of information theory

Information theory was founded by Claude Shannon in 1948 with an article published in the Bell System Technical Journal. The article dealt with the mathematical engineering problem of transmitting information reliably over noisy channels. Shannon named the unpredictability of information “entropy” and analyzed it with statistical techniques similar to those used for thermodynamic entropy. Boltzmann had described thermodynamic entropy using statistical mechanics and probability, modeling how random collisions among molecules drive a gas toward its equilibrium distribution. Shannon applied the same mathematical ideas to quantifying the integrity of information in a “noisy” environment.
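
For reference, Shannon's measure can be stated compactly. For a source that emits symbols x with probabilities p(x), the entropy, in bits per symbol, is

H(X) = -\sum_{x} p(x) \log_2 p(x)

A fair coin toss, for example, has H = 1 bit of uncertainty, while a two-headed coin has H = 0: a perfectly predictable source conveys no information.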

1833 – Gauss and Weber used an electromagnetic telegraph for communications

1837 – Cooke and Wheatstone developed the first commercial electric telegraph

1837 – Morse patented an electric telegraph and, with his assistant Alfred Vail, developed the “Morse code” alphabet encoding system

1843 – Bain invented the first facsimile machine

1872 – Boltzmann formulated his “H-theorem”, a statistical and probabilistic description of the entropy of a gas

1927 – von Neumann extended the statistical treatment of entropy to quantum mechanics

1947 – Hamming developed error-detecting and error-correcting codes (published in 1950)

1948 – Shannon published “A Mathematical Theory of Communication”, describing information uncertainty with statistics in a way that parallels the treatment of entropy in thermodynamics. The paper also marked the first published use of the term “bit” for a fundamental unit of information.

1951 – Huffman devised Huffman coding for data compression (a short sketch follows this timeline)

1962 – Telstar 1, the first active communications satellite, relayed the first live transatlantic television transmission

1967 – Viterbi published the Viterbi algorithm for decoding convolutional codes

1977 – Lempel and Ziv published the LZ77 dictionary-based data compression algorithm
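
The Huffman entry above lends itself to a brief illustration. The Python sketch below is not from the original article, and the function name and example string are our own; it captures the core of Huffman's construction: repeatedly merge the two least-frequent subtrees, prefixing their codes with 0 and 1, until a single prefix-code tree remains.

    import heapq
    from collections import Counter

    def huffman_code(text):
        """Return a {symbol: bitstring} prefix-code table for `text`."""
        freq = Counter(text)
        if len(freq) == 1:                       # degenerate single-symbol source
            return {next(iter(freq)): "0"}
        # Heap entries: (frequency, tie-breaker, partial code table).
        heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freq.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            f1, _, t1 = heapq.heappop(heap)      # two least-frequent subtrees
            f2, _, t2 = heapq.heappop(heap)
            merged = {ch: "0" + c for ch, c in t1.items()}
            merged.update({ch: "1" + c for ch, c in t2.items()})
            heapq.heappush(heap, (f1 + f2, tie, merged))
            tie += 1
        return heap[0][2]

    codes = huffman_code("abracadabra")
    encoded = "".join(codes[ch] for ch in "abracadabra")
    print(codes)                                 # exact codes vary with tie-breaking
    print(len(encoded), "bits vs", 8 * len("abracadabra"), "bits uncompressed")

For “abracadabra” this yields 23 bits against 88 bits of 8-bit ASCII (ignoring the cost of the code table), reflecting Huffman coding's principle of giving shorter codes to more frequent symbols.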

SEE ALSO:
timeline of artificial intelligence
