Engineering Theory and Mathematics in the Early Development of Information Theory

Lav R. Varshney
School of Electrical and Computer Engineering, Cornell University
The History of Information Theory
Information theory was formulated in 1948 by Claude E. Shannon. Two social groups, communications engineering theorists and mathematical scientists, made significant contributions to information theory in the late 1940s and early 1950s. The socially constructed meanings of information theory held by members of the groups, rather than their academic credentials, serve to distinguish the two groups. The relationship between mathematicians and engineering theorists in the development of information theory was marked by mutual interaction, synecdochic of the relationship between science and technology in electronics.

Information Theory as Science

Some engineering theorists constructed information theory as a type of science that would allow measurement and characterization of various sources and channels. This led to empirical and experimental characterization of information sources such as television pictures, human speech, and printed English text (Kretzmer, Bell System Technical Journal, 1952). It also led to characterization of various existing modulation systems, determining their information rates and comparing them with the optimum, the channel capacity.
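As a minimal sketch of what this kind of empirical source characterization involves, the following computes a first-order entropy estimate for printed English from letter frequencies, using Shannon's entropy formula given under his initial formulation below. The particular frequency values and the truncation to a few common letters are illustrative assumptions, not data from Kretzmer or Shannon; accounting for inter-letter dependencies, as Shannon did, drives the estimate well below any first-order figure.

```python
from math import log2

# Illustrative relative frequencies for a few common English letters,
# renormalized to sum to 1 (assumed values, for demonstration only).
freqs = {"e": 0.127, "t": 0.091, "a": 0.082, "o": 0.075, "i": 0.070,
         "n": 0.067, "s": 0.063, "h": 0.061, "r": 0.060}
total = sum(freqs.values())
p = {letter: f / total for letter, f in freqs.items()}

# First-order entropy H = -sum p(x) log2 p(x), in bits per letter.
H = -sum(px * log2(px) for px in p.values())
print(f"Estimated first-order entropy: {H:.2f} bits per letter")
```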
Shannon’s Initial Formulation
Shannon's "A Mathematical Theory of Communication" was a pioneering work, the culmination of the mathematization of communication, yet so unique that it may be considered the work of a "heroic inventor." It was a synthesis of communications engineering theory and mathematical science ideas, and was perceived by Shannon as "a branch of mathematics, a strictly deductive system." Its main results include a general formulation of a communication system, together with ways to measure the amount of information generated by the source and the capacity of the noisy channel.
H = −∑ p(x) log p(x)    Entropy, the amount of information generated by a source.

C = W log(1 + P/N)    Channel capacity of an additive white Gaussian noise channel with bandwidth W, signal power P, and noise power N.
Information can be sent with arbitrarily small error if the source entropy H is less than the channel capacity C.
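A worked instance of these two results, with illustrative numbers that are assumptions rather than figures from the poster: a telephone-line-like channel with W = 3000 Hz and P/N = 1000 has capacity C = 3000 log2(1001) ≈ 29,900 bits per second, so a source generating information at any rate below C can in principle be transmitted with arbitrarily small error.

```python
from math import log2

def awgn_capacity(bandwidth_hz: float, snr: float) -> float:
    """Shannon capacity C = W log2(1 + P/N) of an AWGN channel, in bits/s."""
    return bandwidth_hz * log2(1 + snr)

# Illustrative telephone-line-like numbers (assumed for demonstration).
W, snr = 3000.0, 1000.0      # bandwidth in Hz, signal-to-noise ratio P/N
C = awgn_capacity(W, snr)
source_rate = 20_000.0       # assumed source information rate, bits/s

print(f"C = {C:.0f} bits/s")
print(f"Reliable transmission possible: {source_rate < C}")
```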
Information Theory as an Ideal for Communication System Design

Other engineering theorists viewed information theory as an ideal to work towards in the construction of communication systems. Notable successes came in code design, including the asymptotically optimal codes of Rice (Bell System Technical Journal, 1950) and the optimal instantaneous Huffman source code, sketched below. Incorporation of information-theoretic ideas into actual systems was limited by complexity and latency constraints.
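To make the Huffman example concrete, here is a minimal sketch of Huffman's procedure, which repeatedly merges the two least probable subtrees, applied to an assumed toy source alphabet; it illustrates the algorithm and is not a reconstruction of any code from the period.

```python
import heapq
from itertools import count

def huffman_code(probs: dict) -> dict:
    """Build an optimal instantaneous (prefix-free) binary code by
    repeatedly merging the two least probable subtrees."""
    tiebreak = count()  # distinct integers avoid comparing dicts on ties
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # least probable subtree
        p1, _, c1 = heapq.heappop(heap)  # next least probable subtree
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

# Assumed toy source, for illustration only.
probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(code)                              # an optimal prefix code
print(f"average length = {avg_len:.2f} bits/symbol")  # close to the entropy
```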
Popular conceptions of information theory: Feldman, Bell Laboratories Record (1953); Bello, Fortune (1953).
Mathematicians' Conceptions of Information Theory

At first, mathematicians showed very little interest in information theory and doubted its importance. The developing relationship between information theory and algebraic coding theory established information theory as a true mathematical discipline in the eyes of mathematicians. After a few years, mathematicians developed proofs more satisfactory to them, adding rigor to what they viewed as an important yet incomplete engineer's sketch.
Conclusions

The meanings of information theory adopted by the various social groups during the formative period defined research directions at the time and for decades thereafter. Mutual interaction between engineering theorists and mathematicians led to developments in information theory that are rigorous, practical, and important in the design of electronic communications technology.

2004 IEEE Conference on the History of Electronics, Bletchley Park, England, June 28-30, 2004