Information Theory is one of the few scientific fields fortunate enough to have an identifiable beginning: Claude Shannon's 1948 paper. The story of how it progressed from a single theoretical paper to a broad field that has redefined our world is a fascinating one. It provides the opportunity to study the social, political, and technological interactions that have helped guide its development and define its trajectory, and it gives us insight into how a new field evolves.
We often hear Claude Shannon called the father of the Digital Age. At the beginning of his paper, Shannon acknowledges the work done before him by such pioneers as Harry Nyquist and R.V.L. Hartley at Bell Labs in the 1920s. Though their influence was profound, the work of those early pioneers was limited in scope and focused on their own particular applications. It was Shannon's unifying vision that revolutionized communication and spawned the multitude of communication research that we now define as the field of Information Theory.
One of Shannon's key contributions was his definition of the limit for channel capacity. Like Moore's Law, the Shannon limit can be considered a self-fulfilling prophecy. It is a benchmark that tells people what can be done and what remains to be done, compelling them to achieve it.
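To make the idea of the limit concrete, here is a minimal sketch of the Shannon–Hartley capacity formula, C = B·log2(1 + S/N), for an additive white Gaussian noise channel. The telephone-line figures below are illustrative values, not drawn from Shannon's paper.

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Maximum error-free bit rate (bits/s) over an AWGN channel,
    per the Shannon-Hartley theorem: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz channel with 30 dB SNR (a linear SNR of 1000)
capacity = shannon_capacity(3000, 1000)
print(f"{capacity:.0f} bits per second")  # just under 30,000 bits/s
```

No code can exceed this rate without errors, yet nothing in the theorem says how to reach it; that gap is what drove decades of coding research.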
Information Theory was not solely the product of Claude Shannon's work. It was also the result of crucial contributions from many individuals, drawn from a variety of backgrounds, who took his ideas and expanded upon them. Indeed, the diversity of their perspectives and interests shaped the direction of Information Theory.