Molly Jones, Shannon’s Entropy

This piece is dedicated to Claude Shannon, the father of information theory. Shannon's entropy (not the same as physical entropy) measures the minimum number of bits needed to store or transmit the information present in a set of data. His idea that information can be quantified drew us into the information age. Shannon's Entropy uses analog information storage and calculation devices to make sounds. The typewriter (1808, Italy), the slide rule (1622, England), and the abacus (so old nobody knows, China/Babylonia/India/Egypt/Mesopotamia) feature, as does paper (~100 BC, China).
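For the curious, Shannon's quantity is H = −Σ p·log₂(p), averaged over the symbols in the data. A minimal Python sketch (the function name here is ours, not Shannon's) shows how it yields bits per symbol:

```python
from collections import Counter
from math import log2

def shannon_entropy(data: str) -> float:
    """Average number of bits needed per symbol of `data`."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Four equally likely symbols need exactly 2 bits each:
print(shannon_entropy("abcd"))  # 2.0
# A single repeated symbol carries no information:
print(shannon_entropy("aaaa"))  # 0.0
```

More varied, less predictable data has higher entropy and so needs more bits to encode.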