Algorithmic Information Theory
Information Theory Accomplishment | 1966
IBM researcher: Gregory Chaitin
Where the work was done: IBM T.J. Watson Research Center
What we accomplished: Gregory Chaitin (pictured) made significant contributions to algorithmic information theory, which offers a third meaning of entropy, complementary to the statistical entropy of Shannon and the thermodynamic entropy of Clausius and Kelvin. Alone among the three approaches, algorithmic information theory allows one to speak meaningfully about the randomness of individual finite objects, such as bit strings: the information content of a bit string x is defined as the size in bits of the smallest program that causes a standard universal computer U to produce exactly x as output, and an algorithmically random string is one that is incompressible, i.e. one that cannot be computed by a U-program smaller than itself.

While still in high school, Chaitin discovered some of the basic principles of algorithmic information theory, independently of the theory's other founders, Kolmogorov and Solomonoff. After coming to IBM Research he continued working in the field, publishing two important papers in 1975. One introduced and proved the properties of the uncomputable irrational number now called Chaitin's Omega, defined as the probability that the universal computer halts when run on a random input. The other used algorithmic information to produce an easy form of Gödel's incompleteness theorem, showing that the statement "x is incompressible" is true but unprovable for most strings x.
Image credit: New Scientist