Information Theory
Information theory is a branch of applied mathematics, electrical engineering, and computer science concerned with the quantification, storage, and communication of information. Computers are information-manipulation engines, and the underlying theory of information is critical to many of our projects. Our discoveries have practical uses, from JPEG image encoding to data compression, with foundations that define what meaning there is in data, how to distinguish signal from noise in massive datasets, how data can be stored and transmitted efficiently, and what it means for data to be unknowable and random. One of these techniques, LZW compression, is sketched below the milestone list.
- 1966: Algorithmic Information Theory
- 1976: Trellis Modulation
- 1978: Minimum Description Length
- 1979: Arithmetic Coding
- 1983: LZW Compression
- 19??: Constrained Codes
- 1992: JPEG
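To make one of the milestones above concrete, here is a minimal, illustrative Python sketch of the LZW idea, not IBM's original implementation: the encoder builds a dictionary of substrings as it scans the input and replaces repeated substrings with short integer codes, which is one way data can be stored and transmitted more efficiently. The function name `lzw_compress` and the sample string are ours, chosen only for illustration.

```python
def lzw_compress(data: bytes) -> list[int]:
    """Return a list of dictionary codes for `data` (illustrative LZW sketch)."""
    # Start with every possible single byte already in the dictionary.
    dictionary = {bytes([i]): i for i in range(256)}
    next_code = 256
    current = b""
    codes = []
    for byte in data:
        candidate = current + bytes([byte])
        if candidate in dictionary:
            current = candidate                  # keep extending the current match
        else:
            codes.append(dictionary[current])    # emit the longest known match
            dictionary[candidate] = next_code    # learn the new, longer substring
            next_code += 1
            current = bytes([byte])
    if current:
        codes.append(dictionary[current])        # flush the final match
    return codes

if __name__ == "__main__":
    sample = b"TOBEORNOTTOBEORTOBEORNOT"
    print(lzw_compress(sample))
```

Running the sketch on the sample string shows the repeated "TOBEOR" pattern collapsing into a handful of dictionary codes after its first occurrence, which is where the compression comes from.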