Information Theory       


Information theory is a branch of applied mathematics, electrical engineering, and computer science concerned with the quantification, storage, and communication of information. Computers are information-manipulation engines, and the underlying theory of information is critical to many of our projects. Our discoveries have practical uses, from JPEG image encoding to data compression, with foundations that define what meaning there is in data, how to distinguish signal from noise in massive data sets, how data can be stored and transmitted efficiently, and what it means for data to be unknowable and random.


  1. 1966:  Algorithmic Information Theory
  2. 1976:  Trellis Modulation
  3. 1978:  Minimum Description Length
  4. 1979:  Arithmetic Coding
  5. 1983:  LZW Compression
  6. 19??:  Constrained codes
  7. 1992:  JPEG
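To make one of the listed accomplishments concrete, here is a minimal sketch of the LZW compression step (item 5 above). This is an illustrative textbook implementation, not IBM's original code: it builds a dictionary of previously seen strings on the fly and emits integer codes in place of repeated substrings.

```python
def lzw_compress(data: str) -> list[int]:
    """Compress a string into a list of integer codes using LZW."""
    # Start with a dictionary of all single-byte strings (codes 0-255).
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    w = ""          # longest prefix seen so far that is in the dictionary
    out = []
    for c in data:
        wc = w + c
        if wc in dictionary:
            w = wc  # extend the current match
        else:
            out.append(dictionary[w])      # emit code for the known prefix
            dictionary[wc] = next_code     # register the new string
            next_code += 1
            w = c
    if w:
        out.append(dictionary[w])          # flush the final match
    return out
```

On repetitive input such as `"TOBEORNOTTOBEORTOBEORNOT"`, the 24 input characters compress to 16 codes, with codes 256 and above standing for multi-character strings discovered during the scan. A matching decompressor rebuilds the same dictionary from the code stream, so no dictionary needs to be transmitted.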

