Information Theory provides a mathematical model for quantifying information. Since its formal introduction in the late 1940s, Information Theory has had a profound impact on a great many scientific disciplines. In this graduate course, we will cover the basic results of Information Theory, including entropy, relative entropy, mutual information, and the asymptotic equipartition property. We will then present a wide variety of applications of Information Theory to computer science and other computational disciplines, including compression, coding, machine learning, information retrieval, statistics, computational linguistics, computational biology, wired and wireless networks, and image and speech processing. The course will be self-contained; no prior knowledge of Information Theory will be required or assumed.
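The three central quantities named above are straightforward to compute for small discrete distributions. As a minimal sketch (the function names and examples here are illustrative, not course material), each is a sum over the probability table:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """Mutual information I(X;Y) = D(p(x,y) || p(x)p(y)) from a joint table,
    given as a list of rows (x values) of probabilities over y values."""
    px = [sum(row) for row in joint]                # marginal of X
    py = [sum(col) for col in zip(*joint)]          # marginal of Y
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))                               # 1.0
# Independent variables have zero mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

Note that mutual information is just the relative entropy between the joint distribution and the product of its marginals, which is why the two functions share the same form.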
This document, and all documents on this website, may be modified from time to time; be sure to reload documents occasionally and check the "last modified" date against any printed version you may have.