Information theory studies the fundamental limits of the representation and transmission of information. This course provides an introduction to information theory, covering fundamental concepts such as probability, information, and entropy, and examining their applications in data compression, coding, communication, pattern recognition, and probabilistic inference.
This is a public repository for an information theory course that has run at the ANU as a second-year undergraduate course each second semester since 2009. Since 2014 it has also been double-badged as a Masters-level course containing extra material and assessment.
I developed the material for this course with Edwin Bonilla (2009–2012) and Aditya Menon (2013–). Marcus Hutter contributed guest lectures on Kolmogorov complexity and algorithmic information theory (2009–).
The material in this course is largely based on David MacKay’s excellent book Information Theory, Inference, and Learning Algorithms, which is freely available to download.
As secondary texts, we use Cover & Thomas’s Elements of Information Theory and Chris Bishop’s Pattern Recognition and Machine Learning.