Introductory Information Theory


The module is available as a YouTube playlist and as a short course with quizzes on Learning Hub (syllabus shown below).

Overview

This short course introduces Shannon's methods for measuring the capacity and reliability of communication channels, covering topics including coding, entropy, and multiple channels.

Recommended for those who have some background in probability theory or statistics.

Estimated time required: 2 hours per week.

Author

Mark Daniel Ward
Associate Professor
Statistics
Purdue University

Syllabus/Suggested Schedule

Coding
Entropy with Variance
Huffman Coding
Lempel-Ziv Coding
Shannon's First Theorem
Kraft Inequality
Channels
Conditional & Joint Entropy Part 1
Conditional & Joint Entropy Part 2
Mutual Information
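As a small taste of the topics listed above, the following sketch (an illustration, not part of the course materials) computes the Shannon entropy of a distribution in bits and checks the Kraft inequality, which characterizes the codeword lengths achievable by a prefix-free binary code:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kraft_sum(lengths):
    """A prefix-free binary code with these codeword lengths
    exists if and only if sum(2^-l) <= 1 (Kraft inequality)."""
    return sum(2.0 ** -length for length in lengths)

# A fair coin carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))          # → 1.0
# A biased coin carries less.
print(entropy([0.9, 0.1]))
# Codeword lengths {1, 2, 2} satisfy the Kraft inequality.
print(kraft_sum([1, 2, 2]) <= 1.0)  # → True
```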