ECE 6520 Information Theory (Spring 2021)
Introduction to fundamental information-theoretic quantities, their relations, and basic inequalities
Syllabus
Introduction
What is information?
How do we measure information?
Examples of entropy, self-information, and properties of entropy
Joint self-information and joint entropy
Conditional self-information, conditional entropy, and the chain rule
Mutual information
Relative entropy
More on the chain rule and mutual information
Four important inequalities
Jensen's inequality
Log sum inequality
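As a concrete anchor for the definitions in this unit, here is a small numerical sketch (the joint distribution is an illustrative example, not taken from the course) checking the chain rule H(X, Y) = H(X) + H(Y | X) and the non-negativity of mutual information:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Joint distribution p(x, y) on a 2x2 alphabet (hypothetical example values)
pxy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

# Marginals by summing out the other coordinate
px = [sum(v for (x, _), v in pxy.items() if x == a) for a in (0, 1)]
py = [sum(v for (_, y), v in pxy.items() if y == b) for b in (0, 1)]

h_x = entropy(px)
h_y = entropy(py)
h_xy = entropy(list(pxy.values()))

# Chain rule: H(Y | X) = H(X, Y) - H(X)
h_y_given_x = h_xy - h_x
# Mutual information: I(X; Y) = H(X) + H(Y) - H(X, Y) >= 0
i_xy = h_x + h_y - h_xy

print(h_x, h_y, h_xy, i_xy)
```

With these probabilities every term is a dyadic power, so H(X, Y) = 1.75 bits exactly.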
Homework 1:
Chapter 2: 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.9, 2.12, 2.14, 2.15, 2.16, 2.19, 2.20, 2.25, 2.29
Data Processing Inequality
Fano's inequality
Information Source and Asymptotic Equipartition Property (AEP)
Typical sequences and typical sets
Asymptotic Equipartition Property (AEP)
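The AEP can be seen empirically: for a long i.i.d. sequence, the per-symbol log-probability concentrates around the entropy. A minimal simulation sketch (the parameters p and n are illustrative):

```python
import math
import random

random.seed(0)            # fixed seed for reproducibility
p, n = 0.3, 100_000       # Bernoulli parameter and sequence length (assumed values)

# Entropy of a Bernoulli(p) source in bits
h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# One long i.i.d. draw; the AEP says -(1/n) log2 p(x^n) -> H(p)
x = [1 if random.random() < p else 0 for _ in range(n)]
logp = sum(math.log2(p) if xi else math.log2(1 - p) for xi in x)

print(-logp / n, h)       # the two numbers should be close
```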
Data Compression (Source Coding)
Entropy rate
Sources with memory
Markov Chains
Entropy Rate
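For a stationary Markov chain the entropy rate has the closed form H = Σ_i μ_i H(P_i·), the stationary-weighted entropy of the transition rows. A small sketch with a hypothetical two-state chain:

```python
import math

# Two-state Markov chain transition matrix (illustrative values)
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Stationary distribution mu solves mu = mu P; for a 2-state chain
# mu_0 = P[1][0] / (P[0][1] + P[1][0])
mu0 = P[1][0] / (P[0][1] + P[1][0])
mu = [mu0, 1 - mu0]

def row_entropy(row):
    return -sum(p * math.log2(p) for p in row if p > 0)

# Entropy rate of a stationary Markov chain: H = sum_i mu_i * H(P[i])
rate = sum(m * row_entropy(row) for m, row in zip(mu, P))
print(mu, rate)
```

The rate is below the marginal entropy H(μ), reflecting the memory in the source.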
Data Compression
Sources with memory
Hidden Markov Processes
AEP for stationary ergodic processes
Lossless source code
Introduction
Fixed-to-variable length code
Uniquely decodable and prefix codes
Homework 1
Uniquely decodable and prefix codes
Kraft inequality
Optimal codes
Bounds on the optimal code length
Shannon Codes
Huffman Codes
Optimality of Huffman Codes
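The Huffman construction can be sketched with a binary heap: repeatedly merge the two least-probable subtrees, and each merge adds one bit to every symbol inside them. A minimal illustration (the source pmf is hypothetical), which also checks the Kraft inequality and the bound H ≤ L < H + 1:

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given pmf."""
    # Heap entries: (probability, unique tiebreak id, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # each merge adds one bit to every symbol merged
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.2, 0.2, 0.1, 0.1]   # hypothetical source pmf
L = huffman_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, L))
H = -sum(p * math.log2(p) for p in probs)
kraft = sum(2.0 ** -l for l in L)
print(L, avg_len, H, kraft)
```

A full Huffman code attains Kraft equality (the sum is exactly 1), and its expected length sits within one bit of the entropy.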
Homework 2:
Chapter 3: 3.2, 3.9
Chapter 4: 4.1, 4.3, 4.6, 4.7, 4.9, 4.11, 4.33
Chapter 5: 5.1, 5.4, 5.6, 5.8, 5.12, 5.16, 5.24, 5.25, 5.30, 5.39
Optimality of Huffman Codes
Shannon-Fano-Elias Codes
Revisit Shannon Codes
Competitive Optimality of Shannon Codes
Universal Compression of Binary Sequences
Lempel-Ziv (LZ78) Universal Compression
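The LZ78 parsing rule can be sketched in a few lines: scan the input, grow the current phrase while it matches a dictionary entry, and emit (index of longest previously seen prefix, next new symbol) whenever a new phrase appears. A minimal illustration (test string is arbitrary):

```python
def lz78_parse(s):
    """LZ78 parsing: split s into phrases, each encoded as
    (index of longest previously seen prefix phrase, next symbol)."""
    dictionary = {"": 0}          # phrase -> index; index 0 is the empty phrase
    phrases = []
    w = ""
    for c in s:
        if w + c in dictionary:
            w += c                # extend the current match
        else:
            phrases.append((dictionary[w], c))
            dictionary[w + c] = len(dictionary)
            w = ""
    if w:                         # flush a final, already-seen phrase
        phrases.append((dictionary[w[:-1]], w[-1]))
    return phrases

print(lz78_parse("ABBABBABBBAABABAA"))
```

The number of parsed phrases, normalized by the sequence length, governs the compression rate and approaches the entropy rate for stationary ergodic sources.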
Homework 2
Channel Capacity
Channel Modeling
Channel Capacity
Examples
Examples of Channel Capacity
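Two standard capacity examples have closed forms: the binary symmetric channel, C = 1 − H(p), and the binary erasure channel, C = 1 − ε. A quick numerical sketch (the parameter values are illustrative):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Binary symmetric channel with crossover p: C = 1 - H(p),
    achieved by the uniform input distribution."""
    return 1.0 - h2(p)

def bec_capacity(e):
    """Binary erasure channel with erasure probability e: C = 1 - e."""
    return 1.0 - e

print(bsc_capacity(0.11), bec_capacity(0.5))
```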
An Optimization Theorem
Jointly Typical Sequences (Joint AEP)
Proof of Channel Coding Theorem
Zero-error Codes
Homework 3:
Chapter 7: 7.1, 7.2, 7.3, 7.4, 7.5, 7.7, 7.8, 7.9, 7.11, 7.12, 7.20, 7.35
Feedback Capacity
Source-Channel Separation Theorem
Homework 3
Differential Entropy
Relative Entropy of Continuous Random Variables
Joint and Conditional Differential Entropy
AEP for Continuous Random Variables
Properties
Additive Noise
Capacity of Additive White Gaussian Noise (AWGN) Channel
Capacity of Other Channels
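The AWGN result is a one-line formula, C = ½ log2(1 + SNR) bits per real channel use. A small sketch evaluating it at a few illustrative SNR points:

```python
import math

def awgn_capacity(snr):
    """AWGN channel capacity in bits per (real) channel use:
    C = 1/2 * log2(1 + P/N), with snr = P/N a linear power ratio."""
    return 0.5 * math.log2(1.0 + snr)

# Evaluate at a few SNRs given in dB (illustrative values)
for snr_db in (0, 10, 20):
    snr = 10 ** (snr_db / 10)
    print(snr_db, "dB ->", awgn_capacity(snr), "bits/use")
```

At high SNR, capacity grows by about half a bit per 3 dB of extra power, visible in the printed values.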
Homework 4:
Chapter 8: 8.1, 8.2, 8.4, 8.5, 8.8, 8.9, 8.10
Chapter 9: 9.1, 9.2, 9.3, 9.5, 9.6, 9.7, 9.8
Homework 4
Rate-Distortion Theory
Quantization
Rate-Distortion Problem
Examples
Gaussian Source with Squared-Error Distortion Measure
Proof of Rate-Distortion Theorem (Achievability)
Proof of Rate-Distortion Theorem (Converse)
The Analytical Method for Calculation of the Rate-Distortion Function
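The Gaussian squared-error case has the closed form R(D) = ½ log2(σ²/D) for 0 < D ≤ σ², and R(D) = 0 otherwise. A quick sketch (the σ² and D values are illustrative):

```python
import math

def gaussian_rd(sigma2, D):
    """Rate-distortion function of a N(0, sigma2) source under squared-error
    distortion: R(D) = 1/2 * log2(sigma2 / D) for 0 < D <= sigma2, else 0."""
    if D >= sigma2:
        return 0.0
    return 0.5 * math.log2(sigma2 / D)

# One bit per sample cuts the achievable distortion by a factor of 4
print(gaussian_rd(1.0, 0.25))
```

Inverting the formula gives the distortion-rate function D(R) = σ² 2^(−2R): each extra bit of rate quarters the distortion (6 dB per bit).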
Reference: Chapters 10.4, 10.7
Homework 5:
Chapter 10: 10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.11, 10.12, 10.13, 10.18, 10.19
Arimoto-Blahut Algorithm
Separation Theorem
Parallel Gaussian Sources
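The Arimoto-Blahut iteration for channel capacity alternates between computing the output distribution induced by the current input and reweighting the input toward letters with larger divergence from that output. A minimal sketch for a discrete memoryless channel (the test channel is a BSC with illustrative crossover 0.1; this is a teaching sketch, not the course's reference implementation):

```python
import math

def blahut_arimoto(W, iters=500):
    """Arimoto-Blahut iteration for the capacity (in bits) of a DMC with
    transition matrix W[x][y] = p(y|x). Returns (capacity, optimal input)."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx                      # start from the uniform input
    cap = 0.0
    for _ in range(iters):
        # Output distribution induced by the current input distribution
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # c(x) = 2^{D(W(.|x) || q)}: divergence of each row from the output
        c = []
        for x in range(nx):
            d = sum(W[x][y] * math.log2(W[x][y] / q[y])
                    for y in range(ny) if W[x][y] > 0)
            c.append(2.0 ** d)
        Z = sum(p[x] * c[x] for x in range(nx))
        p = [p[x] * c[x] / Z for x in range(nx)]  # reweight toward better inputs
        cap = math.log2(Z)                        # lower bound; converges to C
    return cap, p

cap, p_opt = blahut_arimoto([[0.9, 0.1], [0.1, 0.9]])   # BSC(0.1)
print(cap, p_opt)
```

For symmetric channels the uniform input is already optimal, so the iteration converges immediately; the algorithm earns its keep on asymmetric channels where no closed form exists.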
Homework 5
Network Information Theory
Slepian-Wolf Coding Theorem
Conditionally typical sequences
Achievability
Converse
