1. Introduction
2. Probability
3. Random Objects
4. Expectation and Averages
5. Second-Order Moments
6. A Menagerie of Processes
Appendix A: Preliminaries
B: Sums and Integrals
C: Common Univariate Distributions
D: Supplementary Reading
1. Information Sources
2. Entropy and Information
3. The Entropy Ergodic Theorem
4. Information Rates I
5. Relative Entropy
6. Information Rates II
7. Relative Entropy Rates
8. Ergodic Theorems for Densities
9. Channels and Codes
10. Distortion
11. Source Coding Theorems
12. Coding for Noisy Channels