Preface  xi

1  Communications, Compression and Fundamental Limits  1
   1.1  Shannon's Three Theorems  3
   1.2  The Information Transmission Theorem or Separation Theorem  6
   1.3  Notes and Additional References  7

2  Entropy and Mutual Information  9
   2.1  Entropy and Mutual Information  9
   2.2  Chain Rules for Entropy and Mutual Information  16
   2.3  Differential Entropy and Mutual Information for Continuous Random Variables  17
   2.4  Relative Entropy and Mutual Information  25
   2.5  Data Processing Inequality  27
   2.6  Notes and Additional References  28
|
|
3  Lossless Source Coding  29
|
   3.1  The Lossless Source Coding Problem  29
   3.2  Definitions, Properties, and The Source Coding Theorem  31
   3.3  Huffman Coding and Code Trees  33
   3.4  Elias Coding and Arithmetic Coding  37
|
|
   3.5  …  40
   3.6  …  43
|
   3.7  The AEP and Data Compression  46
   3.8  Notes and Additional References  48
|
|
4  Channel Capacity  49
|
   4.1  The Definition of Channel Capacity  49
   4.2  Properties of Channel Capacity  52
   4.3  Calculating Capacity for Discrete Memoryless Channels  53
   4.4  The Channel Coding Theorem  60
   4.5  Decoding and Jointly Typical Sequences  62
   4.6  Fano's Inequality and the Converse to the Coding Theorem  64
   4.7  The Additive Gaussian Noise Channel and Capacity  67
   4.8  Converse to the Coding Theorem for Gaussian Channels  70
   4.9  Expressions for Capacity and the Gaussian Channel  71
        4.9.1  Parallel Gaussian Channels [4, 5]  73
        4.9.2  Channels with Colored Gaussian Noise [4, 5]  76
   4.10  Band-Limited Channels  78
   4.11  Notes and Additional References  80
|
5  Rate Distortion Theory and Lossy Source Coding  81
   5.1  The Rate Distortion Function for Discrete Memoryless Sources  81
   5.2  The Rate Distortion Function for Continuous Amplitude Sources  85
   5.3  The Shannon Lower Bound and the Optimum Backward Channel  88
        5.3.1  Binary Symmetric Source  89
|
|
        5.3.2  …  90
|
   5.4  Stationary Gaussian Sources with Memory  92
   5.5  The Rate Distortion Function for a Gaussian Autoregressive Source  93
   5.6  Composite Source Models and Conditional Rate Distortion Functions  95
   5.7  The Rate Distortion Theorem for Independent Gaussian Sources-Revisited  96
   5.8  Applications of R(D) to Scalar Quantization  98
   5.9  Notes and Additional References  102
|
|
A  …  103
|
|
B  …  105
|
   B.1  Inequalities and Laws of Large Numbers  105
        B.1.1  Markov's Inequality  105
        B.1.2  Chebychev's Inequality  106
        B.1.3  Weak Law of Large Numbers  106
        B.1.4  Strong Law of Large Numbers  107
|
|
…  109
Bibliography  111

Author's Biography  115