
Information Theory and Rate Distortion Theory for Communications and Compression [Paperback]

  • Format: Paperback / softback, 127 pages, height x width: 235x191 mm, weight: 251 g
  • Series: Synthesis Lectures on Communications
  • Publication date: 01-Dec-2013
  • Publisher: Morgan & Claypool Publishers
  • ISBN-10: 1598298070
  • ISBN-13: 9781598298079
This book is targeted specifically at problems in communications and compression, providing the fundamental principles and results of information theory and rate distortion theory for these applications and presenting methods that have proved, and will continue to prove, useful in analyzing and designing real systems. The chapters treat entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; however, it is the selection, ordering, and presentation of the topics within these broad categories that is unique to this concise book. While the coverage of some standard topics is shortened or eliminated, the standard but important topics of the chain rules for entropy and mutual information, relative entropy, the data processing inequality, and the Markov chain condition receive a full treatment. Similarly, the lossless source coding techniques presented include the Lempel-Ziv-Welch coding method. The material on rate distortion theory, which explores fundamental limits on lossy source coding, covers the often-neglected Shannon lower bound and the Shannon backward channel condition, rate distortion theory for sources with memory, and the extremely practical topic of rate distortion functions for composite sources.
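
The following is a minimal sketch, not code from the book, illustrating three of the quantities named above: the entropy of a discrete source, the mutual information of a joint distribution, and the Gaussian rate distortion function R(D) = (1/2) log2(sigma^2 / D). The function names and the example numbers (a binary symmetric channel with crossover probability 0.1, a unit-variance Gaussian source) are illustrative assumptions, not the book's notation.

import math

def entropy(p):
    # Shannon entropy H(X) in bits for a pmf given as a sequence of probabilities.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    # I(X;Y) in bits from a joint pmf given as a 2-D list, joint[x][y].
    px = [sum(row) for row in joint]        # marginal pmf of X
    py = [sum(col) for col in zip(*joint)]  # marginal pmf of Y
    return sum(pxy * math.log2(pxy / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, pxy in enumerate(row) if pxy > 0)

def gaussian_rate_distortion(variance, d):
    # R(D) = (1/2) log2(variance / D) bits per sample for 0 < D < variance,
    # and 0 for D >= variance; memoryless Gaussian source, squared-error distortion.
    return 0.5 * math.log2(variance / d) if d < variance else 0.0

# Illustrative numbers: a binary symmetric channel with crossover 0.1,
# driven by a uniform input (which achieves its capacity).
eps = 0.1
joint = [[0.5 * (1 - eps), 0.5 * eps],
         [0.5 * eps, 0.5 * (1 - eps)]]
print(entropy([0.5, 0.5]))                  # 1.0 bit
print(mutual_information(joint))            # ~0.531 bits = capacity of BSC(0.1)
print(gaussian_rate_distortion(1.0, 0.25))  # 1.0 bit per sample

For the uniform-input binary symmetric channel the mutual information equals the channel capacity 1 - H(0.1), roughly 0.531 bits, and the Gaussian R(D) at D = sigma^2/4 gives 1 bit per sample; Chapters 2, 4, and 5 of the book derive these results formally.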

The target audience for the book consists of graduate students at the master's degree level and practicing engineers. The hope is that practicing engineers can work through the book, grasp the key results of information theory and rate distortion theory, and then apply those results to analyze, and perhaps improve, the communications and compression systems with which they are familiar.
Table of Contents

Preface xi
1 Communications, Compression and Fundamental Limits 1(8)
1.1 Shannon's Three Theorems 3(3)
1.2 The Information Transmission Theorem or Separation Theorem 6(1)
1.3 Notes and Additional References 7(2)
2 Entropy and Mutual Information 9(20)
2.1 Entropy and Mutual Information 9(7)
2.2 Chain Rules for Entropy and Mutual Information 16(1)
2.3 Differential Entropy and Mutual Information for Continuous Random Variables 17(8)
2.4 Relative Entropy and Mutual Information 25(2)
2.5 Data Processing Inequality 27(1)
2.6 Notes and Additional References 28(1)
3 Lossless Source Coding 29(20)
3.1 The Lossless Source Coding Problem 29(2)
3.2 Definitions, Properties, and The Source Coding Theorem 31(2)
3.3 Huffman Coding and Code Trees 33(4)
3.4 Elias Coding and Arithmetic Coding 37(3)
3.5 Lempel-Ziv Coding 40(3)
3.6 Kraft Inequality 43(3)
3.7 The AEP and Data Compression 46(2)
3.8 Notes and Additional References 48(1)
4 Channel Capacity 49(32)
4.1 The Definition of Channel Capacity 49(3)
4.2 Properties of Channel Capacity 52(1)
4.3 Calculating Capacity for Discrete Memoryless Channels 53(7)
4.4 The Channel Coding Theorem 60(2)
4.5 Decoding and Jointly Typical Sequences 62(2)
4.6 Fano's Inequality and the Converse to the Coding Theorem 64(3)
4.7 The Additive Gaussian Noise Channel and Capacity 67(3)
4.8 Converse to the Coding Theorem for Gaussian Channels 70(1)
4.9 Expressions for Capacity and the Gaussian Channel 71(7)
4.9.1 Parallel Gaussian Channels [4, 5] 73(3)
4.9.2 Channels with Colored Gaussian Noise [4, 5] 76(2)
4.10 Band-Limited Channels 78(2)
4.11 Notes and Additional References 80(1)
5 Rate Distortion Theory and Lossy Source Coding 81(22)
5.1 The Rate Distortion Function for Discrete Memoryless Sources 81(4)
5.2 The Rate Distortion Function for Continuous Amplitude Sources 85(3)
5.3 The Shannon Lower Bound and the Optimum Backward Channel 88(4)
5.3.1 Binary Symmetric Source 89(1)
5.3.2 Gaussian Source 90(2)
5.4 Stationary Gaussian Sources with Memory 92(1)
5.5 The Rate Distortion Function for a Gaussian Autoregressive Source 93(2)
5.6 Composite Source Models and Conditional Rate Distortion Functions 95(1)
5.7 The Rate Distortion Theorem for Independent Gaussian Sources-Revisited 96(2)
5.8 Applications of R(D) to Scalar Quantization 98(4)
5.9 Notes and Additional References 102(1)
A Useful Inequalities 103(2)
B Laws of Large Numbers 105(4)
B.1 Inequalities and Laws of Large Numbers 105(4)
B.1.1 Markov's Inequality 105(1)
B.1.2 Chebychev's Inequality 106(1)
B.1.3 Weak Law of Large Numbers 106(1)
B.1.4 Strong Law of Large Numbers 107(2)
C Kuhn-Tucker Conditions 109(2)
Bibliography 111(4)
Author's Biography 115