1.1 Lecture on the Weak Capacity of Averaged Channels   3
1.1.3 A Channel Without Strong Capacity   5
1.1.4 The Weak Capacity of an Averaged Discrete Channel   9
1.1.5 The Weak Capacity of an Averaged Semi-continuous Channel   11
1.1.6 Nonstationary Averaged Channels   12
1.1.7 Averages of Channels with Respect to General Probability Distributions   14
1.2 Lecture on Further Results for Averaged Channels Including C(λ)-Capacity, Side Information, Effect of Memory, and a Related, Not Familiar Optimistic Channel   18
1.2.1 Averaged Channels Where Either the Sender or the Receiver Knows the Individual Channel Which Governs the Transmission   19
1.2.2 Another Channel: The Optimistic Channel   34
1.3 Lecture on the Structure of Capacity Functions for Compound Channels   37
1.3.1 Definitions and Introduction of the Capacity Functions C(λ), C(λ, R), and C̄(λ, R)   37
1.3.3 The Structure of C(λ)   43
1.3.4 The Relationship of C(λ, R), C̄(λ, R), and C(λ)   51
1.3.5 Evaluation of C(λ) in Several Examples   55
1.4 Lecture on Algebraic Compositions (Rings and Lattices) of Channels   61
1.4.2 Min Channels Known as Compound Channels CC   65
1.4.3 Compound Channels Treated with Maximal Coding   66
1.4.4 For Comparison: The Method of Random Codes   69
1.4.6 The ΔC with Partial Knowledge of the Sender   73
1.5 Lecture on Discrete Finite State Channels   74
1.5.1 Definitions and Examples   74
1.5.2 Two Performance Criteria: Lower and Upper Maximal Information Rates for the FSC   76
1.5.3 Two Further Instructive Examples of Channels   77
1.5.4 Independently Selected States   78
1.5.5 The Finite State Channel with State Calculable by Both Sender and Receiver   82
1.5.6 Stochastically Varying Channels   83
1.5.7 Random Codes and Weakly Varying Channels   85
1.5.9 Sources with Arbitrarily Varying Letter Probabilities   95
1.5.10 Channels with Varying Transmission Probabilities   98
1.6 Lecture on Gallager's Converse for General Channels Including DFMC and FSC   100
1.6.1 Lower Bounding Error Probabilities of Digits   100
1.6.3 Indecomposable Channels   106
1.6.4 A Lower Bound on the Probability of Decoding Error for the FSC   110
1.7 Lecture on Information and Control: Matching Channels   118
1.7.1 New Concepts and Results   118
1.7.2 Definitions, Known Facts, and Abbreviations   123
1.7.3 The Deterministic Matching Channel and Matching in Products of Bipartite Graphs   125
1.7.4 Main Results on Matching in Products of Bipartite Graphs   128
1.7.5 Matching in Products of Non-identical Bipartite Graphs   132
1.7.6 An Exact Formula for the Matching Number of Powers of "Starred" Bipartite Graphs   134
1.7.7 Two Examples Illustrating the Significance of Theorems 45 and 46   136
1.7.8 Multi-way Deterministic Matching Channels   137
1.7.9 The Controller Falls Asleep: On Matching Zero-Error Detection Codes   145
1.7.10 The Matching Zero-Error Detection Capacity C_mde in a Genuine Example   147
1.7.11 Feedback and also Randomization Increase the Capacity of the Matching Channel   148
1.7.12 The Capacity for Matching Zero-Error Detection Codes with Feedback (MDCF) for W_0   149
1.7.13 Identification for Matching Channels   151
1.7.14 Zero-Error Detection   156
1.7.15 A Digression: Three Further Code Concepts   160

2 Algorithms for Computing Channel Capacities and Rate-Distortion Functions   169
2.1 Lecture on Arimoto's Algorithm for Computing the Capacity of a DMC   169
2.1.1 Mutual Information and Equivocation   170
2.1.2 The Algorithm and Its Convergence   171
2.1.3 The Convergence of the Algorithm   173
2.1.4 Speed of Convergence   175
2.1.5 Upper and Lower Bounds on the Capacity   177
2.2 Lecture on Blahut's Algorithm for Computing Rate-Distortion Functions   178
2.2.1 Basic Definitions and the Algorithm   178
2.2.2 Convergence of the Algorithm   180
2.3 Lecture on a Unified Treatment of the Computation of the Δ-Distortion of a DMS and the Capacity of a DMC Under Input Cost Constraint   182
2.3.2 The Computation of G(s)   183
2.3.3 Capacity Computing Algorithm   185

3 Shannon's Model for Continuous Transmission   189
3.1 Lecture on Historical Remarks   189
3.2 Lecture on Fundamental Theorems of Information Theory   193
3.2.2 Methods to Construct Sources   196
3.2.3 The Ergodic Theorem in Hilbert Space   197
3.2.4 The Theorem of McMillan   201
3.3 Lecture on Stationary Channels   211
3.3.1 A Concept of Channels   211
3.3.2 Methods for the Construction of Channels   214
3.4 Lecture on "Informational Capacity" and "Error Capacity" for Ergodic Channels   217
3.4.1 Definition of the "Informational Capacity"   217
3.5 Lecture on Another Definition of Capacity   222
3.6 Lecture on Still Another Type of Operational Capacity: Risk as Performance Criterion   229
3.6.2 Nedoma's Fundamental Paper Has Three Basic Ideas   230
3.6.3 Report on the Results   230
3.7 Lecture on the Discrete Finite-Memory Channel According to Feinstein and Wolfowitz   231
3.8 Lecture on the Extension of the Shannon-McMillan Theorem by Jacobs   235
3.9 Lecture on Achieving Channel Capacity in DFMC   237
3.9.2 A Topology on the Linear Space L = LP(Ω, B) of σ-Additive Finite Real-Valued Functions h: B → R   237
3.9.3 R(P) Is an Upper Semi-continuous Functional (u.s.c.) for DFMC   238
3.10 Lecture on the Structure of the Entropy Rate of a Discrete, Stationary Source with Finite Alphabet (DSS)   239
3.11 Lecture on the Transmission of Bernoulli Sources Over Stationary Channels   243
3.11.3 The Source Channel Hook-up   244
3.12 Lecture on Block Coding for Weakly Continuous Channels   248
3.12.4 Every Channel Is Almost Weakly Continuous   252
3.13 Lecture on Zero-Error Transmission Over a DMC Requires Infinite Expected Coding Time   252
3.13.2 A Class of Practical Codes   252
3.13.3 Zero-Error Transmission of a Source Over a Channel   253
3.13.4 Structure of Codes in Zero-Error Hook-up   254
3.13.5 An Application to the Theory of Isomorphism of Stationary Processes   255
3.14 Lecture on Zero-Error Stationary Coding Over Stationary Channels   257
3.14.1 Capacities of Gray, Ornstein, and Dobrushin and of Kieffer: Introduction   257
3.14.3 Synchronization Words   259
3.15 Lecture on Blackwell's Infinite Codes for the DMC   261
3.15.1 Introduction and Result   261
3.15.2 A More Mathematical Formulation   261
3.15.3 A Technical Auxiliary Result   262

4.1 Lecture on Sliding-Block Joint Source/Noisy Channel Coding Theorems   269
4.1.2 Sliding-Block Coding for a DMC   270
4.1.3 Sketch of the Proof of the Direct Part: The Basic Ideas   272
4.2 Lecture on Block Synchronization Coding and Applications   274
4.2.2 Sources, Channels, and Codes   277
4.2.3 Statement and Discussion of Results   279
4.3 Lecture on Sliding-Block Coding   287
4.3.3 Some Remarks on Periodic Sources   291
4.3.4 Sliding-Block Coding for Weakly Continuous Channels   294

5 On λ-Capacities and Information Stability   299
5.1 Lecture on a Report on the Work of Parthasarathy and of Kieffer on λ-Capacities   299
5.1.1 Historical Remarks and Preliminaries   299
5.2 Asymptotic Properties of the Function N(n, λ)   301
5.3 Channels with Additive Noise   301
5.4 A General Formula for the Capacity of Stationary Nonanticipatory Channels   303
5.4.1 The Shannon-McMillan Theorem   305
5.5 Lecture on Information Density or Per Letter Information   311
5.6 Lecture on Information Stability   314
5.7 Lecture on Transmission of Messages over Channels   315

6 Channels with Infinite Alphabets   323
6.1 Lecture on Introduction   323
6.2 Lecture on Basic Concepts and Observations   324
6.2.2 Finite Length Codes and Compactness   325
6.2.3 Convexity and Continuity Properties of N(K, λ)   327
6.3 Infinite λ-Codes for Product Channels   327
6.4 Bounds on N(P, λ) in Terms of Stochastic Inequalities   330
6.5 Lecture on Coding Theorem, Weak and Strong Converses for Stationary Channels   335
6.6 Sharper Estimates for Channels with Moment Conditions   339
6.7 A Necessary and Sufficient Condition for the Strong Converse in the Stationary Case   345
6.8 Lecture on Memoryless Stationary Channels with Constrained Inputs   351
6.8.1 Channels with Additive Noise and Additive Gaussian Noise   353
6.8.2 Varying Channels in the Semi-continuous Case   355
6.8.3 Average Error Versus Maximal Error: Two Theories of Coding   357
6.8.4 The Contraction Channel K   358
6.8.5 An Upper Bound for the Maximal Error Capacity Region of K   359

7 Selected Topics of Information Theory and Mathematical Statistics   365
7.1 Lecture on Hypotheses Testing, Bayes Estimates, and Fisher Information   365
7.1.3 Construction of Almost Surely Reliable Decision Functions   369
7.2 Lecture on Information-Theoretical Approach to Expert Systems   372
7.2.1 Basic Properties of the Relative Entropy   372
7.2.2 Minimizing the Relative Entropy Under Linear Constraints   378
7.2.3 Axiomatic Results on Inference for Inverse Problems   383

8 β-Biased Estimators in Data Compression   393
8.1 Lecture on Introduction   393
8.2 Lecture on Models of Data Compression   395
8.3 Lecture on Estimators   399
8.3.1 Estimators and Data Compression   399
8.3.2 Criteria for Estimators   400
8.4 Lecture on a Suitable Loss Function and the Class of β-Biased Estimators   402
8.4.1 Derivation of the Loss Function   402
8.4.2 Discussion of an Alternative Loss Function   403
8.4.3 Class of β-Biased Estimators   404
8.4.4 Application of β-Biased Estimators to Different Criteria   404
8.5 Lecture on Derivation of β-Biased Estimators   408
8.5.1 Laplace's Estimator   408
8.5.3 Generalization of Bayes' Rule to Continuous Parameters and n Observations   410
8.5.4 Bayesian Estimators   411
8.5.5 Application of a Bayesian Estimator to a Binary Source   412
8.5.6 Admissibility of β-Biased Estimators   413
8.6 Lecture on Approximations for the Redundancy   417
8.7.1 Preview of the Main Theorems   420
8.7.2 Sketch of the Proofs   421
8.8 Lecture on the Mathematical Tools   422
8.8.1 Gilbert's Corollary   436
8.9 Lecture on Main Results   438
8.10 Lecture on Distributions   443
8.10.1 The Poisson Distribution   443
8.10.2 The Beta Distribution   443
8.10.3 The Dirichlet Distribution   444
8.10.4 The Central Moments   446

About the Author   449
Comments   453
Notations   455
Author Index   457
Subject Index   459