
Abstract Methods In Information Theory [Hardback]

(California State Univ, San Bernardino, USA)
  • Format: Hardback, 264 pages
  • Series: Series on Multivariate Analysis 4
  • Publication date: 18-Oct-1999
  • Publisher: World Scientific Publishing Co Pte Ltd
  • ISBN-10: 9810237111
  • ISBN-13: 9789810237110
Information Theory is studied from the following viewpoints: (1) the theory of entropy as amount of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov-Sinai entropy are defined and their basic properties are examined, where the latter entropy is extended to be a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied as well as AMS (asymptotically mean stationary) sources. The main purpose of this book is to present information channels in the environment of real and functional analysis as well as probability theory. Ergodic channels are characterized in various manners. Mixing and AMS channels are also considered in detail with some illustrations. A few other aspects of information channels, including measurability, approximation, and noncommutative extensions, are also discussed.
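The Shannon entropy that the description opens with can be illustrated with a minimal sketch. The function name `shannon_entropy` and the choice of base-2 logarithm (entropy in bits) are ours for illustration, not notation from the book:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(p) = -sum_i p_i log p_i of a finite distribution.

    `probs` is a sequence of nonnegative probabilities summing to 1;
    zero-probability outcomes contribute nothing (0 log 0 := 0).
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly one bit of information,
# while a certain outcome carries none.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
print(shannon_entropy([1.0]))       # → 0.0
```

The Kolmogorov-Sinai entropy treated in the book extends this idea from a single distribution to a measure-preserving dynamical system, as the entropy rate over refining partitions.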
Table of contents:
  • Preface (p. vii)
  • Entropy (p. 1)
      • The Shannon entropy (p. 1)
      • Conditional expectations (p. 11)
      • The Kolmogorov-Sinai entropy (p. 16)
      • Algebraic models (p. 29)
      • Entropy functionals (p. 41)
      • Relative entropy and Kullback-Leibler information (p. 49)
      • Bibliographical notes (p. 63)
  • Information Sources (p. 67)
      • Alphabet message spaces and information sources (p. 67)
      • Ergodic theorems (p. 71)
      • Ergodic and mixing properties (p. 75)
      • AMS sources (p. 91)
      • Shannon-McMillan-Breiman theorem (p. 99)
      • Ergodic decompositions (p. 106)
      • Entropy functionals, revisited (p. 110)
      • Bibliographical notes (p. 119)
  • Information Channels (p. 121)
      • Information channels (p. 121)
      • Channel operators (p. 129)
      • Mixing channels (p. 136)
      • Ergodic channels (p. 147)
      • AMS channels (p. 156)
      • Capacity and transmission rate (p. 166)
      • Coding theorems (p. 178)
      • Bibliographical notes (p. 187)
  • Special Topics (p. 189)
      • Channels with a noise source (p. 189)
      • Measurability of a channel (p. 196)
      • Approximation of channels (p. 202)
      • Harmonic analysis for channels (p. 207)
      • Noncommutative channels (p. 214)
      • Bibliographical notes (p. 222)
  • References (p. 225)
  • Indices (p. 239)
      • Notation index (p. 239)
      • Author index (p. 244)
      • Subject index (p. 247)