
E-book: Abstract Methods In Information Theory

(California State Univ, San Bernardino, USA)
  • Extent: 264 pages
  • Series: Series On Multivariate Analysis 4
  • Publication date: 15-Oct-1999
  • Publisher: World Scientific Publishing Co Pte Ltd
  • Language: English
  • ISBN-13: 9789814495417
  • Format: PDF+DRM
  • Price: 24,57 €*
  • * The price is final, i.e. no further discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means you must install special software to read it. You also need to create an Adobe ID (more information here). The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions. (This is a free application made specifically for reading e-books. It should not be confused with Adobe Reader, which is most likely already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

Information Theory is studied from the following viewpoints: (1) the theory of entropy as a measure of the amount of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov-Sinai entropy are defined and their basic properties are examined, where the latter entropy is extended to a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied, as are AMS (asymptotically mean stationary) sources. The main purpose of this book is to present information channels in the setting of real and functional analysis as well as probability theory. Ergodic channels are characterized in various ways. Mixing and AMS channels are also considered in detail, with some illustrations. A few other aspects of information channels, including measurability, approximation, and noncommutative extensions, are also discussed.
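The Shannon entropy named in the description above is the standard measure of the average information content of a discrete source, H(p) = -Σ pᵢ log₂ pᵢ. As a quick illustrative sketch (not taken from the book itself), it can be computed as follows:

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits: H(p) = -sum(p_i * log2(p_i)).

    Terms with p_i == 0 are skipped, using the convention 0 * log 0 = 0.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit per toss; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # → 1.0
print(shannon_entropy([0.9, 0.1]))   # less than 1 bit
```

A uniform distribution over 2ⁿ outcomes yields exactly n bits, the maximum possible, which is why entropy is read as "amount of information" per symbol.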
Contents

Preface

Entropy
  • The Shannon entropy
  • Conditional expectations
  • The Kolmogorov-Sinai entropy
  • Algebraic models
  • Entropy functionals
  • Relative entropy and Kullback-Leibler information
  • Bibliographical notes

Information Sources
  • Alphabet message spaces and information sources
  • Ergodic theorems
  • Ergodic and mixing properties
  • AMS sources
  • Shannon-McMillan-Breiman theorem
  • Ergodic decompositions
  • Entropy functionals, revisited
  • Bibliographical notes

Information Channels
  • Information channels
  • Channel operators
  • Mixing channels
  • Ergodic channels
  • AMS channels
  • Capacity and transmission rate
  • Coding theorems
  • Bibliographical notes

Special Topics
  • Channels with a noise source
  • Measurability of a channel
  • Approximation of channels
  • Harmonic analysis for channels
  • Noncommutative channels
  • Bibliographical notes

References

Indices
  • Notation index
  • Author index
  • Subject index