E-book: Abstract Methods in Information Theory (Second Edition)

Kakihara (California State Univ, San Bernardino, USA)
  • Pages: 412
  • Series: Series on Multivariate Analysis 10
  • Publication date: 09-Jun-2016
  • Publisher: World Scientific Publishing Co Pte Ltd
  • Language: English
  • ISBN-13: 9789814759250
  • Format: EPUB+DRM
  • Price: 52.65 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste): not allowed
  • Printing: not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has issued this e-book in encrypted form, which means you need to install special software to read it. You also need to create an Adobe ID (more information here). The e-book can be read by 1 user and downloaded to up to 6 devices (all authorised with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install the free PocketBook Reader app (iOS / Android).

    To read on a PC or Mac, install Adobe Digital Editions (a free application designed specifically for reading e-books; not to be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

Kakihara presents three of the four parts of information theory in the environment of functional analysis and probability theory. The three parts are the mathematical structure of information sources, the theory of entropy as the amount of information, and the theory of information channels. Adding several new topics for the second edition, he covers entropy, information sources, information channels, channel operators, Gaussian channels, special topics, and quantum or noncommutative information channels. Annotation ©2016 Ringgold, Inc., Portland, OR (protoview.com)

Information theory is studied from the following points of view: (1) the theory of entropy as the amount of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov-Sinai entropy are defined and their basic properties are examined, where the latter entropy is extended to be a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied, as well as AMS (asymptotically mean stationary) sources.

The main purpose of this book is to present information channels in the environment of functional analysis and operator theory as well as probability theory. Ergodic, mixing, and AMS channels are considered in detail, with some illustrations. In this second edition, channel operators, which generalize ordinary channels, are studied in many aspects. Gaussian channels are also considered in detail, together with Gaussian measures on a Hilbert space. The Special Topics chapter deals with features such as generalized capacity, channels with an intermediate noncommutative system, and the von Neumann algebra method for channels.

Finally, quantum (noncommutative) information channels are examined in an independent chapter, which may be regarded as an introduction to quantum information theory. Von Neumann entropy is introduced and its generalization to a C*-algebra setting is given. Basic results on quantum channels and entropy transmission are also considered.
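
For orientation, the central quantities named in this description have standard textbook definitions. The sketch below uses conventional notation and is only indicative; the book's own formulations (for instance, the extension of the Kolmogorov-Sinai entropy to a linear functional on a set of measures) are more general.

```latex
% Shannon entropy of a finite probability distribution p = (p_1, ..., p_n)
\[
  H(p) = -\sum_{k=1}^{n} p_k \log p_k .
\]

% Kolmogorov-Sinai entropy of a measure-preserving transformation T,
% defined as a supremum over finite measurable partitions \mathcal{A}
\[
  h(T) = \sup_{\mathcal{A}} \lim_{n \to \infty} \frac{1}{n}\,
         H\!\Bigl( \bigvee_{k=0}^{n-1} T^{-k}\mathcal{A} \Bigr).
\]

% Von Neumann entropy of a density operator \rho on a Hilbert space
\[
  S(\rho) = -\operatorname{Tr}(\rho \log \rho).
\]
```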

Table of Contents

Preface to the First Edition
Preface to the Second Edition
Chapter I Entropy
  1.1 The Shannon entropy
  1.2 Conditional expectations
  1.3 The Kolmogorov-Sinai entropy
  1.4 Algebraic models
  1.5 Entropy functionals
  1.6 Relative entropy and Kullback-Leibler information
  Bibliographical notes
Chapter II Information Sources
  2.1 Alphabet message spaces and information sources
  2.2 Ergodic theorems
  2.3 Ergodic and mixing properties
  2.4 AMS sources
  2.5 Shannon-McMillan-Breiman theorem
  2.6 Ergodic decompositions
  2.7 Entropy functionals, revisited
  Bibliographical notes
Chapter III Information Channels
  3.1 Information channels
  3.2 Mixing channels
  3.3 Semiergodic channels
  3.4 Ergodic channels
  3.5 AMS channels
  3.6 Capacity and transmission rate
  3.7 Coding theorems
  Bibliographical notes
Chapter IV Channel Operators
  4.1 Channel operators
  4.2 Generalized channels and topological properties
  4.3 Pseudo channel operators: General theory
  4.4 Pseudo channel operators: Topological structure
  Bibliographical notes
Chapter V Gaussian Channels
  5.1 Probability measures on a Hilbert space
  5.2 Gaussian measures: Equivalence and singularity
  5.3 Gaussian channels
  5.4 Additive Gaussian channels
  Bibliographical notes
Chapter VI Special Topics
  6.1 Channels with a noise source
  6.2 Channel capacity, revisited
  6.3 Measurability of channels
  6.4 Approximation of channels
  6.5 Harmonic analysis for channels
  6.6 Channels with a noncommutative intermediate system
  6.7 Von Neumann algebras generated by stochastic processes
  Bibliographical notes
Chapter VII Quantum Channels
  7.1 Quantum entropy
  7.2 Quantum channels
  7.3 Entropy transmission
  Bibliographical notes
References
Glossaries of Axioms
Indices
  Notation Index
  Author Index
  Subject Index