
E-book: Information-Theoretic Aspects of Neural Networks

(Florida Atlantic University, Boca Raton, Florida, USA)
  • Length: 416 pages
  • Publication date: 23-Sep-2020
  • Publisher: CRC Press Inc
  • Language: English
  • ISBN-13: 9781000141252
  • Format: EPUB+DRM
  • Price: 227.50 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You must also create an Adobe ID. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install the free app PocketBook Reader (iOS / Android).

    To read on a PC or Mac, install Adobe Digital Editions (a free application designed specifically for reading e-books; not to be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

Information theory vis-à-vis neural networks embodies parametric entities and conceptual bases pertinent to memory and information storage, information-theoretic cost functions, and neurocybernetics and self-organization. Existing studies cover the entropy and/or cybernetic aspects of neural information only sparsely.
Information-Theoretic Aspects of Neural Networks cohesively explores this burgeoning discipline, covering topics such as:
  • Shannon information and information dynamics
  • neural complexity as an information processing system
  • memory and information storage in the interconnected neural web
  • extremum (maximum and minimum) information entropy
  • neural network training
  • non-conventional, statistical distance-measures for neural network optimizations
  • symmetric and asymmetric characteristics of information-theoretic error-metrics
  • algorithmic complexity based representation of neural information-theoretic parameters
  • genetic algorithms versus neural information
  • dynamics of neurocybernetics viewed in the information-theoretic plane
  • nonlinear, information-theoretic transfer function of the neural cellular units
  • statistical mechanics, neural networks, and information theory
  • semiotic framework of neural information processing and neural information flow
  • fuzzy information and neural networks
  • neural dynamics conceived through fuzzy information parameters
  • neural information flow dynamics
  • informatics of neural stochastic resonance
Information-Theoretic Aspects of Neural Networks is an exceptional resource for engineers, scientists, and computer scientists working in the field of artificial neural networks, as well as for biologists applying the concepts of communication theory and protocols to the functioning of the brain. The book explores new avenues in the field and creates a common platform for analyzing the neural complex as well as artificial neural networks.
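As a loose illustration of two of the topics listed above, Shannon information and the symmetric/asymmetric distinction among information-theoretic error metrics, the following sketch is not taken from the book: the function names and the choice of the symmetrized KL (J-divergence) as the symmetric measure are this example's own assumptions about how such a cost between a target and a network output distribution might look.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) in bits.
    Asymmetric: D(p || q) != D(q || p) in general."""
    return sum(pi * math.log2((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def symmetrized_kl(p, q):
    """J-divergence J(p, q) = D(p || q) + D(q || p).
    A simple symmetrized error measure: J(p, q) == J(q, p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

# Hypothetical target and network-output distributions over three classes.
target = [0.7, 0.2, 0.1]
output = [0.5, 0.3, 0.2]

print(shannon_entropy(target))           # information content of the target
print(kl_divergence(target, output))     # asymmetric cost
print(symmetrized_kl(target, output))    # symmetric cost
```

Used as a training cost, the asymmetric measure penalizes the two directions of mismatch differently, whereas the symmetrized form treats target and output interchangeably; the book's chapters on Csiszar's error measures develop this distinction in far greater generality.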
  • Introduction
    Neuroinformatics
    Information-Theoretic Framework of Neural Networks
    Entropy, Thermodynamics and Information Theory
    Information-Theoretics and Neural Network Training
    Dynamics of Neural Learning in the Information-Theoretic Plane
    Neural Nonlinear Activity in the Information-Theoretic Plane
    Degree of Neural Complexity and Maximum Entropy
    Concluding Remarks
    Bibliography
    Appendix 1.1
    Appendix 1.2
    Appendix 1.3
  • Neural Complex: A Nonlinear C3I System
    Introduction
    Neural Information Processing: C3I Protocols
    Nonlinear Neural Activity
    Bernoulli-Riccati Equations
    Nonlinear Neural Activity: Practical Considerations
    Entropy/Information Across a Neural Nonlinear Process
    Nonsigmoidal Activation Functions
    Definitions and Lemmas on Certain Classes of Nonlinear Functions
    Concluding Remarks
    Bibliography
    Appendix 2.1
    Appendix 2.2
  • Nonlinear and Informatic Aspects of Fuzzy Neural Activity
    Introduction
    What is Fuzzy Activity?
    Crisp Sets versus Fuzzy Sets
    Membership Attributes to a Fuzzy Set
    Fuzzy Neural Activity
    Fuzzy Differential Equations
    Membership Attributions to Fuzzy Sets via Activation Function
    Neural Architecture with a Fuzzy Sigmoid
    Fuzzy Considerations, Uncertainty and Information
    Information-Theoretics of Crisp and Fuzzy Sets
    Fuzzy Neuroinformatics
    Concluding Remarks
  • Csiszar's Information-Theoretic Error-Measures for Neural Network Optimizations
    Introduction
    Disorganization and Entropy Considerations in Neural Networks
    Information-Theoretic Error Measures
    Neural Nonlinear Response versus Optimization Algorithms
    A Multilayer Perceptron Training with Information-Theoretic Cost-Functions
    Results on Neural Network Training with Csiszar's Error-Measures
    Generalized Csiszar's Symmetrized Error-Measures
    Concluding Remarks
    Bibliography
  • Dynamics of Neural Learning in the Information-Theoretic Plane
    Introduction
    Stochastical Neural Dynamics
    Stochastical Dynamics of the Error-Measures (ε)
    Random-Walk Paradigm of ε(t) Dynamics
    Evolution of ε(t): Representation via the Fokker-Planck Equation
    Logistic Growth Model of ε(t)
    Convergence Considerations
    Further Considerations on the Dynamics of ε(t)
    Dynamics of Fuzzy Uncertainty
    Concluding Remarks
    Bibliography
  • Informatic Perspectives of Complexity Measure in Neural Networks
    Introduction
    Neural Complexity
    Complexity Measure
    Neural Networks: Simple and Complex
    Neural Complexity versus Neural Entropy
    Neural Network Training via Complexity Parameter
    Calculation of μTi and σTi
    Perceptron Training: Simulated Results
    Concluding Remarks
    Bibliography
    Appendix 6.1
  • Information-Theoretic Aspects of Neural Stochastic Resonance
    Introduction
    Inter-Event Histograms (IIH) and Stochastic Resonance
    A Neural Network under SR-Based Learning
    Simulation Results
    Concluding Remarks
    Bibliography
  • Neural Informatics and Genetic Algorithms
    Entropy, Thermobiodynamics and Bioinformatics
    Genetic Code
    Search Algorithms
    Simple Genetic Algorithm (SGA)
    Genetic Algorithms and Neural Networks
    Information-Theoretics of Genetic Selection Algorithm
    A Test ANN Architecture Deploying GA and IT Concepts
    Description of the Algorithm
    Experimental Simulations
    Concluding Remarks
    Bibliography
  • Subject Index
    P. S. Neelakanta