E-book: Correlative Learning: A Basis for Brain and Adaptive Systems

By Zhe Chen (Massachusetts General Hospital/Harvard Medical School), Simon Haykin (McMaster University), Jos J. Eggermont, and Suzanna Becker (McMaster University)
  • Format: PDF+DRM
  • Price: 192.60 €*
  • * The price is final, i.e., no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste): not allowed

  • Printing: not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You also need to create an Adobe ID. More information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on mobile devices (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions (a free application designed specifically for reading e-books; it should not be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

Developing brain-style signal processing or machine learning algorithms has been the Holy Grail of artificial intelligence, notes Chen (Neuroscience Statistics Research Laboratory, Harvard Medical School; RIKEN Brain Science Institute, Tokyo, Japan) in introducing eight chapters by international multidisciplinary researchers. In examining generalized concepts of correlation underlying both neural networks and complex engineering systems, they discuss the key role of correlation in normal and abnormal neural functions; correlation-based mathematical learning paradigms such as ALOPEX (Algorithm Of Pattern EXtraction); and case studies (e.g., of online training of artificial neural networks). Appendices include further details on correlation, stochastic learning rules, methods for estimating various functions and parameters, and a primer on linear algebra. The bibliography lists 1,002 references. Annotation ©2008 Book News, Inc., Portland, OR (booknews.com)
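
For readers new to the area, a minimal sketch of the plain Hebbian rule, the simplest correlative update alluded to above, may help fix ideas. The sketch below is in Python/NumPy and is not taken from the book; the linear neuron model, learning rate, and iteration count are illustrative assumptions.

    import numpy as np

    def hebbian_update(W, x, eta=0.01):
        """One Hebbian step: change each weight in proportion to the
        correlation (product) of presynaptic input x_j and postsynaptic
        output y_i."""
        y = W @ x                        # linear postsynaptic responses
        return W + eta * np.outer(y, x)  # dW_ij grows with y_i * x_j

    # Toy usage: repeatedly presenting one input pattern strengthens the
    # weights aligned with it. Growth is unbounded here; the book covers
    # normalized variants such as the local PCA (Oja-type) learning rule.
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(3, 5))
    x = rng.normal(size=5)
    for _ in range(20):
        W = hebbian_update(W, x)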

Correlative Learning: A Basis for Brain and Adaptive Systems provides a bridge between three disciplines: computational neuroscience, neural networks, and signal processing. First, the authors lay down the preliminary neuroscience background for engineers. The book then presents an overview of the role of correlation in the human brain as well as in the adaptive signal processing world; unifies many well-established synaptic adaptation (learning) rules within the correlation-based learning framework, focusing on a particular correlative learning paradigm, ALOPEX; and presents case studies that illustrate how to use different computational tools and ALOPEX to help readers understand certain brain functions or fit specific engineering applications.
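
To give a flavor of how a correlation-based stochastic rule of this kind operates, here is a hedged Python/NumPy sketch of a correlation-guided update in the spirit of the basic ALOPEX rule. The fixed temperature, fixed step size, and omitted annealing schedule are simplifying assumptions; this is not the book's exact formulation.

    import numpy as np

    def alopex_minimize(cost, w0, delta=0.01, T=0.05, n_iters=5000, seed=0):
        """Correlation-guided stochastic minimization in the spirit of ALOPEX.

        Every parameter moves by +/- delta each iteration; the sign is biased
        by the correlation between its previous change and the previous change
        in the cost (simplified: fixed temperature T, no annealing)."""
        rng = np.random.default_rng(seed)
        w = np.asarray(w0, dtype=float).copy()
        prev_E = cost(w)
        prev_dw = rng.choice([-delta, delta], size=w.shape)  # random first step
        w = w + prev_dw
        for _ in range(n_iters):
            E = cost(w)
            C = prev_dw * (E - prev_E)         # per-parameter correlation term
            p = 1.0 / (1.0 + np.exp(-C / T))   # prob. of a -delta step (biased downhill)
            dw = np.where(rng.random(w.shape) < p, -delta, delta)
            w, prev_dw, prev_E = w + dw, dw, E
        return w

    # Toy usage: minimize a quadratic bowl; the estimate should settle near (1, -2).
    target = np.array([1.0, -2.0])
    w_hat = alopex_minimize(lambda w: float(np.sum((w - target) ** 2)), w0=np.zeros(2))
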
Foreword xiii
Preface xv
Acknowledgments xxiii
Acronyms xxv
Introduction 1(7)
The Correlative Brain 8(64)
Background 8(11)
Spiking Neurons 8(6)
Neocortex 14(2)
Receptive Fields 16(2)
Thalamus 18(1)
Hippocampus 18(1)
Correlation Detection in Single Neurons 19(6)
Correlation in Ensembles of Neurons: Synchrony and Population Coding 25(6)
Correlation is the Basis of Novelty Detection and Learning 31(7)
Correlation in Sensory Systems: Coding, Perception, and Development 38(9)
Correlation in Memory Systems 47(5)
Correlation in Sensorimotor Learning 52(5)
Correlation, Feature Binding, and Attention 57(2)
Correlation and Cortical Map Changes after Peripheral Lesions and Brain Stimulation 59(8)
Discussion 67(5)
Correlation in Signal Processing 72(57)
Correlation and Spectrum Analysis 73(18)
Stationary Process 73(6)
Nonstationary Process 79(2)
Locally Stationary Process 81(2)
Cyclostationary Process 83(1)
Hilbert Spectrum Analysis 83(2)
Higher Order Correlation-Based Bispectra Analysis 85(2)
Higher Order Functions of Time, Frequency, Lag, and Doppler 87(2)
Spectrum Analysis of Random Point Process 89(2)
Wiener Filter 91(4)
Least-Mean-Square Filter 95(4)
Recursive Least-Squares Filter 99(1)
Matched Filter 100(2)
Higher Order Correlation-Based Filtering 102(2)
Correlation Detector 104(4)
Coherent Detection 104(2)
Correlation Filter for Spatial Target Detection 106(2)
Correlation Method for Time-Delay Estimation 108(2)
Correlation-Based Statistical Analysis 110(12)
Principal-Component Analysis 110(2)
Factor Analysis 112(1)
Canonical Correlation Analysis 113(5)
Fisher Linear Discriminant Analysis 118(1)
Common Spatial Pattern Analysis 119(3)
Discussion 122(7)
Appendix 2A: Eigenanalysis of Autocorrelation Function of Nonstationary Process 122(1)
Appendix 2B: Estimation of Intensity and Correlation Functions of Stationary Random Point Process 123(2)
Appendix 2C: Derivation of Learning Rules with Quasi-Newton Method 125(4)
Correlation-Based Neural Learning and Machine Learning 129(89)
Correlation as a Mathematical Basis for Learning 130(28)
Hebbian and Anti-Hebbian Rules (Revisited) 130(1)
Covariance Rule 131(1)
Grossberg's Gated Steepest Descent 132(1)
Competitive Learning Rule 133(2)
BCM Learning Rule 135(1)
Local PCA Learning Rule 136(4)
Generalizations of PCA Learning 140(4)
CCA Learning Rule 144(1)
Wake-Sleep Learning Rule for Factor Analysis 145(1)
Boltzmann Learning Rule 146(1)
Perceptron Rule and Error-Correcting Learning Rule 147(2)
Differential Hebbian Rule and Temporal Hebbian Learning 149(3)
Temporal Difference and Reinforcement Learning 152(4)
General Correlative Learning and Potential Function 156(2)
Information-Theoretic Learning 158(24)
Mutual Information versus Correlation 159(1)
Barlow's Postulate 159(1)
Hebbian Learning and Maximum Entropy 160(3)
Imax Algorithm 163(1)
Local Decorrelative Learning 164(3)
Blind Source Separation 167(2)
Independent-Component Analysis 169(5)
Slow Feature Analysis 174(2)
Energy-Efficient Hebbian Learning 176(2)
Discussion 178(4)
Correlation-Based Computational Neural Models 182(36)
Correlation Matrix Memory 182(2)
Hopfield Network 184(3)
Brain-State-in-a-Box Model 187(1)
Autoencoder Network 187(3)
Novelty Filter 190(1)
Neuronal Synchrony and Binding 191(2)
Oscillatory Correlation 193(1)
Modeling Auditory Functions 193(5)
Correlations in the Olfactory System 198(1)
Correlations in the Visual System 199(1)
Elastic Net 200(5)
CMAC and Motor Learning 205(2)
Summarizing Remarks 207(1)
Appendix 3A: Mathematical Analysis of Hebbian Learning 208(1)
Appendix 3B: Necessity and Convergence of Anti-Hebbian Learning 209(1)
Appendix 3C: Link between Hebbian Rule and Gradient Descent 210(1)
Appendix 3D: Reconstruction Error in Linear and Quadratic PCA 211(7)
Correlation-Based Kernel Learning 218(31)
Background 218(3)
Kernel PCA and Kernelized GHA 221(4)
Kernel CCA and Kernel ICA 225(5)
Kernel Principal Angles 230(2)
Kernel Discriminant Analysis 232(3)
Kernel Wiener Filter 235(3)
Kernel-Based Correlation Analysis: Generalized Correlation Function and Correntropy 238(4)
Kernel Matched Filter 242(1)
Discussion 243(6)
Correlative Learning in a Complex-Valued Domain 249(34)
Preliminaries 250(7)
Complex-Valued Extensions of Correlation-Based Learning 257(20)
Complex-Valued Associative Memory 257(1)
Complex-Valued Boltzmann Machine 258(1)
Complex-Valued LMS Rule 259(3)
Complex-Valued PCA Learning 262(7)
Complex-Valued ICA Learning 269(4)
Constant-Modulus Algorithm 273(4)
Kernel Methods for Complex-Valued Data 277(3)
Reproducing Kernels in the Complex Domain 277(2)
Complex-Valued Kernel PCA 279(1)
Discussion 280(3)
Alopex: A Correlation-Based Learning Paradigm 283(24)
Background 283(1)
The Basic Alopex Rule 284(2)
Variants of Alopex 286(4)
Unnikrishnan and Venugopal's Alopex 286(1)
Bia's Alopex-B 287(1)
Improved Version of Alopex-B 288(1)
Two-Timescale Alopex 289(1)
Other Types of Correlation Mechanisms 290(1)
Discussion 290(5)
Monte Carlo Sampling-Based Alopex 295(12)
Sequential Monte Carlo Estimation 295(3)
Sampling-Based Alopex 298(4)
Remarks 302(1)
Appendix 6A: Asymptotic Analysis of Alopex Process 303(1)
Appendix 6B: Asymptotic Convergence Analysis of 2t-Alopex 304(3)
Case Studies 307(49)
Hebbian Competition as Basis for Cortical Map Reorganization? 308(12)
Learning Neurocompensator: Model-Based Hearing Compensation Strategy 320(13)
Background 320(1)
Model-Based Hearing Compensation Strategy 320(6)
Optimization 326(4)
Experimental Results 330(3)
Summary 333(1)
Online Training of Artificial Neural Networks 333(7)
Background 333(1)
Parameter Setup 334(1)
Online Option Price Prediction 334(2)
Online System Identification 336(3)
Summary 339(1)
Kalman Filtering in Computational Neural Modeling 340(16)
Background 340(2)
Overview of Kalman Filter in Modeling Brain Functions 342(4)
Kalman Filter for Learning Shape and Motion from Image Sequences 346(8)
General Remarks and Implications 354(2)
Discussion 356(7)
Summary: Why Correlation? 356(3)
Hebbian Plasticity and the Correlative Brain 357(1)
Correlation-Based Signal Processing 358(1)
Correlation-Based Machine Learning 358(1)
Epilogue: What Next? 359(4)
Generalizing the Correlation Measure 359(1)
Deciphering the Correlative Brain 360(3)
Appendix A Autocorrelation and Cross-Correlation Functions 363(5)
A.1 Autocorrelation Function 363(1)
A.2 Cross-Correlation Function 364(3)
A.3 Derivative Stochastic Processes 367(1)
Appendix B Stochastic Approximation 368(3)
Appendix C Primer on Linear Algebra 371(7)
C.1 Eigenanalysis 372(3)
C.2 Generalized Eigenvalue Problem 375(1)
C.3 SVD and Cholesky Factorization 375(1)
C.4 Gram-Schmidt Orthogonalization 376(1)
C.5 Principal Correlation 377(1)
Appendix D Probability Density and Entropy Estimators 378(6)
D.1 Gram-Charlier Expansion 379(2)
D.2 Edgeworth Expansion 381(1)
D.3 Order Statistics 381(1)
D.4 Kernel Estimator 382(2)
Appendix E Expectation-Maximization Algorithm 384(57)
E.1 Alternating Free-Energy Maximization 384(1)
E.2 Fitting Gaussian Mixture Model 385(56)
Index 441


Zhe Chen, PhD, is currently a Research Fellow in the Neuroscience Statistics Research Laboratory at Harvard Medical School.

Simon Haykin, PhD, DSc, is a Distinguished University Professor in the Department of Electrical and Computer Engineering at McMaster University, Ontario, Canada.

Jos J. Eggermont, PhD, is a Professor in the Departments of Physiology & Biophysics and Psychology at the University of Calgary, Alberta, Canada.

Suzanna Becker, PhD, is a Professor in the Department of Psychology, Neuroscience, and Behavior at McMaster University, Ontario, Canada.