Preface  vii

Chapter 1. Introduction and Role of Artificial Neural Networks  1

Chapter 2. Fundamentals of Biological Neural Networks  4
|
Chapter 3. Basic Principles of ANNs and Their Early Structures  8
3.1. Basic Principles of ANN Design  8
3.2. Basic Network Structures  9
3.3. The Perceptron's Input-Output Principles  10
3.4.  11

Chapter 4. The Madaline  15
4.1.  16
4.A. Madaline Case Study: Recognition of Two Characters  17
|
Chapter 5. The Perceptron  23
5.1.  23
5.2. The Single-Layer Perceptron's Representation Problem  27
5.3. The Limitations of the Single-Layer Perceptron  28
5.4. Many-Layer Perceptrons  29
|
Chapter 6. Back Propagation  31
6.1. The Back Propagation Learning Procedure  31
6.2. Derivation of the BP Algorithm  31
6.3. Modified BP Algorithms  35
6.A. Back Propagation Case Study: The Exclusive-OR (XOR) Problem (2-Layer BP)  37
6.B. Back Propagation Case Study: The XOR Problem -- 3-Layer BP Network  55
6.C. Back Propagation Case Study: Character Recognition  72
6.D. Back Propagation Case Study: Identifying Autoregressive (AR) Parameters of a Signal (AR Time Series Identification) -- A Recurrent Deterministic Network  97
|
Chapter 7. Hopfield Networks  110
7.1.  110
7.2. Binary Hopfield Networks  110
7.3. Setting of Weights in Hopfield Nets -- Bidirectional Associative Memory (BAM) Principle  111
7.4.  113
7.5.  115
7.6. Summary of the Procedure for Implementing the Hopfield Network  117
7.7. Continuous Hopfield Models  118
7.8. The Continuous Energy (Lyapunov) Function  119
7.A. Hopfield Network Case Study: Recognition of Digits  121
|
Chapter 8. Counter Propagation  134
8.1.  134
8.2. Kohonen Self-Organizing Map (SOM) Layer  134
8.3.  135
8.4. Training of the Kohonen Layer  135
8.5. Training of Grossberg Layers  138
8.6. The Combined Counter Propagation Network  138
8.A. Counter Propagation Case Study: Recognition of Digits  140
|
Chapter 9. Adaptive Resonance Theory  150
9.1.  150
9.2. The ART Network Structure  150
9.3. Setting-Up of the ART Network  153
9.4.  154
9.5.  156
9.6. Discussion and General Comments on ART-I and ART-II  157
9.A. ART-I Case Study: Speech Recognition  158
|
Chapter 10. The Cognitron and the Neocognitron  166
10.1. Background of the Cognitron  166
10.2. The Basic Principles of the Cognitron  166
10.3.  166
10.4. Cognitron's Network Training  168
10.5.  169
|
Chapter 11. Statistical Training  171
11.1. Fundamental Philosophy  171
11.2.  172
11.3. Simulated Annealing by Boltzmann Training of Weights  172
11.4. Stochastic Determination of Magnitude of Weight Change  173
11.5. Temperature-Equivalent Setting  173
11.6. Cauchy Training of Neural Networks  173
11.A. Statistical Training Case Study: Identifying AR Signal Parameters with Stochastic Back Propagation  175
11.B. Statistical Training Case Study: A Stochastic Hopfield Network for Character Recognition  183
|
Chapter 12. Recurrent (Time Cycling) Back Propagation Networks  187
12.1. Recurrent/Discrete Time Networks  187
12.2. Fully Recurrent Networks  188
12.3. Continuously Recurrent Back Propagation Networks  189
|
Chapter 13. Large-Scale Memory Storage and Retrieval (LAMSTAR) Network  191
13.1.  191
13.2.  192
13.3. An Outline of the LAMSTAR Network  193
13.4. Information Representation and Storage  200
13.5. Structure of SOM Modules in LAMSTAR Networks  203
13.6. Interpolation/Extrapolation, Filtering and Forgetting  207
13.7. Concluding Comments  208
13.A. LAMSTAR Case Study: An Application to Medical Diagnosis  210

Problems  223
References  227
Author Index  233
Subject Index  235