Acknowledgments  vii
Preface to the First Edition  ix
Preface to the Second Edition  xi

Chapter 1. Introduction and Role of Artificial Neural Networks  1

Chapter 2. Fundamentals of Biological Neural Networks  5

Chapter 3. Basic Principles of ANNs and Their Early Structures  9
    3.1. Basic Principles of ANN Design  9
    3.2. Basic Network Structures  10
    3.3. The Perceptron's Input-Output Principles  11

Chapter 4. The Perceptron  17
    4.2. The Single-Layer Representation Problem  22
    4.3. The Limitations of the Single-Layer Perceptron  23
    4.4. Many-Layer Perceptrons  24
    4.A. Perceptron Case Study: Identifying Autoregressive Parameters of a Signal (AR Time Series Identification)  25

Chapter 5. The Madaline  37
    5.A. Madaline Case Study: Character Recognition  39

Chapter 6. Back Propagation  59
    6.1. The Back Propagation Learning Procedure  59
    6.2. Derivation of the BP Algorithm  59
    6.3. Modified BP Algorithms  63
    6.A. Back Propagation Case Study: Character Recognition  65
    6.B. Back Propagation Case Study: The Exclusive-OR (XOR) Problem (2-Layer BP)  76
    6.C. Back Propagation Case Study: The XOR Problem (3-Layer BP Network)  94

Chapter 7. Hopfield Networks  113
    7.2. Binary Hopfield Networks  113
    7.3. Setting of Weights in Hopfield Nets: Bidirectional Associative Memory (BAM) Principle  114
    7.6. Summary of the Procedure for Implementing the Hopfield Network  121
    7.7. Continuous Hopfield Models  122
    7.8. The Continuous Energy (Lyapunov) Function  123
    7.A. Hopfield Network Case Study: Character Recognition  125
    7.B. Hopfield Network Case Study: Traveling Salesman Problem  136

Chapter 8. Counter Propagation  161
    8.2. Kohonen Self-Organizing Map (SOM) Layer  161
    8.4. Training of the Kohonen Layer  162
    8.5. Training of Grossberg Layers  165
    8.6. The Combined Counter Propagation Network  165
    8.A. Counter Propagation Network Case Study: Character Recognition  166

Chapter 9. Adaptive Resonance Theory  179
    9.2. The ART Network Structure  179
    9.3. Setting-Up of the ART Network  183
    9.6. Discussion and General Comments on ART-I and ART-II  186
    9.A. ART-I Network Case Study: Character Recognition  187
    9.B. ART-I Case Study: Speech Recognition  201

Chapter 10. The Cognitron and the Neocognitron  209
    10.1. Background of the Cognitron  209
    10.2. The Basic Principles of the Cognitron  209
    10.4. Cognitron's Network Training  211

Chapter 11. Statistical Training  215
    11.1. Fundamental Philosophy  215
    11.3. Simulated Annealing by Boltzmann Training of Weights  216
    11.4. Stochastic Determination of Magnitude of Weight Change  217
    11.5. Temperature-Equivalent Setting  217
    11.6. Cauchy Training of Neural Networks  217
    11.A. Statistical Training Case Study: A Stochastic Hopfield Network for Character Recognition  219
    11.B. Statistical Training Case Study: Identifying AR Signal Parameters with a Stochastic Perceptron Model  222

Chapter 12. Recurrent (Time Cycling) Back Propagation Networks  233
    12.1. Recurrent/Discrete Time Networks  233
    12.2. Fully Recurrent Networks  234
    12.3. Continuously Recurrent Back Propagation Networks  235
    12.A. Recurrent Back Propagation Case Study: Character Recognition  236

Chapter 13. Large Scale Memory Storage and Retrieval (LAMSTAR) Network  249
    13.1. Basic Principles of the LAMSTAR Neural Network  249
    13.2. Detailed Outline of the LAMSTAR Network  251
    13.4. Training vs. Operational Runs  258
    13.5. Advanced Data Analysis Capabilities  259
    13.6. Correlation, Interpolation, Extrapolation and Innovation-Detection  261
    13.7. Concluding Comments and Discussion of Applicability  262
    13.A. LAMSTAR Network Case Study: Character Recognition  265
    13.B. Application to Medical Diagnosis Problems  280

Problems  285
References  291
Author Index  299
Subject Index  301