Acknowledgments .......... vii
Preface to the Third Edition .......... ix
Preface to the Second Edition .......... xi
Preface to the First Edition .......... xiii
Chapter 1  Introduction and Role of Artificial Neural Networks .......... 1
Chapter 2  Fundamentals of Biological Neural Networks .......... 5
Chapter 3  Basic Principles of ANNs and Their Early Structures .......... 9
    3.1  Basic Principles of ANN Design .......... 9
    3.2  Basic Network Structures .......... 10
    3.3  The Perceptron's Input-Output Principles .......... 11
    3.4  .......... 13
Chapter 4  The Perceptron .......... 17
    4.1  .......... 17
    4.2  The Single-Layer Representation Problem .......... 22
    4.3  The Limitations of the Single-Layer Perceptron .......... 22
    4.4  Many-Layer Perceptrons .......... 24
    4.A  Perceptron Case Study: Identifying Autoregressive Parameters of a Signal (AR Time Series Identification) .......... 25
Chapter 5  The Madaline .......... 37
    5.1  .......... 37
    5.A  Madaline Case Study: Character Recognition .......... 39
Chapter 6  Back Propagation .......... 59
    6.1  The Back Propagation Learning Procedure .......... 59
    6.2  Derivation of the BP Algorithm .......... 59
    6.3  Modified BP Algorithms .......... 63
    6.A  Back Propagation Case Study: Character Recognition .......... 65
    6.B  Back Propagation Case Study: The Exclusive-OR (XOR) Problem (2-Layer BP) .......... 76
    6.C  Back Propagation Case Study: The XOR Problem (3-Layer BP Network) .......... 94
    6.D  Average Monthly High and Low Temperature Prediction Using Backpropagation Neural Networks .......... 112
Chapter 7  Hopfield Networks .......... 123
    7.1  .......... 123
    7.2  Binary Hopfield Networks .......... 123
    7.3  Setting of Weights in Hopfield Nets: Bidirectional Associative Memory (BAM) Principle .......... 125
    7.4  .......... 127
    7.5  .......... 129
    7.6  Summary of the Procedure for Implementing the Hopfield Network .......... 131
    7.7  Continuous Hopfield Models .......... 132
    7.8  The Continuous Energy (Lyapunov) Function .......... 133
    7.A  Hopfield Network Case Study: Character Recognition .......... 135
    7.B  Hopfield Network Case Study: Traveling Salesman Problem .......... 147
    7.C  Cell Shape Detection Using Neural Networks .......... 170
Chapter 8  Counter Propagation .......... 185
    8.1  .......... 185
    8.2  Kohonen Self-Organizing Map (SOM) Layer .......... 186
    8.3  .......... 186
    8.4  Training of the Kohonen Layer .......... 187
    8.5  Training of Grossberg Layers .......... 189
    8.6  The Combined Counter Propagation Network .......... 190
    8.A  Counter Propagation Network Case Study: Character Recognition .......... 190
Chapter 9  Large Scale Memory Storage and Retrieval (LAMSTAR) Network .......... 203
    9.1  .......... 203
    9.2  Basic Principles of the LAMSTAR Neural Network .......... 204
    9.3  Detailed Outline of the LAMSTAR Network .......... 205
    9.4  .......... 211
    9.5  Training vs. Operational Runs .......... 213
    9.6  Operation in Face of Missing Data .......... 213
    9.7  Advanced Data Analysis Capabilities .......... 214
    9.8  Modified Version: Normalized Weights .......... 217
    9.9  Concluding Comments and Discussion of Applicability .......... 218
    9.A  LAMSTAR Network Case Study: Character Recognition .......... 220
    9.B  Application to Medical Diagnosis Problems .......... 236
    9.C  Predicting Price Movement in Market Microstructure via LAMSTAR .......... 240
    9.D  Constellation Recognition .......... 253
Chapter 10  Adaptive Resonance Theory .......... 275
    10.1  .......... 275
    10.2  The ART Network Structure .......... 275
    10.3  Setting-Up of the ART Network .......... 279
    10.4  .......... 280
    10.5  .......... 281
    10.6  Discussion and General Comments on ART-I and ART-II .......... 283
    10.A  ART-I Network Case Study: Character Recognition .......... 283
    10.B  ART-I Case Study: Speech Recognition .......... 297
Chapter 11  The Cognitron and the Neocognitron .......... 305
    11.1  Background of the Cognitron .......... 305
    11.2  The Basic Principles of the Cognitron .......... 305
    11.3  .......... 305
    11.4  Cognitron's Network Training .......... 307
    11.5  .......... 309
Chapter 12  Statistical Training .......... 311
    12.1  Fundamental Philosophy .......... 311
    12.2  .......... 312
    12.3  Simulated Annealing by Boltzmann Training of Weights .......... 312
    12.4  Stochastic Determination of Magnitude of Weight Change .......... 313
    12.5  Temperature-Equivalent Setting .......... 313
    12.6  Cauchy Training of Neural Networks .......... 314
    12.A  Statistical Training Case Study: A Stochastic Hopfield Network for Character Recognition .......... 315
    12.B  Statistical Training Case Study: Identifying AR Signal Parameters with a Stochastic Perceptron Model .......... 318
Chapter 13  Recurrent (Time Cycling) Back Propagation Networks .......... 327
    13.1  Recurrent/Discrete Time Networks .......... 327
    13.2  Fully Recurrent Networks .......... 328
    13.3  Continuously Recurrent Back Propagation Networks .......... 330
    13.A  Recurrent Back Propagation Case Study: Character Recognition .......... 330
Problems .......... 343
References .......... 349
Author Index .......... 357
Subject Index .......... 361