
E-book: Principles of Artificial Neural Networks (2nd Edition)

  • Format: PDF+DRM
  • Price: 51.48 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste): not permitted

  • Printing: not permitted

  • Usage:

    Digital Rights Management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install dedicated software to read it. You will also need to create an Adobe ID; more information is available here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install the free PocketBook Reader app (iOS / Android).

    To read on a PC or Mac, install Adobe Digital Editions (a free application designed specifically for reading e-books; it should not be confused with Adobe Reader, which is most likely already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

The book should serve as a text for a university graduate course or an advanced undergraduate course on neural networks in engineering and computer science departments, and as a self-study course for engineers and computer scientists in industry. Covering the major neural network approaches and architectures together with their underlying theory, the text presents a detailed case study for each approach, accompanied by complete computer code and the corresponding computed results. The case studies are designed to allow easy comparison of network performance, illustrating the strengths and weaknesses of the different networks.
Acknowledgments vii
Preface to the First Edition ix
Preface to the Second Edition xi
Chapter 1. Introduction and Role of Artificial Neural Networks 1
Chapter 2. Fundamentals of Biological Neural Networks 5
Chapter 3. Basic Principles of ANNs and Their Early Structures 9
  3.1. Basic Principles of ANN Design 9
  3.2. Basic Network Structures 10
  3.3. The Perceptron's Input-Output Principles 11
  3.4. The Adaline (ALC) 12
Chapter 4. The Perceptron 17
  4.1. The Basic Structure 17
  4.2. The Single-Layer Representation Problem 22
  4.3. The Limitations of the Single-Layer Perceptron 23
  4.4. Many-Layer Perceptrons 24
  4.A. Perceptron Case Study: Identifying Autoregressive Parameters of a Signal (AR Time Series Identification) 25
Chapter 5. The Madaline 37
  5.1. Madaline Training 37
  5.A. Madaline Case Study: Character Recognition 39
Chapter 6. Back Propagation 59
  6.1. The Back Propagation Learning Procedure 59
  6.2. Derivation of the BP Algorithm 59
  6.3. Modified BP Algorithms 63
  6.A. Back Propagation Case Study: Character Recognition 65
  6.B. Back Propagation Case Study: The Exclusive-OR (XOR) Problem (2-Layer BP) 76
  6.C. Back Propagation Case Study: The XOR Problem — 3-Layer BP Network 94
Chapter 7. Hopfield Networks 113
  7.1. Introduction 113
  7.2. Binary Hopfield Networks 113
  7.3. Setting of Weights in Hopfield Nets — Bidirectional Associative Memory (BAM) Principle 114
  7.4. Walsh Functions 117
  7.5. Network Stability 118
  7.6. Summary of the Procedure for Implementing the Hopfield Network 121
  7.7. Continuous Hopfield Models 122
  7.8. The Continuous Energy (Lyapunov) Function 123
  7.A. Hopfield Network Case Study: Character Recognition 125
  7.B. Hopfield Network Case Study: Traveling Salesman Problem 136
Chapter 8. Counter Propagation 161
  8.1. Introduction 161
  8.2. Kohonen Self-Organizing Map (SOM) Layer 161
  8.3. Grossberg Layer 162
  8.4. Training of the Kohonen Layer 162
  8.5. Training of Grossberg Layers 165
  8.6. The Combined Counter Propagation Network 165
  8.A. Counter Propagation Network Case Study: Character Recognition 166
Chapter 9. Adaptive Resonance Theory 179
  9.1. Motivation 179
  9.2. The ART Network Structure 179
  9.3. Setting-Up of the ART Network 183
  9.4. Network Operation 184
  9.5. Properties of ART 186
  9.6. Discussion and General Comments on ART-I and ART-II 186
  9.A. ART-I Network Case Study: Character Recognition 187
  9.B. ART-I Case Study: Speech Recognition 201
Chapter 10. The Cognitron and the Neocognitron 209
  10.1. Background of the Cognitron 209
  10.2. The Basic Principles of the Cognitron 209
  10.3. Network Operation 209
  10.4. Cognitron's Network Training 211
  10.5. The Neocognitron 213
Chapter 11. Statistical Training 215
  11.1. Fundamental Philosophy 215
  11.2. Annealing Methods 216
  11.3. Simulated Annealing by Boltzmann Training of Weights 216
  11.4. Stochastic Determination of Magnitude of Weight Change 217
  11.5. Temperature-Equivalent Setting 217
  11.6. Cauchy Training of Neural Network 217
  11.A. Statistical Training Case Study — A Stochastic Hopfield Network for Character Recognition 219
  11.B. Statistical Training Case Study: Identifying AR Signal Parameters with a Stochastic Perceptron Model 222
Chapter 12. Recurrent (Time Cycling) Back Propagation Networks 233
  12.1. Recurrent/Discrete Time Networks 233
  12.2. Fully Recurrent Networks 234
  12.3. Continuously Recurrent Back Propagation Networks 235
  12.A. Recurrent Back Propagation Case Study: Character Recognition 236
Chapter 13. Large Scale Memory Storage and Retrieval (LAMSTAR) Network 249
  13.1. Basic Principles of the LAMSTAR Neural Network 249
  13.2. Detailed Outline of the LAMSTAR Network 251
  13.3. Forgetting Feature 257
  13.4. Training vs. Operational Runs 258
  13.5. Advanced Data Analysis Capabilities 259
  13.6. Correlation, Interpolation, Extrapolation and Innovation-Detection 261
  13.7. Concluding Comments and Discussion of Applicability 262
  13.A. LAMSTAR Network Case Study: Character Recognition 265
  13.B. Application to Medical Diagnosis Problems 280
Problems 285
References 291
Author Index 299
Subject Index 301