
E-book: Principles of Artificial Neural Networks

(Univ of Illinois, Chicago, USA)
  • Format: PDF+DRM
  • Price: 25,74 €*
  • * The price is final, i.e. no further discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste): not allowed
  • Printing: not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You must also create an Adobe ID. More information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free application: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions. (This is a free application designed specifically for reading e-books. It should not be confused with Adobe Reader, which is most likely already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

This textbook is intended for a first-year graduate course on Artificial Neural Networks. It assumes no prior background in the subject and is directed at MS students in electrical engineering, computer science, and related fields who know at least one programming language or a programming tool such as Matlab, and who have taken the basic undergraduate classes in systems or in signal processing.

The book is unique in the breadth of its coverage of all major artificial neural network approaches and in its extensive hands-on case studies for each and every network considered. These detailed case studies include complete program print-outs and results, and deal with problems ranging from speech recognition and character recognition to control and signal processing, showing that the reader can solve such problems on the basis of this text alone. Another unique aspect of the text is its coverage of the important new topics of recurrent (time-cycling) networks and of large memory storage and retrieval.

The text also shows the reader how to modify or combine one or more of the neural networks covered, to tailor them to a given problem that does not appear to fit any of the more standard designs, as is very often the case.
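
The first case study in Chapter 6 applies back propagation to the Exclusive-OR (XOR) problem. To give a flavor of the kind of hands-on exercise the book describes, below is a minimal sketch in Python/NumPy. It is written for this page, not taken from the book; the network size, learning rate, random seed, and training length are illustrative assumptions only.

    # A minimal sketch, not the author's program: a two-layer back
    # propagation network trained on the XOR problem, in the spirit of
    # case study 6.A. All hyperparameters here are illustrative choices.
    import numpy as np

    rng = np.random.default_rng(1)

    # XOR truth table: four input pairs and their targets.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # One hidden layer (4 sigmoid neurons) and one sigmoid output neuron.
    W1 = rng.normal(scale=1.0, size=(2, 4))
    b1 = np.zeros(4)
    W2 = rng.normal(scale=1.0, size=(4, 1))
    b2 = np.zeros(1)

    lr = 0.5
    for _ in range(20000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)        # hidden activations, shape (4, 4)
        out = sigmoid(h @ W2 + b2)      # network outputs, shape (4, 1)

        # Backward pass: deltas for a mean-squared-error cost.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        # Gradient-descent updates of weights and biases.
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)

    print(np.round(out.ravel(), 3))     # should approach [0, 1, 1, 0]

After training, the four outputs should approach the XOR targets 0, 1, 1, 0; a different random seed may need more iterations.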
Contents

Preface vii
Chapter 1. Introduction and Role of Artificial Neural Networks 1(3)
Chapter 2. Fundamentals of Biological Neural Networks 4(4)
Chapter 3. Basic Principles of ANN and Their Early Structures 8(7)
  3.1. Basic Principles of ANN Design 8(1)
  3.2. Basic Network Structures 9(1)
  3.3. The Perceptron's Input-Output Principles 10(1)
  3.4. The Adaline (ALC) 11(4)
Chapter 4. The Madaline 15(8)
  4.1. Madaline Training 16(1)
  4.A. Madaline Case Study: Recognition of Two Characters 17(6)
Chapter 5. The Perceptron 23(8)
  5.1. The Basic Structure 23(4)
  5.2. The Single-Layer Perceptron's Representation Problem 27(1)
  5.3. The Limitations of the Single-Layer Perceptron 28(1)
  5.4. Many-Layer Perceptrons 29(2)
Chapter 6. Back Propagation 31(79)
  6.1. The Back Propagation Learning Procedure 31(1)
  6.2. Derivation of the BP Algorithm 31(4)
  6.3. Modified BP Algorithms 35(2)
  6.A. Back Propagation Case Study: The Exclusive-OR (XOR) Problem (2-Layer BP) 37(18)
  6.B. Back Propagation Case Study: The XOR Problem -- 3-Layer BP Network 55(17)
  6.C. Back Propagation Case Study: Character Recognition 72(25)
  6.D. Back Propagation Case Study: Identifying Autoregressive (AR) Parameters of a Signal (AR Time Series Identification) -- A Recurrent Deterministic Network 97(13)
Chapter 7. Hopfield Networks 110(24)
  7.1. Introduction 110(1)
  7.2. Binary Hopfield Networks 110(1)
  7.3. Setting of Weights in Hopfield Nets -- Bidirectional Associative Memory (BAM) Principle 111(2)
  7.4. Walsh Functions 113(2)
  7.5. Network Stability 115(2)
  7.6. Summary of the Procedure for Implementing the Hopfield Network 117(1)
  7.7. Continuous Hopfield Models 118(1)
  7.8. The Continuous Energy (Lyapunov) Function 119(2)
  7.A. Hopfield Network Case Study: Recognition of Digits 121(13)
Chapter 8. Counter Propagation 134(16)
  8.1. Introduction 134(1)
  8.2. Kohonen Self-Organizing Map (SOM) Layer 134(1)
  8.3. The Grossberg Layer 135(1)
  8.4. Training of the Kohonen Layer 135(3)
  8.5. Training of Grossberg Layers 138(1)
  8.6. The Combined Counter Propagation Network 138(2)
  8.A. Counter Propagation Case Study: Recognition of Digits 140(10)
Chapter 9. Adaptive Resonance Theory 150(16)
  9.1. Motivation 150(1)
  9.2. The ART Network Structure 150(3)
  9.3. Setting Up of the ART Network 153(1)
  9.4. Network Operation 154(2)
  9.5. Properties of ART 156(1)
  9.6. Discussion and General Comments on ART-I and ART-II 157(1)
  9.A. ART-I Case Study: Speech Recognition 158(8)
Chapter 10. The Cognitron and the Neocognitron 166(5)
  10.1. Background of the Cognitron 166(1)
  10.2. The Basic Principles of the Cognitron 166(1)
  10.3. Network Operation 166(2)
  10.4. Cognitron's Network Training 168(1)
  10.5. The Neocognitron 169(2)
Chapter 11. Statistical Training 171(16)
  11.1. Fundamental Philosophy 171(1)
  11.2. Annealing Methods 172(1)
  11.3. Simulated Annealing by Boltzmann Training of Weights 172(1)
  11.4. Stochastic Determination of Magnitude of Weight Change 173(1)
  11.5. Temperature-Equivalent Setting 173(1)
  11.6. Cauchy Training of Neural Network 173(2)
  11.A. Statistical Training Case Study: Identifying AR Signal Parameters with Stochastic Back Propagation 175(8)
  11.B. Statistical Training Case Study: A Stochastic Hopfield Network for Character Recognition 183(4)
Chapter 12. Recurrent (Time Cycling) Back Propagation Networks 187(4)
  12.1. Recurrent/Discrete Time Networks 187(1)
  12.2. Fully Recurrent Networks 188(1)
  12.3. Continuously Recurrent Back Propagation Networks 189(2)
Chapter 13. Large-Scale Memory Storage and Retrieval (LAMSTAR) Network 191(32)
  13.1. Motivation 191(1)
  13.2. General Principles 192(1)
  13.3. An Outline of the LAMSTAR Network 193(7)
  13.4. Information Representation and Storage 200(3)
  13.5. Structure of SOM Modules in LAMSTAR Networks 203(4)
  13.6. Interpolation/Extrapolation, Filtering and Forgetting 207(1)
  13.7. Concluding Comments 208(2)
  13.A. LAMSTAR Case Study: An Application to Medical Diagnosis 210(13)
Problems 223(4)
References 227(6)
Author Index 233(2)
Subject Index 235