
E-book: Principles Of Artificial Neural Networks (3rd Edition)

  • Format: PDF+DRM
  • Price: 44,46 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that reading it requires installing special software. You will also need to create an Adobe ID (more information here). The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions. (This is a free application designed specifically for reading e-books; it should not be confused with Adobe Reader, which is most likely already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

Artificial neural networks are most suitable for solving problems that are complex, ill-defined, highly nonlinear, involve many disparate variables, and/or are stochastic. Such problems are abundant in medicine, in finance, in security and beyond.

This volume covers the basic theory and architecture of the major artificial neural networks. Uniquely, it presents 18 complete case studies of applications of neural networks in various fields, ranging from cell-shape classification to micro-trading in finance to constellation recognition, all with their respective source codes. These case studies demonstrate to the reader in detail how such studies are designed and executed and how their specific results are obtained.

The book is written for a one-semester graduate or senior-level undergraduate course on artificial neural networks. It is also intended as a self-study and reference text for scientists and engineers, and for researchers in medicine, finance and data mining.
Table of contents

Acknowledgments vii
Preface to the Third Edition ix
Preface to the Second Edition xi
Preface to the First Edition xiii
Chapter 1 Introduction and Role of Artificial Neural Networks 1(4)
Chapter 2 Fundamentals of Biological Neural Networks 5(4)
Chapter 3 Basic Principles of ANNs and Their Early Structures 9(8)
3.1 Basic Principles of ANN Design 9(1)
3.2 Basic Network Structures 10(1)
3.3 The Perceptron's Input-Output Principles 11(2)
3.4 The Adaline (ALC) 13(4)
Chapter 4 The Perceptron 17(20)
4.1 The Basic Structure 17(5)
4.2 The Single-Layer Representation Problem 22(1)
4.3 The Limitations of the Single-Layer Perceptron 22(2)
4.4 Many-Layer Perceptrons 24(1)
4.A Perceptron Case Study: Identifying Autoregressive Parameters of a Signal (AR Time Series Identification) 25(12)
Chapter 5 The Madaline 37(22)
5.1 Madaline Training 37(2)
5.A Madaline Case Study: Character Recognition 39(20)
Chapter 6 Back Propagation 59(64)
6.1 The Back Propagation Learning Procedure 59(1)
6.2 Derivation of the BP Algorithm 59(4)
6.3 Modified BP Algorithms 63(2)
6.A Back Propagation Case Study: Character Recognition 65(11)
6.B Back Propagation Case Study: The Exclusive-OR (XOR) Problem (2-Layer BP) 76(18)
6.C Back Propagation Case Study: The XOR Problem - 3 Layer BP Network 94(18)
6.D Average Monthly High and Low Temperature Prediction Using Backpropagation Neural Networks 112(11)
Chapter 7 Hopfield Networks 123(62)
7.1 Introduction 123(1)
7.2 Binary Hopfield Networks 123(2)
7.3 Setting of Weights in Hopfield Nets - Bidirectional Associative Memory (BAM) Principle 125(2)
7.4 Walsh Functions 127(2)
7.5 Network Stability 129(2)
7.6 Summary of the Procedure for Implementing the Hopfield Network 131(1)
7.7 Continuous Hopfield Models 132(1)
7.8 The Continuous Energy (Lyapunov) Function 133(2)
7.A Hopfield Network Case Study: Character Recognition 135(12)
7.B Hopfield Network Case Study: Traveling Salesman Problem 147(23)
7.C Cell Shape Detection Using Neural Networks 170(15)
Chapter 8 Counter Propagation 185(18)
8.1 Introduction 185(1)
8.2 Kohonen Self-Organizing Map (SOM) Layer 186(1)
8.3 Grossberg Layer 186(1)
8.4 Training of the Kohonen Layer 187(2)
8.5 Training of Grossberg Layers 189(1)
8.6 The Combined Counter Propagation Network 190(1)
8.A Counter Propagation Network Case Study: Character Recognition 190(13)
Chapter 9 Large Scale Memory Storage and Retrieval (LAMSTAR) Network 203(72)
9.1 Motivation 203(1)
9.2 Basic Principles of the LAMSTAR Neural Network 204(1)
9.3 Detailed Outline of the LAMSTAR Network 205(6)
9.4 Forgetting Feature 211(2)
9.5 Training vs. Operational Runs 213(1)
9.6 Operation in Face of Missing Data 213(1)
9.7 Advanced Data Analysis Capabilities 214(3)
9.8 Modified Version: Normalized Weights 217(1)
9.9 Concluding Comments and Discussion of Applicability 218(2)
9.A LAMSTAR Network Case Study: Character Recognition 220(16)
9.B Application to Medical Diagnosis Problems 236(4)
9.C Predicting Price Movement in Market Microstructure via LAMSTAR 240(13)
9.D Constellation Recognition 253(22)
Chapter 10 Adaptive Resonance Theory 275(30)
10.1 Motivation 275(1)
10.2 The ART Network Structure 275(4)
10.3 Setting-Up of the ART Network 279(1)
10.4 Network Operation 280(1)
10.5 Properties of ART 281(2)
10.6 Discussion and General Comments on ART-I and ART-II 283(1)
10.A ART-I Network Case Study: Character Recognition 283(14)
10.B ART-I Case Study: Speech Recognition 297(8)
Chapter 11 The Cognitron and the Neocognitron 305(6)
11.1 Background of the Cognitron 305(1)
11.2 The Basic Principles of the Cognitron 305(1)
11.3 Network Operation 305(2)
11.4 Cognitron's Network Training 307(2)
11.5 The Neocognitron 309(2)
Chapter 12 Statistical Training 311(16)
12.1 Fundamental Philosophy 311(1)
12.2 Annealing Methods 312(1)
12.3 Simulated Annealing by Boltzman Training of Weights 312(1)
12.4 Stochastic Determination of Magnitude of Weight Change 313(1)
12.5 Temperature-Equivalent Setting 313(1)
12.6 Cauchy Training of Neural Network 314(1)
12.A Statistical Training Case Study: A Stochastic Hopfield Network for Character Recognition 315(3)
12.B Statistical Training Case Study: Identifying AR Signal Parameters with a Stochastic Perceptron Model 318(9)
Chapter 13 Recurrent (Time Cycling) Back Propagation Networks 327(16)
13.1 Recurrent/Discrete Time Networks 327(1)
13.2 Fully Recurrent Networks 328(2)
13.3 Continuously Recurrent Back Propagation Networks 330(1)
13.A Recurrent Back Propagation Case Study: Character Recognition 330(13)
Problems 343(6)
References 349(8)
Author Index 357(4)
Subject Index 361