
Classification, Parameter Estimation and State Estimation: An Engineering Approach Using MATLAB 2nd edition [Hardback]

  • Format: Hardback, 480 pages, height x width x thickness: 218x145x31 mm, weight: 680 g
  • Publication date: 27-Apr-2017
  • Publisher: John Wiley & Sons Inc
  • ISBN-10: 1119152437
  • ISBN-13: 9781119152439

A practical introduction to intelligent computer vision theory, design, implementation, and technology

The past decade has witnessed rapid growth in image processing and intelligent computer vision technology. Advances in machine learning methods, especially AdaBoost variants and particle filtering methods, have made machine learning in intelligent computer vision more accurate and reliable than ever before. The need for expert coverage of the state of the art in this burgeoning field has never been greater, and this book satisfies that need. Fully updated and extensively revised, this 2nd Edition of the popular guide provides designers, data analysts, researchers and advanced postgraduates with a fundamental yet wholly practical introduction to intelligent computer vision. The authors walk you through the basics of computer vision, past and present, and explore its more subtle intricacies, with an emphasis on intelligent measurement systems. Using many timely, real-world examples, they explain and vividly demonstrate the latest developments in image and video processing techniques and technologies for machine learning in computer vision systems, including:

  • PRTools5 software for MATLAB, including the latest representation and generalization toolbox for PRTools5 (a minimal usage sketch follows this list)
  • Machine learning applications for computer vision, with detailed discussion of contemporary state estimation techniques alongside particle filter methods
  • The latest techniques for classification and supervised learning, with an emphasis on neural networks, genetic state estimation, particle filters and other AI state estimation methods
  • All-new coverage of AdaBoost and its implementation in PRTools5
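As a flavour of the PRTools5 workflow the book builds on, the fragment below trains and tests a simple classifier in MATLAB. It is a minimal illustrative sketch, not taken from the book itself; it assumes PRTools5 is installed and on the MATLAB path and uses only standard PRTools commands (gendatb, gendat, ldc, testc, scatterd, plotc), whose exact behaviour may vary between PRTools versions.

    % Minimal PRTools sketch (illustrative; assumes PRTools5 is on the MATLAB path)
    a = gendatb([50 50]);                    % two-class 'banana' dataset, 50 objects per class
    [train_set, test_set] = gendat(a, 0.5);  % split the data 50/50 into training and test sets
    w = ldc(train_set);                      % train a linear Bayes-normal classifier
    err = testc(test_set * w);               % estimated classification error on the test set
    disp(err);
    scatterd(a);                             % scatter plot of the data
    plotc(w);                                % overlay the trained decision boundary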

A valuable working resource for professionals and an excellent introduction for advanced-level students, this 2nd Edition features a wealth of illustrative examples, ranging from basic techniques to advanced intelligent computer vision system implementations. Additional examples and tutorials, as well as a question and solution forum, can be found on a companion website.

Preface xi
About the Companion Website xv
1 Introduction 1(16)
1.1 The Scope of the Book 2(8)
1.1.1 Classification 3(1)
1.1.2 Parameter Estimation 4(1)
1.1.3 State Estimation 5(2)
1.1.4 Relations between the Subjects 7(3)
1.2 Engineering 10(2)
1.3 The Organization of the Book 12(2)
1.4 Changes from First Edition 14(1)
1.5 References 15(2)
2 PRTools Introduction 17(26)
2.1 Motivation 17(1)
2.2 Essential Concepts 18(4)
2.3 PRTools Organization Structure and Implementation 22(4)
2.4 Some Details about PRTools 26(16)
2.4.1 Datasets 26(4)
2.4.2 Datafiles 30(1)
2.4.3 Datafiles Help Information 31(3)
2.4.4 Classifiers and Mappings 34(2)
2.4.5 Mappings Help Information 36(2)
2.4.6 How to Write Your Own Mapping 38(4)
2.5 Selected Bibliography 42(1)
3 Detection and Classification 43(34)
3.1 Bayesian Classification 46(16)
3.1.1 Uniform Cost Function and Minimum Error Rate 53(3)
3.1.2 Normal Distributed Measurements; Linear and Quadratic Classifiers 56(6)
3.2 Rejection 62(4)
3.2.1 Minimum Error Rate Classification with Reject Option 63(3)
3.3 Detection: The Two-Class Case 66(8)
3.4 Selected Bibliography 74(3)
Exercises 74(3)
4 Parameter Estimation 77(38)
4.1 Bayesian Estimation 79(15)
4.1.1 MMSE Estimation 86(1)
4.1.2 MAP Estimation 87(1)
4.1.3 The Gaussian Case with Linear Sensors 88(1)
4.1.4 Maximum Likelihood Estimation 89(2)
4.1.5 Unbiased Linear MMSE Estimation 91(3)
4.2 Performance Estimators 94(6)
4.2.1 Bias and Covariance 95(4)
4.2.2 The Error Covariance of the Unbiased Linear MMSE Estimator 99(1)
4.3 Data Fitting 100(10)
4.3.1 Least Squares Fitting 101(3)
4.3.2 Fitting Using a Robust Error Norm 104(3)
4.3.3 Regression 107(3)
4.4 Overview of the Family of Estimators 110(1)
4.5 Selected Bibliography 111(4)
Exercises 112(3)
5 State Estimation 115(92)
5.1 A General Framework for Online Estimation 117(8)
5.1.1 Models 117(6)
5.1.2 Optimal Online Estimation 123(2)
5.2 Infinite Discrete-Time State Variables 125(22)
5.2.1 Optimal Online Estimation in Linear-Gaussian Systems 125(8)
5.2.2 Suboptimal Solutions for Non-linear Systems 133(14)
5.3 Finite Discrete-Time State Variables 147(16)
5.3.1 Hidden Markov Models 148(4)
5.3.2 Online State Estimation 152(4)
5.3.3 Offline State Estimation 156(7)
5.4 Mixed States and the Particle Filter 163(7)
5.4.1 Importance Sampling 164(2)
5.4.2 Resampling by Selection 166(1)
5.4.3 The Condensation Algorithm 167(3)
5.5 Genetic State Estimation 170(13)
5.5.1 The Genetic Algorithm 170(6)
5.5.2 Genetic State Estimation 176(1)
5.5.3 Computational Issues 177(6)
5.6 State Estimation in Practice 183(18)
5.6.1 System Identification 185(3)
5.6.2 Observability, Controllability and Stability 188(5)
5.6.3 Computational Issues 193(3)
5.6.4 Consistency Checks 196(5)
5.7 Selected Bibliography 201(6)
Exercises 204(3)
6 Supervised Learning 207(52)
6.1 Training Sets 208(2)
6.2 Parametric Learning 210(7)
6.2.1 Gaussian Distribution, Mean Unknown 211(1)
6.2.2 Gaussian Distribution, Covariance Matrix Unknown 212(1)
6.2.3 Gaussian Distribution, Mean and Covariance Matrix Both Unknown 213(2)
6.2.4 Estimation of the Prior Probabilities 215(1)
6.2.5 Binary Measurements 216(1)
6.3 Non-parametric Learning 217(28)
6.3.1 Parzen Estimation and Histogramming 218(5)
6.3.2 Nearest Neighbour Classification 223(7)
6.3.3 Linear Discriminant Functions 230(7)
6.3.4 The Support Vector Classifier 237(5)
6.3.5 The Feedforward Neural Network 242(3)
6.4 Adaptive Boosting - AdaBoost 245(4)
6.5 Convolutional Neural Networks (CNNs) 249(3)
6.5.1 Convolutional Neural Network Structure 249(2)
6.5.2 Computation and Training of CNNs 251(1)
6.6 Empirical Evaluation 252(5)
6.7 Selected Bibliography 257(2)
Exercises 257(2)
7 Feature Extraction and Selection 259(44)
7.1 Criteria for Selection and Extraction 261(11)
7.1.1 Interclass/Intraclass Distance 262(5)
7.1.2 Chernoff-Bhattacharyya Distance 267(3)
7.1.3 Other Criteria 270(2)
7.2 Feature Selection 272(16)
7.2.1 Branch-and-Bound 273(2)
7.2.2 Suboptimal Search 275(3)
7.2.3 Several New Methods of Feature Selection 278(9)
7.2.4 Implementation Issues 287(1)
7.3 Linear Feature Extraction 288(12)
7.3.1 Feature Extraction Based on the Bhattacharyya Distance with Gaussian Distributions 291(5)
7.3.2 Feature Extraction Based on Inter/Intra Class Distance 296(4)
7.4 References 300(3)
Exercises 300(3)
8 Unsupervised Learning 303(46)
8.1 Feature Reduction 304(16)
8.1.1 Principal Component Analysis 304(5)
8.1.2 Multidimensional Scaling 309(6)
8.1.3 Kernel Principal Component Analysis 315(5)
8.2 Clustering 320(25)
8.2.1 Hierarchical Clustering 323(4)
8.2.2 K-Means Clustering 327(2)
8.2.3 Mixture of Gaussians 329(6)
8.2.4 Mixture of Probabilistic PCA 335(1)
8.2.5 Self-Organizing Maps 336(6)
8.2.6 Generative Topographic Mapping 342(3)
8.3 References 345(4)
Exercises 346(3)
9 Worked Out Examples 349(58)
9.1 Example on Image Classification with PRTools 349(12)
9.1.1 Example on Image Classification 349(5)
9.1.2 Example on Face Classification 354(3)
9.1.3 Example on Silhouette Classification 357(4)
9.2 Boston Housing Classification Problem 361(11)
9.2.1 Dataset Description 361(2)
9.2.2 Simple Classification Methods 363(2)
9.2.3 Feature Extraction 365(2)
9.2.4 Feature Selection 367(1)
9.2.5 Complex Classifiers 368(3)
9.2.6 Conclusions 371(1)
9.3 Time-of-Flight Estimation of an Acoustic Tone Burst 372(20)
9.3.1 Models of the Observed Waveform 374(2)
9.3.2 Heuristic Methods for Determining the ToF 376(1)
9.3.3 Curve Fitting 377(2)
9.3.4 Matched Filtering 379(1)
9.3.5 ML Estimation Using Covariance Models for the Reflections 380(5)
9.3.6 Optimization and Evaluation 385(7)
9.4 Online Level Estimation in a Hydraulic System 392(14)
9.4.1 Linearized Kalman Filtering 394(3)
9.4.2 Extended Kalman Filtering 397(1)
9.4.3 Particle Filtering 398(5)
9.4.4 Discussion 403(3)
9.5 References 406(1)
Appendix A Topics Selected from Functional Analysis 407(14)
Appendix B Topics Selected from Linear Algebra and Matrix Theory 421(16)
Appendix C Probability Theory 437(16)
Appendix D Discrete-Time Dynamic Systems 453(6)
Index 459
Professor Bangjun Lei, Dr. Guangzhu Xu, and Dr. Ming Feng are with the Institute of Intelligent Vision and Image Information, China Three Gorges University, China.

Professor Yaobin Zou is an associate professor at China Three Gorges University.

Dr. Ferdinand van der Heijden is on the faculty of the Department of Signals and Systems, University of Twente, Netherlands.

Professor Dick de Ridder is with the Bioinformatics laboratory at Wageningen University, Netherlands.

Professor David M. J. Tax is a researcher with the Pattern Recognition Laboratory, Delft University of Technology, Netherlands.