
E-book: Nonlinear Filters: Theory and Applications

Peyman Setoodeh, Saeid Habibi, Simon Haykin (McMaster University)
  • Format: PDF+DRM
  • Publication date: 03-Mar-2022
  • Publisher: John Wiley & Sons Inc
  • Language: English
  • ISBN-13: 9781119078180
  • Price: 138,26 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste): not allowed

  • Printing: not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You also need to create an Adobe ID (more info here). The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions (this is a free application designed specifically for reading e-books; it should not be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

NONLINEAR FILTERS

Discover the utility of using deep learning and (deep) reinforcement learning in deriving filtering algorithms with this insightful and powerful new resource.

Nonlinear Filters: Theory and Applications delivers an insightful view on state and parameter estimation by merging ideas from control theory, statistical signal processing, and machine learning. Taking an algorithmic approach, the book covers both classic and machine learning-based filtering algorithms.

Readers of Nonlinear Filters will greatly benefit from the wide spectrum of presented topics including stability, robustness, computability, and algorithmic sufficiency. Readers will also enjoy:

  • Organization that allows the book to act as a stand-alone, self-contained reference
  • A thorough exploration of the notion of observability, nonlinear observers, and the theory of optimal nonlinear filtering that bridges the gap between different science and engineering disciplines
  • A profound account of Bayesian filters including the Kalman filter and its variants as well as the particle filter
  • A rigorous derivation of the smooth variable structure filter as a predictor-corrector estimator formulated based on a stability theorem, used to confine the estimated states within a neighborhood of their true values
  • A concise tutorial on deep learning and reinforcement learning
  • A detailed presentation of the expectation maximization algorithm and its machine learning-based variants, used for joint state and parameter estimation
  • Guidelines for constructing nonparametric Bayesian models from parametric ones
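The predictor-corrector structure mentioned above, shared by the Kalman filter and the smooth variable structure filter, can be illustrated with a minimal sketch: a scalar Kalman filter estimating a constant from noisy measurements. The noise variances and data here are assumed values for illustration, not taken from the book.

```python
# Minimal scalar Kalman filter: predict-correct cycle for estimating a
# constant state x from noisy measurements z_k = x + v_k.
# q, r, and the measurement data are illustrative assumptions.

def kalman_filter(measurements, q=1e-5, r=0.1**2, x0=0.0, p0=1.0):
    """Return the state estimate after each predict-correct step."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: random-walk model, state unchanged, uncertainty grows.
        p = p + q
        # Correct: blend prediction with measurement via the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

est = kalman_filter([0.39, 0.50, 0.48, 0.29, 0.25, 0.32, 0.34, 0.48, 0.41, 0.45])
# The estimates settle near the sample mean of the measurements (~0.39).
```

As the gain k shrinks with the error covariance p, new measurements are weighted less, which is exactly the corrector step confining the estimate near the true state.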

Perfect for researchers, professors, and graduate students in engineering, computer science, applied mathematics, and artificial intelligence, Nonlinear Filters: Theory and Applications will also earn a place in the libraries of those studying or practicing in fields involving pandemic diseases, cybersecurity, information fusion, augmented reality, autonomous driving, urban traffic networks, navigation and tracking, robotics, power systems, hybrid technologies, and finance.
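The particle filter treated in Chapter 6 follows an importance-sampling-plus-resampling cycle. A minimal bootstrap sketch, assuming a scalar random-walk state observed in Gaussian noise (the model, noise levels, and data are illustrative, not from the book):

```python
# Minimal bootstrap particle filter for a scalar random-walk state
# observed in Gaussian noise. All parameters are assumed for illustration.
import math
import random

random.seed(0)

def particle_filter(measurements, n=500, q=0.1, r=0.5):
    """Sequential importance sampling with resampling at every step."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for z in measurements:
        # Propagate each particle through the random-walk motion model.
        particles = [x + random.gauss(0.0, q) for x in particles]
        # Weight by the Gaussian measurement likelihood p(z | x).
        weights = [math.exp(-0.5 * ((z - x) / r) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Point estimate: weighted posterior mean.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling to combat weight degeneracy.
        particles = random.choices(particles, weights=weights, k=n)
    return estimates

est = particle_filter([1.0, 1.1, 0.9, 1.0, 1.05])
```

Resampling at every step is the simplest scheme; Chapter 6's discussion of sample impoverishment concerns exactly the cost of resampling too aggressively.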
Table of Contents

List of Figures xiii
List of Tables xv
Preface xvii
Acknowledgments xix
Acronyms xxi
1 Introduction 1
1.1 State of a Dynamic System 1
1.2 State Estimation 1
1.3 Construals of Computing 2
1.4 Statistical Modeling 3
1.5 Vision for the Book 4
2 Observability 7
2.1 Introduction 7
2.2 State-Space Model 7
2.3 The Concept of Observability 9
2.4 Observability of Linear Time-Invariant Systems 10
2.4.1 Continuous-Time LTI Systems 10
2.4.2 Discrete-Time LTI Systems 12
2.4.3 Discretization of LTI Systems 14
2.5 Observability of Linear Time-Varying Systems 14
2.5.1 Continuous-Time LTV Systems 14
2.5.2 Discrete-Time LTV Systems 16
2.5.3 Discretization of LTV Systems 17
2.6 Observability of Nonlinear Systems 17
2.6.1 Continuous-Time Nonlinear Systems 18
2.6.2 Discrete-Time Nonlinear Systems 21
2.6.3 Discretization of Nonlinear Systems 22
2.7 Observability of Stochastic Systems 23
2.8 Degree of Observability 25
2.9 Invertibility 26
2.10 Concluding Remarks 27
3 Observers 29
3.1 Introduction 29
3.2 Luenberger Observer 30
3.3 Extended Luenberger-Type Observer 31
3.4 Sliding-Mode Observer 33
3.5 Unknown-Input Observer 35
3.6 Concluding Remarks 39
4 Bayesian Paradigm and Optimal Nonlinear Filtering 41
4.1 Introduction 41
4.2 Bayes' Rule 42
4.3 Optimal Nonlinear Filtering 42
4.4 Fisher Information 45
4.5 Posterior Cramér-Rao Lower Bound 46
4.6 Concluding Remarks 47
5 Kalman Filter 49
5.1 Introduction 49
5.2 Kalman Filter 50
5.3 Kalman Smoother 53
5.4 Information Filter 54
5.5 Extended Kalman Filter 54
5.6 Extended Information Filter 54
5.7 Divided-Difference Filter 54
5.8 Unscented Kalman Filter 60
5.9 Cubature Kalman Filter 60
5.10 Generalized PID Filter 64
5.11 Gaussian-Sum Filter 65
5.12 Applications 67
5.12.1 Information Fusion 67
5.12.2 Augmented Reality 67
5.12.3 Urban Traffic Network 67
5.12.4 Cybersecurity of Power Systems 67
5.12.5 Incidence of Influenza 68
5.12.6 COVID-19 Pandemic 68
5.13 Concluding Remarks 70
6 Particle Filter 71
6.1 Introduction 71
6.2 Monte Carlo Method 72
6.3 Importance Sampling 72
6.4 Sequential Importance Sampling 73
6.5 Resampling 75
6.6 Sample Impoverishment 76
6.7 Choosing the Proposal Distribution 77
6.8 Generic Particle Filter 78
6.9 Applications 81
6.9.1 Simultaneous Localization and Mapping 81
6.10 Concluding Remarks 82
7 Smooth Variable-Structure Filter 85
7.1 Introduction 85
7.2 The Switching Gain 86
7.3 Stability Analysis 90
7.4 Smoothing Subspace 93
7.5 Filter Corrective Term for Linear Systems 96
7.6 Filter Corrective Term for Nonlinear Systems 102
7.7 Bias Compensation 105
7.8 The Secondary Performance Indicator 107
7.9 Second-Order Smooth Variable Structure Filter 108
7.10 Optimal Smoothing Boundary Design 108
7.11 Combination of SVSF with Other Filters 110
7.12 Applications 110
7.12.1 Multiple Target Tracking 111
7.12.2 Battery State-of-Charge Estimation 111
7.12.3 Robotics 111
7.13 Concluding Remarks 111
8 Deep Learning 113
8.1 Introduction 113
8.2 Gradient Descent 114
8.3 Stochastic Gradient Descent 115
8.4 Natural Gradient Descent 119
8.5 Neural Networks 120
8.6 Backpropagation 122
8.7 Backpropagation Through Time 122
8.8 Regularization 122
8.9 Initialization 125
8.10 Convolutional Neural Network 125
8.11 Long Short-Term Memory 127
8.12 Hebbian Learning 129
8.13 Gibbs Sampling 131
8.14 Boltzmann Machine 131
8.15 Autoencoder 135
8.16 Generative Adversarial Network 136
8.17 Transformer 137
8.18 Concluding Remarks 139
9 Deep Learning-Based Filters 141
9.1 Introduction 141
9.2 Variational Inference 142
9.3 Amortized Variational Inference 144
9.4 Deep Kalman Filter 144
9.5 Backpropagation Kalman Filter 146
9.6 Differentiable Particle Filter 148
9.7 Deep Rao-Blackwellized Particle Filter 152
9.8 Deep Variational Bayes Filter 158
9.9 Kalman Variational Autoencoder 167
9.10 Deep Variational Information Bottleneck 172
9.11 Wasserstein Distributionally Robust Kalman Filter 176
9.12 Hierarchical Invertible Neural Transport 178
9.13 Applications 182
9.13.1 Prediction of Drug Effect 182
9.13.2 Autonomous Driving 183
9.14 Concluding Remarks 183
10 Expectation Maximization 185
10.1 Introduction 185
10.2 Expectation Maximization Algorithm 185
10.3 Particle Expectation Maximization 188
10.4 Expectation Maximization for Gaussian Mixture Models 190
10.5 Neural Expectation Maximization 191
10.6 Relational Neural Expectation Maximization 194
10.7 Variational Filtering Expectation Maximization 196
10.8 Amortized Variational Filtering Expectation Maximization 198
10.9 Applications 199
10.9.1 Stochastic Volatility 199
10.9.2 Physical Reasoning 200
10.9.3 Speech, Music, and Video Modeling 200
10.10 Concluding Remarks 201
11 Reinforcement Learning-Based Filter 203
11.1 Introduction 203
11.2 Reinforcement Learning 204
11.3 Variational Inference as Reinforcement Learning 207
11.4 Application 210
11.4.1 Battery State-of-Charge Estimation 210
11.5 Concluding Remarks 210
12 Nonparametric Bayesian Models 213
12.1 Introduction 213
12.2 Parametric vs Nonparametric Models 213
12.3 Measure-Theoretic Probability 214
12.4 Exchangeability 219
12.5 Kolmogorov Extension Theorem 221
12.6 Extension of Bayesian Models 223
12.7 Conjugacy 224
12.8 Construction of Nonparametric Bayesian Models 226
12.9 Posterior Computability 227
12.10 Algorithmic Sufficiency 228
12.11 Applications 232
12.11.1 Multiple Object Tracking 233
12.11.2 Data-Driven Probabilistic Optimal Power Flow 233
12.11.3 Analyzing Single-Molecule Tracks 233
12.12 Concluding Remarks 233
References 235
Index 253

Peyman Setoodeh, PhD, is Visiting Professor with the Centre for Mechatronics and Hybrid Technologies (CMHT) at McMaster University. He is a Senior Member of the IEEE.

Saeid Habibi, PhD, is Professor and former Chair of the Department of Mechanical Engineering and the Director of the Centre for Mechatronics and Hybrid Technologies (CMHT) at McMaster University. He is a Fellow of the ASME and the CSME as well as a Canada Research Chair and a Senior NSERC Industrial Research Chair.

Simon Haykin, PhD, is Distinguished University Professor with the Department of Electrical and Computer Engineering and the Director of the Cognitive Systems Laboratory (CSL) at McMaster University. He is a Fellow of the IEEE and the Royal Society of Canada. He is a recipient of the Henry Booker Gold Medal from the International Union of Radio Science, the IEEE James H. Mulligan Jr. Education Medal, and the IEEE Denis J. Picard Medal for Radar Technologies and Applications.