Preface | iii
Glossary | ix
1 | 1
1.1 | 1
1.2 | 4
1.3 | 6
2 Probability Theory and Stochastic Processes | 7
2.1 | 7
2.2 Introduction to Probability Theory | 9
2.2.1 Events and Random Variables | 11
2.2.1.1 Types of variables | 13
2.2.2 Probability Definition | 14
2.2.3 Axioms and Properties | 19
2.3 Probability Density Function | 22
2.4 | 24
2.5 | 27
3 Discrete Hidden Markov Models | 29
3.1 | 29
3.2 Hidden Markov Model Dynamics | 32
3.2.1 The Forward Algorithm | 37
3.2.2 The Backward Algorithm | 45
3.2.3 The Viterbi Algorithm | 51
3.3 Probability Transitions Estimation | 60
3.3.1 Maximum Likelihood Definition | 62
3.3.2 The Baum-Welch Training Algorithm | 63
3.3.2.1 Operation conditions for the Baum-Welch algorithm | 72
3.3.2.2 Parameter estimation using multiple trials | 73
3.3.2.3 Baum-Welch algorithm numerical stability | 74
3.4 Viterbi Training Algorithm | 78
3.5 Gradient-based Algorithms | 82
3.5.1 Partial Derivative of k | 82
3.5.1.1 Partial derivative of k with respect to a_ij | 82
3.5.1.2 Partial derivative of k with respect to b_ij | 90
3.5.2 Partial Derivative of k with respect to c | 97
3.5.3 Performance Analysis of the Re-estimation Formulas | 99
3.5.4 Parameter Coercion by Re-parameterization | 104
3.5.5 | 107
3.5.5.1 Linear equality constraints | 110
3.5.5.2 Lagrange multipliers and Karush-Kuhn-Tucker conditions | 117
3.5.5.3 Linear inequality constraints | 135
3.5.5.4 Putting it all together | 143
3.5.5.5 Rosen's method applied to hidden Markov models | 148
3.6 Architectures for Markov Models | 153
3.7 | 154
4 Continuous Hidden Markov Models | 157
4.1 | 158
4.2 Probability Density Functions and Gaussian Mixtures | 162
4.2.1 Gaussian Functions in System Modeling | 163
4.2.2 Gaussian Function and Gaussian Mixture | 167
4.3 Continuous Hidden Markov Model Dynamics | 190
4.3.1 Forward, Backward and Viterbi Algorithms Revisited | 192
4.4 Continuous Observations Baum-Welch Training Algorithm | 197
4.5 | 204
5 Autoregressive Markov Models | 207
5.1 | 207
5.2 | 209
5.3 Likelihood and Probability Density for AR Models | 213
5.3.1 AR Model Probability Density Function | 213
5.3.2 Autoregressive Model Likelihood | 219
5.4 Likelihood of an ARMM | 221
5.5 ARMM Parameter Estimation | 223
5.5.1 Parameter Estimation | 226
5.6 Time Series Prediction with ARMM | 234
5.6.1 One Step Ahead Time Series Prediction | 235
5.6.2 Multiple Steps Ahead Time Series Prediction | 237
5.7 | 242
6 | 245
6.1 Cardiotocography Classification | 246
6.2 Solar Radiation Prediction | 250
6.3 | 255
References | 257
Index | 261
Color Figures Section | 265