Preface  xi
Acknowledgments  xvii
|
Part I From data to methods |
|
|
|
Chapter 1 Patterns, objects, and features  3
1.2.3.1 Delineating segments  5
1.2.3.2 Delineating regions  8
1.2.4.1 Karhunen-Loève transformation (Principal Component Analysis)  10
1.2.4.2 Independent Component Analysis  12
1.2.4.3 Fourier transform  13
1.2.4.4 Short-time Fourier transform and spectrograms  14
1.2.4.5 Discrete wavelet transforms  15
1.2.5 Standardization, normalization, and other preprocessing steps  21
1.2.6 Curse of dimensionality  23
Appendix 1 Basic notions on statistics  27
A1.1 Statistical parameters of an ensemble  27
A1.2 Distinction of ensembles  31
|
Chapter 2 Supervised learning  33
2.2 Discriminant analysis  35
2.2.1 Test ban treaty: some history  35
2.2.2 The Ms-mb criterion for nuclear test identification  35
2.2.3 Linear Discriminant Analysis  36
2.3 The linear perceptron  40
2.4 Solving the XOR problem: classification using multilayer perceptrons (MLPs)  46
2.4.1 Nonlinear perceptrons  48
2.5 Support vector machines (SVMs)  53
2.5.2 Nonlinear SVM, kernels  56
2.6 Hidden Markov Models (HMMs)/sequential data  59
2.6.1 Background: from patterns and classes to sequences and processes  59
2.6.2 The three problems of HMMs  62
2.6.3 Including prior knowledge/model dimensions and topology  67
2.6.4 Extension to conditional random fields  68
Appendix 2.1 Fisher's linear discriminant analysis  73
Appendix 2.2 The perceptron  76
Appendix 2.3 SVM optimization of the margins  78
Appendix 2.4 Hidden Markov models  80
Appendix 2.4.1 Evaluation  80
Appendix 2.4.2 Decoding: the Viterbi algorithm  82
Appendix 2.4.3 Training: the expectation-maximization/Baum-Welch algorithm  83
|
Chapter 3 Unsupervised learning  87
3.1.1 Metrics of (dis)similarity  88
3.1.2.1 Partitioning clustering  91
3.1.2.2 Hierarchical clustering  104
3.1.2.3 Density-based clustering  109
Appendix 3.1 Analysis of variance (ANOVA)  119
Appendix 3.2 Minimum distance property for the determinant criterion  120
|
Part II Example applications |
|
|
|
Chapter 4 Applications of supervised learning  127
4.2 Classification of seismic waveforms recorded on volcanoes  128
4.2.1 Signal classification of explosion quakes at Stromboli  132
4.2.2 Cross-validation issues  137
4.3 Infrasound classification  139
4.3.1 Infrasound monitoring at Mt Etna: classification with SVM  141
4.4 SVM classification of rocks  144
4.5.1 Identification of parameters governing seismic waveforms  151
4.5.2 Integrated inversion of geophysical data  152
4.6 MLP in regression and interpolation  158
4.7.2 Brief considerations on pros and cons of SVM and MLP in regression problems  167
4.8 Classification by hidden Markov models and dynamic Bayesian networks: application to seismic waveforms of tectonic, volcanic and lunar origin  168
4.8.2 Signals related to volcanic and tectonic activity  169
4.8.3 Classification of icequake and nonterrestrial seismic waveforms as a basis for further research: HMM  174
4.8.3.3 Classification of seismic waveforms using dynamic Bayesian networks  179
4.9 Natural hazard analyses: HMMs and BNs  180
4.9.1 Estimating volcanic unrest  180
4.9.2 Reasoning under uncertainty: tsunami early warning tasks  181
Appendix 4.1 Normalization issues  183
Appendix 4.2 SVM Regression  184
Appendix 4.3 Bias-Variance Trade-off in Curve Fitting  185
|
Chapter 5 Applications with unsupervised learning  189
5.2 Cluster analysis of volcanic tremor data  191
5.3 Density-based clustering  197
5.5 Monitoring spectral characteristics of seismic signals and volcano alert  213
Appendix 5.1 Davies-Bouldin index  230
Appendix 5.3 Silhouette index  232
Appendix 5.5 Variation of information  233
|
Part III A posteriori analysis |
|
|
|
Chapter 6 A posteriori analyses: advantages and pitfalls of pattern recognition techniques  237
6.7.1 Multilayer perceptrons  257
6.7.2 Support Vector Machines  258
6.7.3 MLP and SVM in regression analysis  258
6.7.4 Hidden Markov models and Bayesian networks  258
6.7.5 Supervised and unsupervised learning  259
|
Chapter 7 Software manuals  261
7.1 Example scripts related to Chapter 2  261
7.1.1 Linear discrimination, principal components, and marginal distributions  261
7.1.3 Support Vector Machines  268
7.1.4 HMM example routines  272
7.2 Example scripts and programs related to Chapter 3 (unsupervised learning)  275
7.2.3 Expectation maximization clusters  278
7.2.5 Hierarchical clustering  280
7.2.6 Density-based clustering  280
7.2.7 Unsupervised learning toolbox: KKAnalysis  282
7.2.7.5 Configuring KKAnalysis: the "settings"  296
7.3 Programs related to applications (Chapter 4)  300
7.3.1 Back propagation neural network (BPNN)  300
7.4.1 DMGA: generating ground deformation, magnetic and gravity data  307
7.4.2 Treating fault plane solution data  311
Bibliography  315
Index  327