Preface  vii
Author Biographies  ix
Chapter 1 Overview of support vector machines  1
1.2.1 Maximal Interval Linear Classifier  3
1.2.2 Kernel Functions and Kernel Matrix  5
1.2.3 Optimization Theory  7
1.3 Elements of Support Vector Machines  8
1.4 Applications of Support Vector Machines  10
Chapter 2 Support vector machines for classification and regression  15
2.2 Kernel Functions and Dimension Superiority  16
2.2.1 Notion of Kernel Functions  16
2.3 Support Vector Machines for Classification  18
2.3.1 Computing SVMs for Linearly Separable Case  20
2.3.2 Computing SVMs for Linearly Inseparable Case  22
2.3.2.1 Slack Variable-Based "Soft Margin" Technique  23
2.3.2.2 Kernel Function-Based Nonlinear Mapping  23
2.3.3 Application of SVC to Simulated Data  25
2.4 Support Vector Machines for Regression  27
2.4.1 ε-Band and ε-Insensitive Loss Function  27
2.4.4 Application of SVR to Simulated Data  30
2.5 Parametric Optimization for Support Vector Machines  35
2.6 Variable Selection for Support Vector Machines  39
2.7 Related Materials and Comments  39
2.7.2 Kernel Functions and Quadratic Programming  41
2.7.3 Dimension Increasing versus Dimension Reducing  41
Appendix A Computation of Slack Variable-Based SVMs  43
Appendix B Computation of Linear ε-SVR  44
Chapter 3 Kernel methods  49
3.2 Kernel Methods: Three Key Ingredients  51
3.2.1 Primal and Dual Forms  51
3.2.3 Kernel Function and Kernel Matrix  57
3.3 Modularity of Kernel Methods  61
3.4 Kernel Principal Component Analysis  62
3.5 Kernel Partial Least Squares  65
3.6 Kernel Fisher Discriminant Analysis  67
3.7 Relationship between Kernel Function and SVMs  68
3.8 Kernel Matrix Pretreatment  72
Chapter 4 Ensemble learning of support vector machines  77
4.2.1 Idea of Ensemble Learning  78
4.2.2 Diversity of Ensemble Learning  79
4.3 Bagging Support Vector Machines  80
4.4 Boosting Support Vector Machines  81
4.4.1 Boosting: A Simple Example  81
4.4.2 Boosting SVMs for Classification  83
4.4.3 Boosting SVMs for Regression  86
4.4.4 Further Consideration  88
Chapter 5 Support vector machines applied to near-infrared spectroscopy  95
5.2 Near-Infrared Spectroscopy  96
5.3 Support Vector Machines for Classification of Near-Infrared Data  98
5.3.1 Recognition of Blended Vinegar Based on Near-Infrared Spectroscopy  98
5.3.2 Related Work on Support Vector Classification on NIR  104
5.4 Support Vector Machines for Quantitative Analysis of Near-Infrared Data  105
5.4.1 Correlating Diesel Boiling Points with NIR Spectra Using SVR  105
5.4.2 Related Work on Support Vector Regression on NIR  108
Chapter 6 Support vector machines and QSAR/QSPR  115
6.2 Quantitative Structure-Activity/Property Relationship  116
6.2.1 History of QSAR/QSPR and Molecular Descriptors  116
6.2.2 Principles for QSAR Modeling  119
6.3 Related QSAR/QSPR Studies Using SVMs  120
6.4 Support Vector Machines for Regression  121
6.4.1 Dataset Description  121
6.4.2 Molecular Modeling and Descriptor Calculation  122
6.4.3 Feature Selection Using a Generalized Cross-Validation Program  122
6.4.4 Model Internal Validation  125
6.4.5 PLS Regression Model  126
6.4.6 BPN Regression Model  127
6.4.8 Applicability Domain and External Validation  135
6.4.9 Model Interpretation  138
6.5 Support Vector Machines for Classification  139
6.5.1 Two-Step Algorithm: KPCA Plus LSVM  140
6.5.2 Dataset Description  141
6.5.3 Performance Evaluation  142
6.5.4 Effects of Model Parameters  142
6.5.5 Prediction Results for Three SAR Datasets  143
Chapter 7 Support vector machines applied to traditional Chinese medicine  149
7.2 Traditional Chinese Medicines and Their Quality Control  149
7.3 Recognition of Authentic PCR and PCRV Using SVM  154
7.3.3 Recognition of Authentic PCR and PCRV Using Whole Chromatography  155
7.3.4 Variable Selection Improves Performance of SVM  161
Chapter 8 Support vector machines applied to OMICS study  173
8.2 A Brief Description of OMICS Study  173
8.3 Support Vector Machines in Genomics  175
8.4 Support Vector Machines for Identifying Proteotypic Peptides in Proteomics  179
8.5 Biomarker Discovery in Metabolomics Using Support Vector Machines  188
Index  195