
Ensemble Methods: Foundations and Algorithms 2nd edition [Hardback]

Zhi-Hua Zhou (Nanjing University, China)
  • Format: Hardback, 348 pages, height x width: 234x156 mm, weight: 830 g, 4 tables, black and white; 43 line drawings, color; 27 line drawings, black and white; 43 illustrations, color; 27 illustrations, black and white
  • Series: Chapman & Hall/CRC Machine Learning & Pattern Recognition
  • Publication date: 09-Mar-2025
  • Publisher: Chapman & Hall/CRC
  • ISBN-10: 1032960604
  • ISBN-13: 9781032960609

Ensemble methods, which train multiple learners and then combine them, with Boosting and Bagging as representative examples, are well-known machine learning approaches. It is widely accepted that an ensemble is usually significantly more accurate than a single learner, and ensemble methods have achieved great success in a wide range of real-world tasks.
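
As a minimal illustration of the "train multiple learners and then combine them" idea, the sketch below implements Bagging from scratch in Python; the function names, the NumPy-array inputs, and the choice of decision trees as base learners are illustrative assumptions, not code from the book.

    # Minimal Bagging sketch (illustrative; not code from the book):
    # train each base learner on a bootstrap sample of the data, then
    # combine the learners' predictions by majority vote.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def bagging_fit(X, y, n_learners=25, seed=0):
        rng = np.random.default_rng(seed)
        n = len(X)
        learners = []
        for _ in range(n_learners):
            idx = rng.integers(0, n, size=n)  # bootstrap: n draws with replacement
            learners.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
        return learners

    def bagging_predict(learners, X):
        # votes[i, j] = class predicted by learner i for sample j
        votes = np.stack([t.predict(X) for t in learners]).astype(int)
        # majority vote per sample (assumes non-negative integer labels)
        return np.apply_along_axis(lambda v: np.bincount(v).argmax(), axis=0, arr=votes)

Because each tree sees a different bootstrap sample, the majority vote averages away much of the variance of the individual trees; this variance-reduction effect is what the Bagging chapter of the book develops formally.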

Twelve years have passed since the publication of the first edition of the book in 2012 (Japanese and Chinese editions published in 2017 and 2020, respectively), and many significant advances have been made in the field. First, many theoretical issues have been tackled; for example, the fundamental question of why AdaBoost seems resistant to overfitting has been addressed, so we now understand much more about the essence of ensemble methods. Second, ensemble methods have been developed for more areas of machine learning, e.g., isolation forest in anomaly detection, so we now have powerful ensemble methods for tasks beyond conventional supervised learning.
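
As a hedged sketch of the isolation forest idea mentioned above: anomalies are easier to isolate by random splits, so they receive shorter average path lengths and lower scores. The snippet below uses scikit-learn's IsolationForest; the synthetic data and parameter values are illustrative assumptions.

    # Hedged isolation forest sketch using scikit-learn's IsolationForest;
    # the synthetic data and parameter values are illustrative assumptions.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    X = np.vstack([
        rng.normal(0.0, 1.0, size=(500, 2)),   # dense cluster of inliers
        rng.uniform(-6.0, 6.0, size=(20, 2)),  # scattered outliers
    ])

    iso = IsolationForest(n_estimators=100, random_state=0).fit(X)
    labels = iso.predict(X)        # +1 = inlier, -1 = anomaly
    scores = iso.score_samples(X)  # lower score = more anomalous
    print(f"flagged {(labels == -1).sum()} of {len(X)} points as anomalies")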

Third, ensemble mechanisms have also been found helpful in emerging areas such as deep learning and online learning. This edition expands on the previous one to reflect these advances. It is written in a concise but comprehensive style so as not to intimidate readers new to the field, though the increased amount of content makes it nearly half again as long as the first edition.



Preface
Notations
1. Introduction
2. Boosting
3. Bagging
4. Combination Methods
5. Diversity
6. Ensemble Pruning
7. Clustering Ensemble
8. Anomaly Detection and Isolation Forest
9. Semi-Supervised Ensemble
10. Class-Imbalance and Cost-Sensitive Ensemble
11. Deep Learning and Deep Forest
12. Advanced Topics
References
Index

Zhi-Hua Zhou is Professor of Computer Science and Artificial Intelligence at Nanjing University, a trustee of IJCAI, a Fellow of the ACM, AAAI, AAAS, and IEEE, and a recipient of the IEEE Computer Society Edward J. McCluskey Technical Achievement Award and the CCF-ACM Artificial Intelligence Award.