
Algorithmic High-Dimensional Robust Statistics [Hardback]

Ilias Diakonikolas (University of Wisconsin-Madison), Daniel M. Kane (University of California, San Diego)
  • Format: Hardback, 300 pages, height x width x thickness: 236x158x22 mm, weight: 580 g, worked examples and exercises
  • Publication date: 07-Sep-2023
  • Publisher: Cambridge University Press
  • ISBN-10: 1108837816
  • ISBN-13: 9781108837811

This reference text offers a clear, unified treatment for graduate students, academic researchers, and professionals interested in understanding and developing statistical procedures for high-dimensional data that are robust to deviations from idealized modeling assumptions, including model misspecification and adversarial outliers in the dataset.

Robust statistics is the study of designing estimators that perform well even when the dataset deviates significantly from the idealized modeling assumptions, such as in the presence of model misspecification or adversarial outliers. Classical statistical theory, dating back to pioneering works by Tukey and Huber, characterizes the information-theoretic limits of robust estimation for most common problems. A recent line of work in computer science gave the first computationally efficient robust estimators in high dimensions for a range of learning tasks. This reference text for graduate students, researchers, and professionals in machine learning theory provides an overview of recent developments in algorithmic high-dimensional robust statistics, presenting the underlying ideas in a clear and unified manner, while leveraging new perspectives on the developed techniques to provide streamlined proofs of these results. The most basic and illustrative results are analyzed in each chapter, while more tangential developments are explored in the exercises.
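As a one-dimensional illustration of the phenomenon the book studies (a sketch not taken from the book itself), a small fraction of adversarial outliers can drag the empirical mean far from the true mean, while a robust estimator such as the median barely moves. The book's subject is achieving comparable robustness with efficient algorithms in high dimensions, where simple coordinate-wise fixes no longer suffice:

```python
import numpy as np

rng = np.random.default_rng(0)

# Inliers: 1000 samples from a standard Gaussian with true mean 0.
inliers = rng.normal(loc=0.0, scale=1.0, size=1000)

# ~5% adversarial contamination: 50 points placed far away at 100.
outliers = np.full(50, 100.0)
data = np.concatenate([inliers, outliers])

# The empirical mean is pulled to roughly 50 * 100 / 1050 ~ 4.8,
# while the median stays close to the true mean 0.
print(np.mean(data))
print(np.median(data))
```

In high dimensions the naive analogue, the coordinate-wise median, incurs an error growing with the dimension; the efficient estimators surveyed in the book achieve dimension-independent error under such contamination.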

Reviews

'This is a timely book on efficient algorithms for computing robust statistics from noisy data. It presents lucid intuitive descriptions of the algorithms as well as precise statements of results with rigorous proofs - a nice combination indeed. The topic has seen fundamental breakthroughs over the last few years and the authors are among the leading contributors. The reader will get a ringside view of the developments.' Ravi Kannan, Visiting Professor, Indian Institute of Science

'This volume was designed as a graduate textbook for a one-semester course, but it could also be useful for researchers and professionals in machine learning. While the foundational knowledge in computer science and statistics required is high, certain upper-level undergraduates could start their studies here. Recommended.' J. J. Meier, Choice

Additional information

This book presents general principles and scalable methodologies to deal with adversarial outliers in high-dimensional datasets.
1. Introduction to robust statistics;
2. Efficient high-dimensional robust mean estimation;
3. Algorithmic refinements in robust mean estimation;
4. Robust covariance estimation;
5. List-decodable learning;
6. Robust estimation via higher moments;
7. Robust supervised learning;
8. Information-computation tradeoffs in high-dimensional robust statistics;
A. Mathematical background;
References; Index.
Ilias Diakonikolas is an associate professor of computer science at the University of Wisconsin-Madison. His current research focuses on the algorithmic foundations of machine learning. Diakonikolas is a recipient of a number of research awards, including the best paper award at NeurIPS 2019. Daniel M. Kane is an associate professor at the University of California, San Diego in the departments of Computer Science and Mathematics. He is a four-time Putnam Fellow and two-time IMO gold medallist. Kane's research interests include number theory, combinatorics, computational complexity, and computational statistics.