
Information Theory, Probability and Statistical Learning: A Festschrift in Honor of Andrew Barron [Hardback]

Edited by Ioannis Kontoyiannis, Jason M. Klusowski, and Cynthia Rush
  • Format: Hardback, 512 pages, height x width: 235x155 mm, 31 color illustrations; 24 black-and-white illustrations
  • Publication date: 13-May-2026
  • Publisher: Springer Nature Switzerland AG
  • ISBN-10: 3032139910
  • ISBN-13: 9783032139917
In 2024, Andrew Barron turned 65 and retired. This Festschrift volume honors his career and contributions. Andrew R. Barron, a professor of Statistics and Data Science at Yale University, has been one of the most influential figures in information theory research over the past 40 years. He has made profound, broad, and consistent contributions to information theory, as well as to its interactions with probability theory, statistical learning, and neural networks. From his Ph.D. thesis work in 1985 until today, Barron has been recognized as a leader in both information theory and statistics, especially in the area where the two fields intersect and cross-fertilize. There has been a powerful tradition of important work on this interface, and it has had a strong impact on both fields. Through the introduction of novel ideas and techniques, and through his outstanding scholarship, Barron has clarified some of the foundations of the mathematical and statistical side of Shannon theory, and he has helped solidify our understanding of the connection between information theory and statistics.

This volume consists of invited papers by prominent researchers who, either personally or through the topics of their work, have a connection with Barron. The papers are written by people working in all three areas where Barron has made major contributions: information theory, probability, and statistical learning. These topics are especially timely given the major current activity in all three areas, particularly in connection with the explosive advances in machine learning theory and its applications.
Contents:
  • Information Theory
  • Probability Theory
  • Statistical Learning
Ioannis Kontoyiannis is with the Department of Pure Mathematics and Mathematical Statistics at the University of Cambridge, where he is the Churchill Professor of Mathematics of Information. He has been awarded the Manning endowed assistant professorship, a Sloan Foundation Research Fellowship, an honorary Master of Arts degree Ad Eundem by Brown University, and a Marie Curie Fellowship. He is a Fellow of the IEEE, the IMS, the AAIA, and the AIAA. He has published over 60 articles in leading international journals and over 120 papers in top international conferences. He also holds two U.S. patents and has authored a textbook on probability theory. He has served on the editorial boards of the American Mathematical Society's Quarterly of Applied Mathematics, the IEEE Transactions on Information Theory, Springer-Verlag's Acta Applicandae Mathematicae, the book series Lecture Notes in Mathematics by Springer-Verlag, and the online journal Entropy.



Jason M. Klusowski is an Assistant Professor in the Department of Operations Research and Financial Engineering (ORFE) at Princeton University. Prior to joining Princeton, he was an Assistant Professor in the Department of Statistics at Rutgers University-New Brunswick. He received a Ph.D. in Statistics and Data Science from Yale University. His research explores the tradeoffs among interpretability, statistical accuracy, and computational feasibility in large-scale, data-driven systems. He is a recipient of the Alfred P. Sloan Research Fellowship in Mathematics, the National Science Foundation (NSF) CAREER Award, and the Howard B. Wentz, Jr., Junior Faculty Award from Princeton's School of Engineering and Applied Science (SEAS). He currently serves as an Associate Editor for the probability and statistics journal Bernoulli.



Cynthia Rush is an Associate Professor of Statistics at Columbia University. She received a Ph.D. and an M.A. in Statistics from Yale University in 2016 and 2011, respectively, and a B.S. in Mathematics from the University of North Carolina at Chapel Hill in 2010. Her research focuses on high-dimensional statistics, message passing algorithms, statistical robustness, and information theory. She currently serves as an Associate Editor for Bernoulli and the IEEE Transactions on Information Theory.