
Advancements in Knowledge Distillation: Towards New Horizons of Intelligent Systems, 2023 ed. [Paperback]

  • Format: Paperback / softback, VIII + 232 pages, height x width: 235x155 mm, 70 illustrations (51 in color, 19 black and white)
  • Series: Studies in Computational Intelligence 1100
  • Publication date: 15-Jun-2024
  • Publisher: Springer International Publishing AG
  • ISBN-10: 3031320972
  • ISBN-13: 9783031320972
  • Paperback
  • Price: 187,67 €*
  • * the price is final, i.e. no further discounts apply
  • Regular price: 220,79 €
  • You save 15%
  • Delivery from the publisher takes approximately 2-4 weeks
  • Free shipping
  • Order time: 2-4 weeks

The book provides timely coverage of the paradigm of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned in the general setting of transfer learning: a lightweight student model is learned from a large teacher model. The book covers a variety of training schemes, teacher–student architectures, and distillation algorithms, together with recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. It is relevant to a broad audience of researchers and practitioners active in machine learning and pursuing fundamental and applied research on advanced learning paradigms.
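To make the paradigm concrete, the sketch below shows the classic response-based distillation loss (in the sense of Hinton-style distillation, which the first chapter's category of "response-based" distillation refers to): the student is trained on a weighted sum of the usual cross-entropy on hard labels and a KL divergence between temperature-softened teacher and student outputs. This is a minimal illustrative example in PyTorch, not code from the book; the function name and the hyperparameters T and alpha are assumptions chosen for the sketch.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Minimal response-based knowledge distillation loss (illustrative sketch).

    Combines hard-label cross-entropy with a KL divergence between
    temperature-softened teacher and student output distributions.
    """
    # Soft targets: soften both distributions with temperature T.
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    # T^2 keeps the gradient magnitude of the soft term comparable
    # to the hard-label term as T grows.
    kd_term = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Hard targets: ordinary cross-entropy on the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```

In a training loop, the teacher's logits are computed without gradients (e.g. under torch.no_grad()) and only the student's parameters are updated, which is what yields the lightweight student model described above.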

Contents:
  • Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
  • A Geometric Perspective on Feature-Based Distillation
  • Knowledge Distillation Across Vision and Language
  • Knowledge Distillation in Granular Fuzzy Models by Solving Fuzzy Relation Equations
  • Ensemble Knowledge Distillation for Edge Intelligence in Medical Applications
  • Self-Distillation with the New Paradigm in Multi-Task Learning
  • Knowledge Distillation for Autonomous Intelligent Unmanned System