  • Format: EPUB+DRM
  • Price: 58,49 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.
  • Pages: 220
  • Publication date: 28-Jun-2021
  • Publisher: Chapman & Hall/CRC
  • Language: English
  • ISBN-13: 9781000400823

DRM restrictions

  • Copying (copy/paste): not allowed
  • Printing: not allowed
  • Usage:

    Digital Rights Management (DRM)
    The publisher has issued this e-book in encrypted form, which means you must install dedicated software to read it. You also need to create an Adobe ID. More info here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android).

    To read on a PC or Mac, install Adobe Digital Editions. (This is a free application designed specifically for reading e-books. It should not be confused with Adobe Reader, which is probably already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

Recent progress in artificial intelligence (AI) has revolutionized our everyday life. Many AI algorithms have reached human-level performance, and AI agents are replacing humans in many professions. It is predicted that this trend will continue and that 30% of work activities in 60% of current occupations will be automated.

This success, however, is conditioned on the availability of huge annotated datasets for training AI models. Data annotation is a time-consuming and expensive task that is still performed by human workers. Learning efficiently from less data is a next step toward making AI more similar to natural intelligence. Transfer learning has been suggested as a remedy to relax the need for data annotation. The core idea of transfer learning is to transfer knowledge across similar tasks, using similarities and previously learned knowledge to learn more efficiently.
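To make the core idea concrete, here is a minimal sketch of the classic transfer-learning recipe, assuming PyTorch: an encoder pretrained on a data-rich source task is frozen, and only a small head is trained on the data-scarce target task. The encoder, dimensions, and synthetic data below are illustrative assumptions, not the book's implementation.

    # A minimal sketch, assuming PyTorch: reuse an encoder pretrained on a
    # data-rich source task and adapt only a small head on the target task.
    import torch
    import torch.nn as nn

    # Stand-in for a feature extractor already trained on a large source task.
    source_encoder = nn.Sequential(
        nn.Linear(32, 64), nn.ReLU(),
        nn.Linear(64, 16),  # 16-dimensional embedding space
    )

    # Freeze the transferred knowledge so it is reused, not relearned.
    for p in source_encoder.parameters():
        p.requires_grad = False

    # Learn only a lightweight classifier head for the new target task.
    target_head = nn.Linear(16, 3)  # e.g. 3 target classes
    opt = torch.optim.Adam(target_head.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # Tiny synthetic target set: learning from few labels is the point.
    x = torch.randn(24, 32)
    y = torch.randint(0, 3, (24,))

    for step in range(100):
        opt.zero_grad()
        logits = target_head(source_encoder(x))
        loss = loss_fn(logits, y)
        loss.backward()
        opt.step()

Because only the head's few parameters are updated, far fewer labeled target examples are needed than when training from scratch.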

In this book, we provide a brief background on transfer learning and then focus on the idea of transferring knowledge through intermediate embedding spaces. The idea is to couple and relate different learning tasks through embedding spaces that encode task-level relations and similarities. We cover various machine learning scenarios and demonstrate that this idea can be used to overcome the challenges of zero-shot learning, few-shot learning, domain adaptation, continual learning, lifelong learning, and collaborative learning.
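As a flavor of how an intermediate embedding space can couple two tasks, the hedged sketch below maps visual features and class-attribute vectors into one shared space with a contrastive-style loss, then labels an image from a class never seen in training by nearest-neighbor matching in that space, in the spirit of zero-shot learning. The encoders, dimensions, temperature, and data are illustrative assumptions, not the methods developed in the book.

    # A hedged sketch, assuming PyTorch: couple two modalities in a shared
    # embedding space, then classify an unseen class by nearest neighbour.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    visual_encoder = nn.Linear(512, 64)   # image features   -> shared space
    semantic_encoder = nn.Linear(85, 64)  # class attributes -> shared space

    params = list(visual_encoder.parameters()) + list(semantic_encoder.parameters())
    opt = torch.optim.Adam(params, lr=1e-3)

    # Synthetic pairs of image features and their class-attribute vectors.
    img = torch.randn(32, 512)
    attr = torch.randn(32, 85)

    for step in range(200):
        opt.zero_grad()
        z_img = F.normalize(visual_encoder(img), dim=1)
        z_attr = F.normalize(semantic_encoder(attr), dim=1)
        # Pull matched image/attribute pairs together in the shared space.
        logits = z_img @ z_attr.t() / 0.1  # 0.1 is a temperature
        loss = F.cross_entropy(logits, torch.arange(32))
        loss.backward()
        opt.step()

    # Zero-shot prediction: embed attribute vectors of classes never seen
    # during training and pick the nearest one for a new image.
    unseen_attr = torch.randn(5, 85)
    z_classes = F.normalize(semantic_encoder(unseen_attr), dim=1)
    z_query = F.normalize(visual_encoder(torch.randn(1, 512)), dim=1)
    pred = (z_query @ z_classes.t()).argmax(dim=1)

The key design choice is that nothing task-specific is needed at test time: any class that can be described in the attribute space can be embedded and matched, which is what lets knowledge transfer to unseen tasks.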

Contents

  • Introduction
  • Background and Related Work
  • Zero-Shot Image Classification through Coupled Visual and Semantic Embedding Spaces
  • Learning a Discriminative Embedding for Unsupervised Domain Adaptation
  • Few-Shot Image Classification through Coupled Embedding Spaces
  • Cross-Task Knowledge Transfer
  • Lifelong Zero-Shot Learning Using High-Level Task Descriptors
  • Complementary Learning Systems Theory for Tackling Catastrophic Forgetting
  • Continual Concept Learning
  • Collective Lifelong Learning for Multi-Agent Networks
  • Concluding Remarks and Potential Future Research Directions
Mohammad Rostami is a computer scientist at USC Information Sciences Institute. He is a graduate of the University of Pennsylvania, the University of Waterloo, and Sharif University of Technology. His research areas include continual machine learning and learning in data-scarce regimes.