Knowledge distillation is a fundamental AI technique that allows researchers to create smaller, more efficient models by training a 'student' model using knowledge from a larger 'teacher' model. Originally developed by Geoffrey Hinton and colleagues at Google in 2015, the method works by having the teacher model share its 'soft' output probabilities with the student, so the student learns not only the correct answers but also how the teacher weighs the alternative ones.
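
To make the idea concrete, here is a minimal sketch of a distillation objective in PyTorch. The function name, temperature `T`, and mixing weight `alpha` are illustrative choices, not values taken from the article; it assumes a standard classification setup where both models output class logits.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soften both output distributions with temperature T, then match them
    # with KL divergence -- this is where the teacher's "dark knowledge"
    # (its relative confidence across wrong answers) reaches the student.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    kd_loss = F.kl_div(soft_student, soft_targets, reduction="batchmean") * (T * T)

    # Ordinary cross-entropy on the hard labels keeps the student anchored
    # to the true task.
    ce_loss = F.cross_entropy(student_logits, labels)

    # Blend the two objectives; alpha controls how much weight the teacher gets.
    return alpha * kd_loss + (1 - alpha) * ce_loss

# Hypothetical usage: teacher and student both score the same batch of inputs.
# loss = distillation_loss(student(x), teacher(x).detach(), y)
```

In this sketch the teacher's logits are detached so gradients only update the student, and the `T * T` factor keeps gradient magnitudes comparable as the temperature changes.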
