Neural Machine
Approximating the Universe
Knowledge Distillation
Knowledge distillation is a model compression method in which a small model is trained to mimic a larger, pretrained model.
Ujjwal Upadhyay
Apr 4, 2018
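To make the idea concrete, here is a minimal sketch of a common distillation objective, assuming PyTorch; the names `distillation_loss`, `temperature`, and `alpha` are illustrative, not from the original article. The student is trained on a blend of the teacher's softened output distribution and the ground-truth labels.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a soft-target loss (mimicking the teacher) with the usual
    hard-label cross-entropy on the ground-truth labels.
    Note: names and defaults here are illustrative assumptions."""
    # Soften both distributions with the temperature, then measure how far
    # the student is from the teacher with KL divergence.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_loss = F.kl_div(soft_student, soft_teacher, reduction="batchmean")
    # Scale by T^2 so gradient magnitudes stay comparable as T changes.
    soft_loss = soft_loss * (temperature ** 2)
    # Standard cross-entropy against the true labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

A higher temperature spreads probability mass over more classes, exposing the teacher's "dark knowledge" about which wrong answers are almost right; `alpha` trades off imitation of the teacher against fitting the true labels.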