Published in SyncedReview

Microsoft & Xiamen U’s Progressive Distillation Method Sets New SOTA for Dense Retrieval

Knowledge distillation is a classic approach for transferring knowledge from a powerful teacher model to a smaller student model. While it might be assumed that a stronger teacher model would naturally result in a stronger student model, this is not always the case — especially when the teacher-student gap is large. As Xiamen University and…
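For readers unfamiliar with the classic setup the article refers to, standard knowledge distillation (Hinton et al., 2015) trains the student to match the teacher's temperature-softened output distribution via a KL-divergence loss. The sketch below shows that standard objective only, not the progressive method from the paper; the function name and temperature value are illustrative choices.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Classic knowledge-distillation loss: KL divergence between the
    temperature-softened teacher and student distributions, scaled by
    T^2 so gradient magnitudes stay comparable across temperatures."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    return F.kl_div(log_student, soft_targets,
                    reduction="batchmean") * temperature ** 2
```

The failure mode the teaser describes follows from this objective: when the capacity gap between teacher and student is large, the teacher's soft targets encode a distribution the student cannot closely fit, so a stronger teacher does not guarantee a stronger student.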

Synced | AI Technology & Industry Review | syncedreview.com | Twitter: @Synced_Global