
Knowledge distillation: A good teacher is patient and consistent

  • Paper
  • Aug 21, 2022
  • #ArtificialIntelligence #ComputerScience
Authors:
  • Xiaohua Zhai (@XiaohuaZhai)
  • Rohan Anil (@_arohan_)
  • Amelie Royer (@royaleerieme)
  • Alexander Kolesnikov (@__kolesnikov__)
  • Lucas Beyer (@giffmana)
  • Larisa Markeeva (@re_rayne)
Read on arxiv.org

There is a growing discrepancy in computer vision between large-scale models that achieve state-of-the-art performance and models that are affordable in practical applications. In this paper we address this issue and significantly bridge the gap between these two types of models. Throughout our empirical investigation we do not aim to necessarily propose a new method, but strive to identify a robust and effective recipe for making state-of-the-art large-scale models affordable in practice. We demonstrate that, when performed correctly, knowledge distillation can be a powerful tool for reducing the size of large models without compromising their performance. In particular, we uncover that certain implicit design choices may drastically affect the effectiveness of distillation. Our key contribution is the explicit identification of these design choices, which were not previously articulated in the literature. We back up our findings with a comprehensive empirical study, demonstrate compelling results on a wide range of vision datasets and, in particular, obtain a state-of-the-art ResNet-50 model for ImageNet, which achieves 82.8% top-1 accuracy.

Mentions
Pavel Izmailov @Pavel_Izmailov · Jun 11, 2021
  • Post
  • From Twitter
Also, see this nice thread on a paper by @giffmana @XiaohuaZhai @__kolesnikov__ @_arohan_ @royaleerieme and Larisa Markeeva: The paper came out just two days ago and is very related!