Conference Paper: Distilling Knowledge via Knowledge Review

Title: Distilling Knowledge via Knowledge Review
Authors: Chen, Pengguang; Liu, Shu; Zhao, Hengshuang; Jia, Jiaya
Issue Date: 2021
Citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2021, p. 5006-5015
Abstract: Knowledge distillation transfers knowledge from a teacher network to a student network, with the goal of greatly improving the student's performance. Previous methods mostly focus on designing feature transformations and loss functions between features at the same level to improve effectiveness. We instead study the connection paths across levels between the teacher and student networks, and reveal their great importance. For the first time in knowledge distillation, cross-stage connection paths are proposed. Our new review mechanism is effective and structurally simple. The resulting nested and compact framework requires negligible computation overhead and outperforms other methods on a variety of tasks. We apply our method to classification, object detection, and instance segmentation; all of these tasks see significant improvements in student network performance.
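The cross-stage idea in the abstract can be illustrated with a short sketch. The PyTorch snippet below is a simplified, hypothetical rendering of cross-stage feature distillation, not the authors' released implementation: each student stage j is supervised by the teacher's features at every stage i <= j, so earlier teacher knowledge keeps being "reviewed" while deeper student layers train. The class name CrossStageDistillLoss, the channel lists, and the choice of 1x1 projections, bilinear resizing, and an MSE feature loss are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossStageDistillLoss(nn.Module):
    """Toy cross-stage feature-distillation loss (illustrative only)."""

    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        # One 1x1 projection per (student stage j, teacher stage i <= j) pair,
        # mapping the student's channel width onto the teacher's.
        self.proj = nn.ModuleList([
            nn.ModuleList([
                nn.Conv2d(student_channels[j], teacher_channels[i], kernel_size=1)
                for i in range(j + 1)
            ])
            for j in range(len(student_channels))
        ])

    def forward(self, student_feats, teacher_feats):
        loss = 0.0
        for j, s in enumerate(student_feats):
            for i in range(j + 1):  # review all teacher stages up to stage j
                t = teacher_feats[i].detach()  # no gradient flows into the teacher
                p = self.proj[j][i](s)
                # Match the teacher's spatial resolution before comparing maps.
                p = F.interpolate(p, size=t.shape[-2:], mode="bilinear",
                                  align_corners=False)
                loss = loss + F.mse_loss(p, t)
        return loss

# Toy usage: three stages with shrinking resolution and growing channel width.
student_feats = [torch.randn(2, c, r, r) for c, r in [(16, 32), (32, 16), (64, 8)]]
teacher_feats = [torch.randn(2, c, r, r) for c, r in [(32, 32), (64, 16), (128, 8)]]
criterion = CrossStageDistillLoss([16, 32, 64], [32, 64, 128])
print(criterion(student_feats, teacher_feats))  # scalar distillation loss

In practice a term like this would be scaled and added to the student's ordinary task loss; the paper's actual framework further fuses the cross-stage features into the nested, compact design described in the abstract.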
Persistent Identifier: http://hdl.handle.net/10722/333518
ISSN: 1063-6919
2023 SCImago Journal Rankings: 10.331
ISI Accession Number ID: WOS:000739917305021

DC Field: Value
dc.contributor.author: Chen, Pengguang
dc.contributor.author: Liu, Shu
dc.contributor.author: Zhao, Hengshuang
dc.contributor.author: Jia, Jiaya
dc.date.accessioned: 2023-10-06T05:20:07Z
dc.date.available: 2023-10-06T05:20:07Z
dc.date.issued: 2021
dc.identifier.citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2021, p. 5006-5015
dc.identifier.issn: 1063-6919
dc.identifier.uri: http://hdl.handle.net/10722/333518
dc.description.abstract: Knowledge distillation transfers knowledge from the teacher network to the student one, with the goal of greatly improving the performance of the student network. Previous methods mostly focus on proposing feature transformation and loss functions between the same level's features to improve the effectiveness. We differently study the factor of connection path cross levels between teacher and student networks, and reveal its great importance. For the first time in knowledge distillation, cross-stage connection paths are proposed. Our new review mechanism is effective and structurally simple. Our finally designed nested and compact framework requires negligible computation overhead, and outperforms other methods on a variety of tasks. We apply our method to classification, object detection, and instance segmentation tasks. All of them witness significant student network performance improvement.
dc.language: eng
dc.relation.ispartof: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
dc.title: Distilling Knowledge via Knowledge Review
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/CVPR46437.2021.00497
dc.identifier.scopus: eid_2-s2.0-85118684683
dc.identifier.spage: 5006
dc.identifier.epage: 5015
dc.identifier.isi: WOS:000739917305021
