Knowledge distillation by blurred feature transfer
(Korean title: A method for transferring neural network knowledge via feature-vector blurring)

We propose a knowledge distillation method based on feature blurring. We identify a problem with previous methods, which transfer the exact values of the positive features and thereby force the student to match information that is not needed for training. To transfer only the information necessary for training the student network, we propose a distillation method that transfers blurred features instead. Our method is simpler, and loses less information, than distillation methods that transform features into attention maps or encoding vectors. Student networks trained with our method achieve higher accuracy and are optimized under fewer constraints, which we verify on several datasets. On CIFAR-100, our method performs best among the compared distillation methods, with especially large improvements when the teacher and student differ in depth or architecture. On CIFAR-10, our method also outperforms our baseline, overhaul distillation.
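
The core idea, matching student features to a blurred copy of the teacher's features rather than to their exact values, can be sketched in a few lines of PyTorch. The sketch below is illustrative only: the use of a Gaussian blur, the kernel size, the loss weight alpha, and the names blur_features and blurred_distillation_loss are assumptions for the example, not the exact formulation used in the thesis.

    import torch
    import torch.nn.functional as F

    def gaussian_kernel2d(kernel_size=3, sigma=1.0):
        # Normalized 2D Gaussian kernel used as the blur filter (assumed blur choice).
        coords = torch.arange(kernel_size, dtype=torch.float32) - (kernel_size - 1) / 2
        g = torch.exp(-(coords ** 2) / (2 * sigma ** 2))
        kernel = g[:, None] * g[None, :]
        return kernel / kernel.sum()

    def blur_features(feat, kernel_size=3, sigma=1.0):
        # Depthwise Gaussian blur of a feature map of shape (N, C, H, W).
        c = feat.shape[1]
        kernel = gaussian_kernel2d(kernel_size, sigma).to(feat.device, feat.dtype)
        kernel = kernel.view(1, 1, kernel_size, kernel_size).repeat(c, 1, 1, 1)
        return F.conv2d(feat, kernel, padding=kernel_size // 2, groups=c)

    def blurred_distillation_loss(student_feat, teacher_feat, student_logits, labels, alpha=1.0):
        # Hypothetical combined loss: cross-entropy on the labels plus an L2 term
        # that matches the student feature to the *blurred* teacher feature.
        # Assumes student_feat and teacher_feat have the same shape; otherwise a
        # 1x1 convolution regressor on the student side would be needed.
        target = blur_features(teacher_feat).detach()   # no gradient flows to the teacher
        feature_loss = F.mse_loss(student_feat, target)
        task_loss = F.cross_entropy(student_logits, labels)
        return task_loss + alpha * feature_loss

In a training loop, student_feat and teacher_feat would typically be intermediate activations taken from corresponding layers of the two networks; the teacher is kept frozen, so only the student parameters receive gradients from both loss terms.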
Advisors
Shin, Jinwoo (신진우); Kim, Junmo (김준모)
Description
KAIST: School of Electrical Engineering
Publisher
KAIST (Korea Advanced Institute of Science and Technology)
Issue Date
2020
Identifier
325007
Language
eng
Description

Master's thesis - KAIST: School of Electrical Engineering, 2020.2, [iii, 17 p.]

Keywords

Feature blurring; knowledge distillation; student network; teacher network

URI
http://hdl.handle.net/10203/284728
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=911337&flag=dissertation
Appears in Collection
EE-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
