Balancing knowledge distillation via reliability of knowledge

Knowledge distillation is a popular network compression method that improves the performance of a small network (the student) by employing the output logits of a pre-trained large network (the teacher). However, previous studies implicitly trust that the teacher network always provides beneficial knowledge in its logits. In this study, we identify the problem that distilling unreliable knowledge from the teacher's predictions degrades the student. To tackle this problem, we propose a balancing knowledge distillation method that regulates the degree of knowledge distillation by utilizing the prior data distribution of the trained teacher. The proposed method can reflect various data distributions that encode the reliability of the knowledge. Our results show that balancing based on the prior data distribution improves knowledge distillation across datasets.
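
The balancing idea can be read as a per-sample reweighting of the usual distillation loss. Below is a minimal PyTorch-style sketch, assuming reliability is approximated by the prior frequency of each sample's class in the teacher's training data; the function name, the normalization, and the alpha/temperature parameters are illustrative assumptions, not the exact formulation from the thesis.

    # Minimal sketch of a reliability-weighted knowledge distillation loss.
    # The weighting rule below is illustrative; the thesis's exact balancing
    # scheme based on the teacher's prior data distribution may differ.
    import torch
    import torch.nn.functional as F

    def balanced_kd_loss(student_logits, teacher_logits, targets,
                         class_prior, temperature=4.0, alpha=0.5):
        """Cross-entropy plus a KL distillation term whose weight is scaled
        per sample by how well-represented the true class was in the
        teacher's training data (used here as a proxy for reliability)."""
        # Standard supervised loss on ground-truth labels (per sample).
        ce = F.cross_entropy(student_logits, targets, reduction="none")

        # Soft targets from the teacher at raised temperature.
        soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
        log_soft_student = F.log_softmax(student_logits / temperature, dim=1)
        kd = F.kl_div(log_soft_student, soft_teacher,
                      reduction="none").sum(dim=1) * (temperature ** 2)

        # Reliability weight: classes the teacher saw more often receive a
        # larger distillation weight (normalized to [0, 1]).
        reliability = class_prior[targets] / class_prior.max()

        loss = (1.0 - alpha) * ce + alpha * reliability * kd
        return loss.mean()

In this sketch, alpha trades off the supervised and distillation terms, while the reliability factor down-weights distillation for samples whose classes were rare in the teacher's training data, so unreliable teacher predictions contribute less to the student's update.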
Advisors
Kim, Dae-Shik (김대식)
Description
Korea Advanced Institute of Science and Technology: School of Electrical Engineering
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2021
Identifier
325007
Language
eng
Description

Thesis (Master's) - Korea Advanced Institute of Science and Technology: School of Electrical Engineering, 2021.2, [iii, 18 p.]

Keywords

Deep learning; Network compression; Knowledge distillation; Knowledge transfer; Knowledge reliability; Data distribution

URI
http://hdl.handle.net/10203/295927
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=948683&flag=dissertation
Appears in Collection
EE-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
