DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Yun, Se-Young | - |
dc.contributor.advisor | 윤세영 | - |
dc.contributor.author | Choi, Jinhwan | - |
dc.date.accessioned | 2023-06-22T19:31:16Z | - |
dc.date.available | 2023-06-22T19:31:16Z | - |
dc.date.issued | 2022 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=997687&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/308192 | - |
dc.description | Master's thesis - Korea Advanced Institute of Science and Technology (KAIST): Kim Jaechul Graduate School of AI, 2022.2, [iii, 34 p.] | - |
dc.description.abstract | In knowledge distillation, data augmentation techniques augment the input data, the medium through which knowledge is distilled from teacher to student. Both the teacher network and the augmentation used for teacher pre-training affect how well an augmentation performs. In this paper, we therefore propose a novel data augmentation search method that takes the teacher network and the teacher's augmentation into account. Building on automated augmentation, we demonstrate how to use the KD loss to account for the teacher network. Moreover, we propose a $\textit{policy distance}$ that measures the difference between two augmentation policies; in our objective, the policy distance to the teacher's augmentation is maximized (see the sketch after this record). We demonstrate the effect of the proposed method by analyzing how augmentations change the data distribution. Through these analyses, we show that the proposed method finds an improved data augmentation policy for knowledge distillation. | - |
dc.language | eng | - |
dc.publisher | Korea Advanced Institute of Science and Technology (KAIST) | - |
dc.title | Automated augmentation for knowledge distillation | - |
dc.title.alternative | 지식 증류를 위한 자동화된 데이터 증강 기법 | - |
dc.type | Thesis(Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | Korea Advanced Institute of Science and Technology: Kim Jaechul Graduate School of AI | - |
dc.contributor.alternativeauthor | 최진환 | - |
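
The abstract does not give the concrete forms of the KD loss or the policy distance, so the following is a minimal sketch under stated assumptions, not the thesis's actual method: it assumes a standard Hinton-style KD loss, a policy represented as a flat tensor of per-operation probability/magnitude parameters, and an L2 gap as the policy distance. All function names (`kd_loss`, `policy_distance`, `search_objective`) and the trade-off weight `lam` are hypothetical.

```python
# Illustrative sketch only: the loss forms, policy encoding, and names below
# are assumptions, not definitions taken from the thesis.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Standard Hinton-style KD loss: KL divergence between
    temperature-softened teacher and student distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

def policy_distance(policy_a, policy_b):
    """Assumed policy distance: each policy is a flat tensor of per-operation
    (probability, magnitude) parameters; the distance is their L2 gap."""
    return torch.norm(policy_a - policy_b, p=2)

def search_objective(student_logits, teacher_logits,
                     candidate_policy, teacher_policy, lam=0.1):
    """Score a candidate augmentation policy (lower is better): small KD loss
    on augmented inputs, while staying far from the teacher's pre-training
    policy, as the abstract's objective suggests."""
    return (kd_loss(student_logits, teacher_logits)
            - lam * policy_distance(candidate_policy, teacher_policy))
```

Minimizing this objective over candidate policies simultaneously lowers the KD loss (accounting for the teacher network) and maximizes the policy distance to the teacher's augmentation, matching the two ingredients named in the abstract.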