(A) study on effective knowledge distillation methods for compressing large-scale speech self-supervised learning models

DC Field / Value
dc.contributor.advisor: Kim, Hoirin
dc.contributor.advisor: 김회린
dc.contributor.author: Lee, Yeonghyeon
dc.date.accessioned: 2023-06-26T19:33:54Z
dc.date.available: 2023-06-26T19:33:54Z
dc.date.issued: 2023
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1032916&flag=dissertation
dc.identifier.uri: http://hdl.handle.net/10203/309881
dc.description: Master's thesis, Korea Advanced Institute of Science and Technology (KAIST), School of Electrical Engineering, 2023.2, [iv, 24 p.]
dc.description.abstract: The success of self-supervised learning in the speech domain has led to the development of large-scale self-supervised models. However, deploying such large models in practice can be costly, which limits their use, especially in resource-constrained settings. We therefore propose FitHuBERT, a model compression method that uses a thinner and deeper architecture across almost all of its model components compared to prior work. We further propose knowledge distillation with hints to improve performance, and time-reduction layers to increase efficiency. Evaluation on the SUPERB benchmark shows that our model outperforms previous work, especially on content-related tasks, while having fewer parameters and faster inference.
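The abstract names two techniques: knowledge distillation with hints and time-reduction layers. As a rough illustration only (this is not the thesis code; the shapes, function names, and the random projection matrix below are all assumptions), a FitNets-style hint loss and a concatenation-based time reduction can be sketched as:

```python
import numpy as np

def time_reduction(x, stride=2):
    # Downsample along the time axis by concatenating adjacent frames:
    # (T, D) -> (T // stride, D * stride), dropping leftover trailing frames.
    T, D = x.shape
    T -= T % stride
    return x[:T].reshape(T // stride, D * stride)

def hint_loss(student, teacher, proj):
    # Hint-based distillation: linearly project thin student features up to
    # the teacher's width, then take mean-squared error against the teacher.
    return float(np.mean((student @ proj - teacher) ** 2))

rng = np.random.default_rng(0)
teacher_feats = rng.standard_normal((100, 768))   # wide teacher layer output
student_feats = rng.standard_normal((100, 192))   # thinner student layer
proj = rng.standard_normal((192, 768)) * 0.01     # projection (learned in practice)

loss = hint_loss(student_feats, teacher_feats, proj)
reduced = time_reduction(student_feats, stride=2) # halves the sequence length
```

In practice the projection would be trained jointly with the student, and the hint loss would be summed over several matched teacher/student layers; the sketch only shows the per-layer computation.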
dc.language: eng
dc.publisher: 한국과학기술원 (Korea Advanced Institute of Science and Technology, KAIST)
dc.subject: Self-supervised Learning; Knowledge Distillation; Representation Learning
dc.subject: 자기지도학습; 지식 증류; 표현 학습
dc.title: (A) study on effective knowledge distillation methods for compressing large-scale speech self-supervised learning models
dc.title.alternative: 음성 자기지도학습 모델 압축을 위한 효과적인 지식 증류기법에 관한 연구
dc.type: Thesis (Master's)
dc.identifier.CNRN: 325007
dc.description.department: 한국과학기술원: 전기및전자공학부 (KAIST: School of Electrical Engineering)
dc.contributor.alternativeauthor: 이영현
Appears in Collection
EE-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
