DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Chung, Sae-Young | - |
dc.contributor.advisor | 정세영 | - |
dc.contributor.author | Lee, Jisoo | - |
dc.date.accessioned | 2021-05-11T19:33:31Z | - |
dc.date.available | 2021-05-11T19:33:31Z | - |
dc.date.issued | 2019 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=875344&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/283050 | - |
dc.description | Master's thesis - KAIST : School of Electrical Engineering, 2019.8, [iii, 19 p.] | - |
dc.description.abstract | Deep neural networks optimized with gradient-based methods exhibit two distinct behaviors when trained on poorly annotated datasets: generalization in the early stage and memorization in the later stage. We analyze these two behaviors by measuring the similarity of the learned patterns across an ensemble of networks. From this analysis, we find that during generalization some correctly annotated examples incur small training losses on all networks in the ensemble, while wrongly annotated examples do not. Based on this finding, we propose a robust training method, termed learning with ensemble consensus (LEC), in which an ensemble of networks is trained using only the examples that incur small training losses on all networks in the ensemble. The proposed method effectively removes noisy examples from training batches, resulting in robustness on highly corrupted datasets. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | Deep neural network; label corruption; ensemble; representational similarity; gradient-based optimization | - |
dc.subject | Deep neural network; noisy data; ensemble; representational similarity; gradient descent | - |
dc.title | Robust training with ensemble consensus | - |
dc.title.alternative | A noise-robust deep neural network training method using the consensus of a network ensemble | - |
dc.type | Thesis (Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | KAIST : School of Electrical Engineering | - |
dc.contributor.alternativeauthor | 이지수 | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
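The selection rule described in the abstract — keep only the examples that incur small training losses on all networks in the ensemble — can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the abstract does not specify how "small" is decided, so the bottom-`keep_ratio` voting used here (each network votes for its lowest-loss examples, and only unanimous examples survive) is an assumption.

```python
import numpy as np

def ensemble_consensus_mask(losses, keep_ratio=0.5):
    """Select examples whose training loss is small on ALL ensemble networks.

    losses: array of shape (num_networks, num_examples) holding the
            per-example training losses from each network.
    keep_ratio: hypothetical fraction of lowest-loss examples each network
                votes for (the record does not give the exact criterion).
    Returns a boolean mask over examples kept by unanimous consensus.
    """
    num_networks, num_examples = losses.shape
    k = int(keep_ratio * num_examples)
    votes = np.zeros(num_examples, dtype=int)
    for net_losses in losses:
        # each network votes for its k smallest-loss examples
        smallest = np.argsort(net_losses)[:k]
        votes[smallest] += 1
    # keep only examples selected by every network in the ensemble;
    # suspected noisy examples (large loss on some network) are dropped
    return votes == num_networks

# Example: two networks, four examples; examples 1 and 3 have large
# losses on both networks and are filtered out of the training batch.
losses = np.array([[0.1, 0.9, 0.2, 0.8],
                   [0.2, 0.7, 0.1, 0.9]])
mask = ensemble_consensus_mask(losses, keep_ratio=0.5)
# mask -> [True, False, True, False]
```

In a training loop, this mask would be recomputed per batch and used to discard the suspected wrongly annotated examples before the gradient step.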