Decoupled training for long-tailed classification with stochastic representations

Decoupling representation learning and classifier learning has been shown to be effective for classification with long-tailed data. There are two main ingredients in constructing a decoupled learning scheme: (1) how to train the feature extractor so that it provides generalizable representations, and (2) how to re-train the classifier so that it constructs proper decision boundaries despite the class imbalance in long-tailed data. In this work, we first apply Stochastic Weight Averaging (SWA), an optimization technique for improving the generalization of deep neural networks, to obtain better-generalizing feature extractors for long-tailed classification. We then propose a novel classifier re-training algorithm based on stochastic representations obtained from SWA-Gaussian, a Gaussian-perturbed SWA, together with a self-distillation strategy that harnesses the diverse stochastic representations, guided by uncertainty estimates, to build more robust classifiers. Extensive experiments on several benchmark datasets show that our proposed method improves upon previous methods in both prediction accuracy and uncertainty estimation.
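
As a rough illustration of the two ingredients above (not the thesis's actual code), the sketch below assumes a standard PyTorch setup: plain SWA averages feature-extractor checkpoints collected along one training run, and a diagonal SWA-Gaussian samples perturbed extractors around that average, which is where stochastic representations for classifier re-training would come from. The checkpoint list, the scale parameter, and the function names are illustrative assumptions.

# Illustrative sketch only: plain SWA averaging plus diagonal SWA-Gaussian
# sampling around the SWA mean. Checkpoint collection, the classifier
# re-training loop, and the self-distillation loss are omitted.
import copy
import torch


@torch.no_grad()
def swa_average(checkpoints):
    """Return a model whose weights are the average of `checkpoints` (SWA)."""
    swa_model = copy.deepcopy(checkpoints[0])
    for p_swa, *ps in zip(swa_model.parameters(),
                          *[m.parameters() for m in checkpoints]):
        p_swa.copy_(torch.stack([p.detach() for p in ps]).mean(dim=0))
    return swa_model


@torch.no_grad()
def swag_diag_sample(checkpoints, swa_model, scale=0.5):
    """Sample one stochastic feature extractor from a diagonal SWA-Gaussian.

    The per-parameter variance is estimated from the same checkpoints used
    for SWA; feeding inputs through a sampled model yields one stochastic
    representation per draw.
    """
    sample = copy.deepcopy(swa_model)
    for p_s, p_mean, *ps in zip(sample.parameters(), swa_model.parameters(),
                                *[m.parameters() for m in checkpoints]):
        second_moment = torch.stack([p.detach().pow(2) for p in ps]).mean(dim=0)
        var = torch.clamp(second_moment - p_mean.pow(2), min=1e-30)
        p_s.copy_(p_mean + scale * var.sqrt() * torch.randn_like(p_mean))
    return sample

Representations from several such samples, together with their spread as an uncertainty signal, would then feed the classifier re-training and self-distillation stages described in the abstract.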
Advisors
Lee, Juho
Description
Korea Advanced Institute of Science and Technology (KAIST) : Kim Jaechul Graduate School of AI
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2023
Identifier
325007
Language
eng
Description
Thesis (Master's) - Korea Advanced Institute of Science and Technology : Kim Jaechul Graduate School of AI, 2023.2, [iv, 34 p.]
Keywords
Long-tailed classification; Decoupled learning; Stochastic weight averaging; Stochastic representations
URI
http://hdl.handle.net/10203/308203
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1032329&flag=dissertation
Appears in Collection
AI-Theses_Master(석사논문)
Files in This Item
There are no files associated with this item.
