Enhancing incremental few-shot learning algorithm: good embedding is all we need

Few-shot class-incremental learning (FSCIL) aims to enable a neural network to learn new information from only a few data samples as new data is continually provided. In FSCIL, it is crucial to prevent catastrophic forgetting, the loss of previously acquired knowledge. FSCIL methods consist of two stages: pretraining and fine-tuning. In the pretraining stage, the neural network is trained with plenty of samples from the base classes. In the fine-tuning stage, the pretrained network is fine-tuned with a few samples from the novel classes. Most algorithms proposed so far focus on improving the fine-tuning stage. However, F2M reports that using class prototypes as the classifier outperforms existing algorithms even without fine-tuning. This finding casts doubt on the effect of fine-tuning and led us to focus on the pretraining stage instead. Indeed, when we improve the pretraining stage, performance is superior to existing algorithms even when the model is not fine-tuned. Based on this result, we conclude that building a good embedding in pretraining is far more effective than trying to teach new data well in fine-tuning.
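
As an illustration of the prototype classifier the abstract refers to (a minimal sketch, not the thesis's actual implementation): each class prototype is the mean embedding of that class's training samples under the frozen pretrained network, and a query is assigned to the class of its nearest prototype. The NumPy function names and the Euclidean distance metric below are illustrative assumptions.

    import numpy as np

    def class_prototypes(embeddings, labels):
        # One prototype per class: the mean of that class's embedding vectors.
        classes = np.unique(labels)
        prototypes = np.stack([embeddings[labels == c].mean(axis=0) for c in classes])
        return classes, prototypes

    def predict(queries, classes, prototypes):
        # Nearest-prototype classification under Euclidean distance.
        dists = np.linalg.norm(queries[:, None, :] - prototypes[None, :, :], axis=-1)
        return classes[dists.argmin(axis=1)]

Because prototypes for novel classes can simply be appended as their few samples arrive, no fine-tuning of the embedding network is required, which is why the quality of the pretrained embedding dominates performance.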
Advisors
Moon, Jaekyun (문재균)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2023
Identifier
325007
Language
eng
Description

Master's thesis - Korea Advanced Institute of Science and Technology: School of Electrical Engineering, 2023.2, [iv, 19 p.]

Keywords

Few-shot class-incremental learning; Catastrophic forgetting; Pretraining; Fine-tuning

URI
http://hdl.handle.net/10203/309902
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1032863&flag=dissertation
Appears in Collection
EE-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
