Class Incremental Learning With Task-Selection

Cited 2 times in Web of Science · Cited 0 times in Scopus
  • Hit : 763
  • Download : 0
Despite the success of deep neural networks (DNNs), in the incremental learning setting DNNs are known to suffer from catastrophic forgetting: the phenomenon of entirely losing previously learned task information upon learning the current task. To alleviate this problem, we propose a novel knowledge distillation-based class incremental learning method with a task-selective autoencoder (TsAE). By training the TsAE to reconstruct the feature map of each task, the proposed method effectively memorizes not only the classes of the current task but also the classes of previously learned tasks. Since the proposed TsAE has a simple yet powerful architecture, it can be easily generalized to other knowledge distillation-based class incremental learning methods. Our experimental results on various datasets, including iCIFAR-100 and iILSVRC-small, demonstrate that the proposed method achieves higher classification accuracy and less forgetting compared to state-of-the-art methods.
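The abstract combines two ideas: a knowledge-distillation loss that keeps the new model's outputs close to the old model's, and a per-task autoencoder whose reconstruction error on a feature map indicates which task a sample belongs to. The sketch below (not the authors' code) illustrates both in NumPy under simplifying assumptions: the autoencoder is fit in closed form via PCA rather than trained by gradient descent, and all names, shapes, and data are hypothetical.

```python
# Illustrative sketch only: distillation loss + per-task autoencoder
# selection. Hypothetical names and synthetic data throughout.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stabilization
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(new_logits, old_logits, T=2.0):
    """Soft-target cross-entropy: penalizes drift from the old model's
    (temperature-softened) output distribution."""
    p_old = softmax(old_logits, T)
    p_new = softmax(new_logits, T)
    return -np.mean(np.sum(p_old * np.log(p_new + 1e-12), axis=1))

class TaskAutoencoder:
    """Linear autoencoder fit per task (closed form, via PCA). Low
    reconstruction error on a sample suggests the sample comes from
    this task's feature distribution."""
    def fit(self, feats, k):
        self.mu = feats.mean(axis=0)
        _, _, vt = np.linalg.svd(feats - self.mu, full_matrices=False)
        self.V = vt[:k].T                  # (dim, k) principal directions
        return self

    def reconstruction_error(self, feats):
        z = (feats - self.mu) @ self.V     # encode
        r = z @ self.V.T + self.mu         # decode
        return np.mean((r - feats) ** 2)

# Two tasks with well-separated synthetic feature distributions.
task_a = rng.normal(+2.0, 0.3, (64, 8))
task_b = rng.normal(-2.0, 0.3, (64, 8))
ae_a = TaskAutoencoder().fit(task_a, k=4)
ae_b = TaskAutoencoder().fit(task_b, k=4)

# Task selection: the autoencoder with the lower error picks the task.
probe = rng.normal(+2.0, 0.3, (16, 8))    # unseen task-A samples
assert ae_a.reconstruction_error(probe) < ae_b.reconstruction_error(probe)

# Distillation: matching the old logits minimizes the loss
# (cross-entropy is lower-bounded by the entropy of the targets).
old = rng.normal(0, 1, (16, 10))
drifted = rng.normal(0, 1, (16, 10))
assert distillation_loss(old, old) <= distillation_loss(drifted, old)
```

In the paper's full setting the distillation term and the task-selection signal would be combined with the usual classification loss on the current task; this sketch only isolates the two mechanisms named in the abstract.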
Publisher
IEEE Signal Processing Society
Issue Date
2020-10-25
Language
English
Citation

IEEE International Conference on Image Processing (ICIP) 2020, pp.1846 - 1850

ISSN
1522-4880
DOI
10.1109/ICIP40778.2020.9190703
URI
http://hdl.handle.net/10203/274234
Appears in Collection
EE-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
