Dual network based complementary learning system for continual learning

DC Field: Value
dc.contributor.advisor: Song, Iickho
dc.contributor.advisor: 송익호
dc.contributor.author: Kumari, Geeta
dc.date.accessioned: 2022-04-27T19:31:20Z
dc.date.available: 2022-04-27T19:31:20Z
dc.date.issued: 2021
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=948999&flag=dissertation (en_US)
dc.identifier.uri: http://hdl.handle.net/10203/296003
dc.description: Thesis (Master's) - Korea Advanced Institute of Science and Technology: School of Electrical Engineering, 2021.2, [iv, 32 p.]
dc.description.abstract: Neural networks have surpassed human performance at many individual tasks. However, when learning in a continual setting, where data arrives in streams that are not independent and identically distributed, they suffer heavily from a phenomenon called catastrophic forgetting: the loss of information about past tasks as a new task is learned. This contrasts with how humans learn, since humans use past knowledge to acquire new knowledge more efficiently. Overcoming catastrophic forgetting while facilitating future learning is currently one of the most challenging issues in machine learning. To address this issue, we propose a brain-inspired complementary dual network model comprising a fast learner and a slow consolidator. The fast learner first adapts to a new task that is seen only once, and the slow consolidator then distills the new task information from the fast learner using knowledge distillation. The two networks are trained in an alternating manner. To consolidate the learning of a new task with the learning of past tasks, we employ a small memory of each task for replay during the training of the slow consolidator. In addition, we implement a context-based gating mechanism on the slow consolidator and empirically demonstrate its positive impact on the performance of the proposed model. We report results on the MNIST Permutations, MNIST Rotations, MNIST Split, and Incremental CIFAR100 datasets. (An illustrative sketch of the alternating training step appears after this metadata record.)
dc.language: eng
dc.publisher: Korea Advanced Institute of Science and Technology (KAIST)
dc.subject: catastrophic forgetting; continual learning; knowledge distillation; memory replay; meta-learning; neural networks; online learning; transfer learning
dc.subject: memory replay; neural networks; continual learning; online learning; transfer learning; knowledge distillation; meta-learning; catastrophic forgetting
dc.title: Dual network based complementary learning system for continual learning
dc.title.alternative: 지속 학습을 위한 이중 네트워크 기반 보완 학습 시스템
dc.type: Thesis (Master)
dc.identifier.CNRN: 325007
dc.description.department: Korea Advanced Institute of Science and Technology: School of Electrical Engineering
dc.contributor.alternativeauthor: 쿠마리 기타
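The abstract describes an alternating training loop: the fast learner fits the new task, then the slow consolidator distills that knowledge from the fast learner while replaying a small episodic memory of past tasks, with a context-based gate applied to its hidden units. Below is a minimal PyTorch sketch of one such alternating step. The network architecture, names (MLP, train_step), distillation temperature, optimizers, and the random binary gate are illustrative assumptions, not the thesis implementation.

```python
# Minimal sketch of the dual-network idea from the abstract.
# Names, sizes, and hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MLP(nn.Module):
    """Small classifier used for both networks (assumed architecture)."""
    def __init__(self, in_dim=784, hidden=256, out_dim=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, out_dim)

    def forward(self, x, gate=None):
        h = F.relu(self.fc1(x))
        if gate is not None:  # context-based gating: mask hidden units per task
            h = h * gate
        return self.fc2(h)


def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened teacher and student outputs."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T


def train_step(fast, slow, opt_fast, opt_slow, x_new, y_new, memory, gate):
    """One alternating update: fast learner on the new batch,
    then slow consolidator via distillation plus memory replay."""
    # 1) Fast learner adapts to the new task (data seen only once).
    opt_fast.zero_grad()
    F.cross_entropy(fast(x_new), y_new).backward()
    opt_fast.step()

    # 2) Slow consolidator distills the new task from the fast learner
    #    and replays a small memory of past tasks.
    opt_slow.zero_grad()
    with torch.no_grad():
        teacher_logits = fast(x_new)
    loss_slow = distillation_loss(slow(x_new, gate), teacher_logits)
    if memory:
        x_old, y_old, gate_old = memory[torch.randint(len(memory), (1,)).item()]
        loss_slow = loss_slow + F.cross_entropy(slow(x_old, gate_old), y_old)
    loss_slow.backward()
    opt_slow.step()


if __name__ == "__main__":
    fast, slow = MLP(), MLP()
    opt_fast = torch.optim.SGD(fast.parameters(), lr=0.1)
    opt_slow = torch.optim.SGD(slow.parameters(), lr=0.01)
    gate = (torch.rand(256) > 0.5).float()     # random binary context gate (assumption)
    x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
    memory = [(x, y, gate)]                    # tiny episodic memory; reuses the same batch just to exercise the replay path
    train_step(fast, slow, opt_fast, opt_slow, x, y, memory, gate)
```

The design point this sketch tries to capture is that the slow consolidator is never updated directly on the raw new-task labels: it learns the new task only through distillation from the fast learner while replay keeps past tasks in its loss, which is the mechanism the abstract credits for mitigating catastrophic forgetting.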
Appears in Collection
EE-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
