Continual learning with linearized deep neural networks

DC Field | Value | Language
dc.contributor.advisor | Kim, Junmo | -
dc.contributor.advisor | 김준모 | -
dc.contributor.author | Shon, Hyounguk | -
dc.date.accessioned | 2023-06-26T19:33:49Z | -
dc.date.available | 2023-06-26T19:33:49Z | -
dc.date.issued | 2022 | -
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=997205&flag=dissertation | en_US
dc.identifier.uri | http://hdl.handle.net/10203/309865 | -
dc.description | Thesis (Master's) - Korea Advanced Institute of Science and Technology: School of Electrical Engineering, 2022.2, [iii, 21 p.] | -
dc.description.abstract | We propose a continual learning algorithm that effectively mitigates the catastrophic forgetting that occurs when a deep neural network is trained on multiple tasks sequentially. Our method takes advantage of pre-trained neural networks for effective continual learning. Based on the observation that quadratic parameter regularization achieves the optimal continual learning policy for linear models, our algorithm linearizes the neural network and applies a quadratic penalty to its parameters by estimating the Fisher information matrix. We show that the proposed method prevents forgetting while achieving high performance on image classification tasks. Our method applies to both data-incremental and task-incremental learning problems. | -
dc.language | eng | -
dc.publisher | Korea Advanced Institute of Science and Technology (KAIST) | -
dc.title | Continual learning with linearized deep neural networks | -
dc.title.alternative | 선형화된 심층 신경망을 이용한 지속 학습 | -
dc.type | Thesis (Master) | -
dc.identifier.CNRN | 325007 | -
dc.description.department | Korea Advanced Institute of Science and Technology: School of Electrical Engineering | -
dc.contributor.alternativeauthor | 손형욱 | -
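
The abstract above combines two ingredients: a first-order (linearized) approximation of the network around its pre-trained weights, and a Fisher-weighted quadratic penalty anchoring the parameters learned on earlier tasks. The sketch below illustrates that combination, assuming a PyTorch (>= 2.0) classifier. Every name in it (linearized_forward, estimate_diag_fisher, penalized_loss, the lam hyperparameter) is hypothetical, and the diagonal empirical Fisher and EWC-style penalty shown are common stand-ins that may differ from the estimator and penalty form actually used in the thesis.

```python
# Minimal sketch, not the thesis's implementation. Assumes PyTorch >= 2.0.
# theta0 and theta are dicts of named parameter tensors, e.g.
#   theta0 = {n: p.detach().clone() for n, p in model.named_parameters()}
import torch
import torch.nn.functional as F
from torch.func import functional_call, jvp


def linearized_forward(model, theta0, theta, x):
    """First-order Taylor expansion of the network around the
    pre-trained parameters theta0:
        f_lin(x; theta) = f(x; theta0) + J(x) (theta - theta0)
    """
    def f(params):
        return functional_call(model, params, (x,))

    tangents = {n: theta[n] - theta0[n] for n in theta0}
    out, jvp_out = jvp(f, (theta0,), (tangents,))  # forward-mode J(x) @ v
    return out + jvp_out


def estimate_diag_fisher(model, loader):
    """Diagonal *empirical* Fisher: average squared gradient of the
    log-likelihood over data from the previously learned task."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    count = 0
    for x, y in loader:
        model.zero_grad()
        log_p = F.log_softmax(model(x), dim=1)
        F.nll_loss(log_p, y, reduction="sum").backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
        count += x.size(0)
    return {n: v / max(count, 1) for n, v in fisher.items()}


def penalized_loss(model, theta0, theta, fisher, x, y, lam=1.0):
    """Cross-entropy on the linearized model plus the quadratic penalty
    (lam / 2) * sum_i F_i * (theta_i - theta0_i)^2."""
    logits = linearized_forward(model, theta0, theta, x)
    task_loss = F.cross_entropy(logits, y)
    penalty = sum((fisher[n] * (theta[n] - theta0[n]) ** 2).sum()
                  for n in theta0)
    return task_loss + 0.5 * lam * penalty
```

A new task would then be learned by minimizing penalized_loss over theta (a trainable copy of the parameter dict) while theta0 and fisher stay frozen from the earlier task; the gradient with respect to theta can be taken with torch.func.grad, which composes with jvp.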
Appears in Collection
EE-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
