DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Sung, Youngchul | - |
dc.contributor.advisor | 성영철 | - |
dc.contributor.author | Park, Jong Eui | - |
dc.date.accessioned | 2021-05-13T19:33:18Z | - |
dc.date.available | 2021-05-13T19:33:18Z | - |
dc.date.issued | 2020 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=911330&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/284721 | - |
dc.description | Thesis (Master's) - Korea Advanced Institute of Science and Technology (KAIST): School of Electrical Engineering, 2020.2, [iii, 20 p.] | - |
dc.description.abstract | Neural networks have achieved remarkable success in many areas, but they still fail when the task changes over time. The subfield of machine learning that addresses this problem is called continual learning. In this thesis, we apply meta-learning techniques to train a continual-learning agent that uses an external memory through a novel scalar-output architecture. Unlike previous work, which accesses and updates memory with a fixed, hand-designed method, our agent learns how to do so from scratch. We show that the agent successfully learns what to remember and how to recall it, achieving a final average accuracy about 10% higher than that of other continual learning algorithms. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | Continual Learning; Deep Learning; Meta Learning; Sequential Learning; Online Learning | - |
dc.subject | 연속학습; 심층학습; 메타학습; 순차학습; 온라인학습 | - |
dc.title | Meta-learning memory representations for continual learning with scalar-output networks | - |
dc.title.alternative | 스칼라 출력 신경망을 통한 연속학습에 적합한 기억 표현법 메타학습 | - |
dc.type | Thesis (Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | Korea Advanced Institute of Science and Technology (KAIST): School of Electrical Engineering | - |
dc.contributor.alternativeauthor | 박종의 | - |
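The abstract describes an agent that learns, rather than hand-designs, how to access and update an external memory. The thesis's actual mechanism is not given in this record, so the following is only a minimal sketch of one common ingredient such agents build on (soft, content-based memory addressing, as in Neural Turing Machine-style architectures); the class, parameter names, and values are hypothetical illustrations, not the author's method.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D array
    e = np.exp(x - x.max())
    return e / e.sum()

class ExternalMemory:
    """Hypothetical sketch: a content-addressed memory whose reads and
    writes are soft (attention-weighted by key similarity), so that the
    access pattern is differentiable and could, in principle, be
    meta-learned end-to-end instead of being fixed by hand."""

    def __init__(self, slots, width, seed=0):
        rng = np.random.default_rng(seed)
        self.M = rng.normal(scale=0.1, size=(slots, width))  # memory matrix

    def _weights(self, key, beta=5.0):
        # cosine similarity between the key and each slot, sharpened by beta
        sim = self.M @ key / (
            np.linalg.norm(self.M, axis=1) * np.linalg.norm(key) + 1e-8
        )
        return softmax(beta * sim)

    def read(self, key):
        # soft read: attention-weighted mixture of memory slots
        return self._weights(key) @ self.M

    def write(self, key, value, lr=1.0):
        # soft write: blend each slot toward the value by its attention weight
        w = self._weights(key)[:, None]
        self.M = (1 - lr * w) * self.M + lr * w * value
```

As a usage illustration, writing a value under a key and then reading with the same key returns a vector much closer to that value than before the write; in a full meta-learning setup, the key/value encoders (and possibly `beta` and `lr`) would themselves be trained.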
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.