DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Hwang, Sung Ju | - |
dc.contributor.advisor | 황성주 | - |
dc.contributor.author | Lee, Hae Beom | - |
dc.date.accessioned | 2023-06-21T19:33:44Z | - |
dc.date.available | 2023-06-21T19:33:44Z | - |
dc.date.issued | 2022 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1007772&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/307930 | - |
dc.description | Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology (KAIST): Kim Jaechul Graduate School of AI, 2022.8, [vii, 74 p.] | - |
dc.description.abstract | We extend conventional meta-learning frameworks to more realistic, practical, and large-scale learning scenarios. Firstly, realistic meta-learning assumes imbalance across classes and tasks, as well as distributional shift between the meta-training and meta-testing stages. Secondly, practical meta-learning aims to develop versatile meta-knowledge that is agnostic to architectural differences. Lastly, we address large-scale meta-learning, where a shared initialization or hyperparameters are efficiently learned over a heterogeneous set of many-shot tasks. In this thesis, we show how to address these challenging real-world meta-learning problems efficiently and effectively with machine learning techniques such as variational inference, amortization, first-order approximation, Taylor approximation, and Lipschitz continuity assumptions. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | Meta-learning; Task distribution; Distributional shift; Imbalance; Large-scale; First-order approximation; Hyperparameter optimization; Gradient alignment | - |
dc.subject | 메타 학습; 태스크 분포; 분포 이동; 불균형; 대규모 학습; 일차 근사법; 하이퍼파라미터 최적화; 그라디언트 정렬 | - |
dc.title | Towards real-world meta-learning | - |
dc.title.alternative | 현실 세계에 적합한 메타러닝 | - |
dc.type | Thesis (Ph.D.) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | Korea Advanced Institute of Science and Technology (KAIST): Kim Jaechul Graduate School of AI | - |
dc.contributor.alternativeauthor | 이해범 | - |