A system-theoretic perspective on continuous-depth deep learning

Continuous deep learning architectures have recently re-emerged as variants of Neural Ordinary Differential Equations (Neural ODEs). The infinite-depth approach offered by these models theoretically bridges the gap between deep learning and dynamical systems; however, deciphering their inner workings is still an open challenge, and most of their applications are currently limited to their inclusion as generic black-box modules. In this thesis, we "open the box" and offer a system-theoretic perspective, including state augmentation strategies and robustness, with the aim of clarifying the influence of several design choices on the underlying dynamics. We introduce novel architectures: among them, a Galerkin-inspired depth-varying parameter model and Neural ODEs with data-controlled vector fields. Finally, graphs are linked to the theory of continuous models, resulting in the class of Graph Neural Ordinary Differential Equations (GDEs).
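To make the abstract's central idea concrete, the following is a minimal sketch of a Neural ODE forward pass: depth is replaced by an integration interval, and the output is obtained by integrating a parameterized vector field. A simple "data-controlled" variant, where the vector field is conditioned on the input, is sketched alongside. All names, the fixed-step Euler integrator, and the tanh vector field are illustrative assumptions; the thesis's actual models use learned (possibly depth-varying) parameters and proper ODE solvers.

```python
import numpy as np

def neural_ode_forward(x0, W, b, t0=0.0, t1=1.0, steps=100):
    """Integrate dz/dt = tanh(W z + b) from t0 to t1 with explicit Euler.

    A minimal stand-in for a Neural ODE forward pass: the network's
    "depth" is the integration interval [t0, t1], not a layer count.
    """
    z = np.asarray(x0, dtype=float)
    h = (t1 - t0) / steps
    for _ in range(steps):
        z = z + h * np.tanh(W @ z + b)
    return z

def data_controlled_forward(x0, W, b, t1=1.0, steps=100):
    """Illustrative data-controlled variant: the vector field also depends
    on the input x0, dz/dt = tanh(W (z + x0) + b), so the dynamics are
    conditioned on the data rather than shared across all inputs.
    """
    z = np.asarray(x0, dtype=float)
    x = np.asarray(x0, dtype=float)
    h = t1 / steps
    for _ in range(steps):
        z = z + h * np.tanh(W @ (z + x) + b)
    return z
```

With a zero vector field (W = 0, b = 0) the state is left unchanged, which is a convenient sanity check that the integrator itself introduces no drift.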
Advisors
Park, Jinkyoo (박진규)
Description
Korea Advanced Institute of Science and Technology (KAIST): Department of Industrial and Systems Engineering
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2020
Identifier
325007
Language
eng
Description

Thesis (Master's) - Korea Advanced Institute of Science and Technology: Department of Industrial and Systems Engineering, 2020.8, [v, 70 p.]

Keywords

Deep Learning; Dynamical Systems; Neural ODEs; Graph; GDE

URI
http://hdl.handle.net/10203/284903
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=925053&flag=dissertation
Appears in Collection
IE-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
