Dissecting Neural ODEs

DC Field | Value | Language
dc.contributor.author | Massaroli, Stefano | ko
dc.contributor.author | Poli, Michael | ko
dc.contributor.author | Park, Jinkyoo | ko
dc.contributor.author | Yamashita, Atsushi | ko
dc.contributor.author | Asama, Hajime | ko
dc.date.accessioned | 2020-12-24T02:10:09Z | -
dc.date.available | 2020-12-24T02:10:09Z | -
dc.date.created | 2020-12-05 | -
dc.date.issued | 2020-12-09 | -
dc.identifier.citation | 34th Conference on Neural Information Processing Systems, NeurIPS 2020 | -
dc.identifier.uri | http://hdl.handle.net/10203/279061 | -
dc.description.abstract | Continuous deep learning architectures have recently re-emerged as Neural Ordinary Differential Equations (Neural ODEs). This infinite-depth approach theoretically bridges the gap between deep learning and dynamical systems, offering a novel perspective. However, deciphering the inner workings of these models is still an open challenge, as most applications apply them as generic black-box modules. In this work we “open the box”, further developing the continuous-depth formulation with the aim of clarifying the influence of several design choices on the underlying dynamics. | -
dc.language | English | -
dc.publisher | The Neural Information Processing Systems | -
dc.title | Dissecting Neural ODEs | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.publicationname | 34th Conference on Neural Information Processing Systems, NeurIPS 2020 | -
dc.identifier.conferencecountry | CN | -
dc.identifier.conferencelocation | Virtual | -
dc.contributor.localauthor | Park, Jinkyoo | -
dc.contributor.nonIdAuthor | Massaroli, Stefano | -
dc.contributor.nonIdAuthor | Poli, Michael | -
dc.contributor.nonIdAuthor | Yamashita, Atsushi | -
dc.contributor.nonIdAuthor | Asama, Hajime | -
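The abstract describes Neural ODEs as continuous-depth networks: instead of stacking discrete layers, the hidden state evolves under a learned vector field dz/ds = f_theta(s, z), and the "forward pass" is a numerical ODE solve. The following is a minimal illustrative sketch of that idea, not the paper's implementation; the vector field, its random placeholder weights, and the integration interval [0, 1] are all assumptions made for the example.

```python
# Minimal sketch of a Neural ODE forward pass (illustrative only, not the
# paper's code). Depth is replaced by the integration interval [0, 1] of
# dz/ds = f_theta(s, z), solved with an off-the-shelf adaptive solver.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)

# Tiny one-hidden-layer vector field; the weights are random placeholders
# standing in for trained parameters.
W1 = rng.standard_normal((8, 2)) * 0.5
W2 = rng.standard_normal((2, 8)) * 0.5

def vector_field(s, z):
    # tanh MLP defining the continuous-depth dynamics f_theta(s, z)
    return W2 @ np.tanh(W1 @ z)

def neural_ode_forward(z0, s_span=(0.0, 1.0)):
    # Integrate the hidden state from "layer" s=0 to s=1.
    sol = solve_ivp(vector_field, s_span, z0, rtol=1e-6, atol=1e-8)
    return sol.y[:, -1]  # state at the end of the solve = network output

z0 = np.array([1.0, -1.0])  # input to the continuous-depth model
z1 = neural_ode_forward(z0)
```

In a trained model the weights would be fit by backpropagating through the solver (or via the adjoint method); here the solve simply demonstrates how an input state is transported by the learned dynamics.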
Appears in Collection
IE-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
