DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lee, Jaeho | ko |
dc.contributor.author | Tack, Jihoon | ko |
dc.contributor.author | Lee, Namhoon | ko |
dc.contributor.author | Shin, Jinwoo | ko |
dc.date.accessioned | 2021-12-09T06:48:30Z | - |
dc.date.available | 2021-12-09T06:48:30Z | - |
dc.date.created | 2021-12-02 | - |
dc.date.issued | 2021-12-07 | - |
dc.identifier.citation | 35th Conference on Neural Information Processing Systems, NeurIPS 2021 | - |
dc.identifier.uri | http://hdl.handle.net/10203/290296 | - |
dc.description.abstract | Implicit neural representations are a promising new avenue for representing general signals by learning a continuous function that, parameterized as a neural network, maps the domain of a signal to its codomain; for example, the mapping from the spatial coordinates of an image to its pixel values. Being capable of conveying fine details in a high-dimensional signal, independently of its domain, implicit neural representations offer many advantages over conventional discrete representations. However, the current approach is difficult to scale to a large number of signals or a data set, since learning a neural representation---which is parameter-heavy by itself---for each signal individually requires substantial memory and computation. To address this issue, we propose to leverage a meta-learning approach in combination with network compression under a sparsity constraint, yielding a well-initialized sparse parameterization that evolves quickly to represent a set of unseen signals in the subsequent training. We empirically demonstrate that meta-learned sparse neural representations achieve a much smaller loss than dense meta-learned models with the same number of parameters, when trained to fit each signal using the same number of optimization steps. | - |
dc.language | English | - |
dc.publisher | Neural Information Processing Systems | - |
dc.title | Meta-Learning Sparse Implicit Neural Representations | - |
dc.type | Conference | - |
dc.type.rims | CONF | - |
dc.citation.publicationname | 35th Conference on Neural Information Processing Systems, NeurIPS 2021 | - |
dc.identifier.conferencecountry | US | - |
dc.identifier.conferencelocation | Virtual | - |
dc.contributor.localauthor | Shin, Jinwoo | - |
dc.contributor.nonIdAuthor | Lee, Namhoon | - |
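The abstract describes an implicit neural representation as a neural network mapping a signal's coordinates to its values, trained under a sparsity constraint. The following is a minimal NumPy sketch of that idea, not the paper's method: a tiny two-layer MLP fits a 1-D signal, with a fixed binary mask standing in for the sparse parameterization. All sizes, the target signal, and the training schedule are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: an implicit neural representation maps 1-D coordinates
# (the domain) to signal values (the codomain) via a small MLP. A fixed
# binary mask zeroes out part of the first-layer weights, mimicking a
# sparse parameterization. Everything here is illustrative, not the
# authors' actual architecture or meta-learning procedure.

rng = np.random.default_rng(0)

# Target signal: sin(2*pi*x) sampled on a coordinate grid.
x = np.linspace(0.0, 1.0, 64).reshape(-1, 1)   # coordinates (domain)
y = np.sin(2.0 * np.pi * x)                    # signal values (codomain)

hidden = 32
W1 = rng.normal(0.0, 1.0, (1, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(0.0, 0.3, (hidden, 1))
b2 = np.zeros(1)

# Sparsity mask: keep roughly half of the first-layer weights.
mask = (rng.random(W1.shape) < 0.5).astype(float)

lr = 0.05
losses = []
for step in range(3000):
    h = np.tanh(x @ (W1 * mask) + b1)          # masked forward pass
    pred = h @ W2 + b2
    err = pred - y
    losses.append(np.mean(err ** 2))
    # Backpropagation by hand through the two-layer network.
    dpred = 2.0 * err / len(x)
    dW2 = h.T @ dpred
    db2 = dpred.sum(axis=0)
    dh = (dpred @ W2.T) * (1.0 - h ** 2)       # tanh' = 1 - tanh^2
    dW1 = (x.T @ dh) * mask                    # gradient respects the mask
    db1 = dh.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"initial mse: {losses[0]:.4f}, final mse: {losses[-1]:.4f}")
```

In the paper's setting, the mask and initialization would instead come from meta-learning across many signals, so that each new signal can be fit in only a few optimization steps; this sketch only shows the per-signal fitting loop with a sparsity constraint in place.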
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.