On correctness of automatic differentiation for non-differentiable functions

DC Field | Value | Language
dc.contributor.author | Lee, Wonyeol | ko
dc.contributor.author | Yu, Hangyeol | ko
dc.contributor.author | Rival, Xavier | ko
dc.contributor.author | Yang, Hongseok | ko
dc.date.accessioned | 2021-11-09T06:49:07Z | -
dc.date.available | 2021-11-09T06:49:07Z | -
dc.date.created | 2021-11-08 | -
dc.date.issued | 2020-12-08 | -
dc.identifier.citation | The 34th Conference on Neural Information Processing Systems (NeurIPS 2020) | -
dc.identifier.uri | http://hdl.handle.net/10203/289042 | -
dc.description.abstract | Differentiation lies at the core of many machine-learning algorithms and is well supported by popular autodiff systems such as TensorFlow and PyTorch. These systems were originally developed to compute derivatives of differentiable functions, but in practice they are commonly applied to functions with non-differentiabilities. For instance, neural networks using ReLU define non-differentiable functions in general, yet the gradients of losses involving those functions are routinely computed with autodiff systems. This status quo raises a natural question: are autodiff systems correct in any formal sense when applied to such non-differentiable functions? In this paper, we give a positive answer to this question. Using counterexamples, we first point out flaws in commonly used informal arguments, such as the claim that non-differentiabilities arising in deep learning cause no issues because they form a measure-zero set. We then investigate a class of functions, called PAP functions, that includes nearly all (possibly non-differentiable) functions used in deep learning today. For these PAP functions, we propose a new type of derivatives, called intensional derivatives, and prove that these derivatives always exist and coincide with standard derivatives for almost all inputs. We also show that these intensional derivatives are essentially what most autodiff systems compute, or try to compute. In this way, we formally establish the correctness of autodiff systems applied to non-differentiable functions. | -
dc.language | English | -
dc.publisher | Neural information processing systems foundation | -
dc.title | On correctness of automatic differentiation for non-differentiable functions | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.publicationname | The 34th Conference on Neural Information Processing Systems (NeurIPS 2020) | -
dc.identifier.conferencecountry | CN | -
dc.identifier.conferencelocation | Virtual | -
dc.contributor.localauthor | Yang, Hongseok | -
dc.contributor.nonIdAuthor | Lee, Wonyeol | -
dc.contributor.nonIdAuthor | Yu, Hangyeol | -
dc.contributor.nonIdAuthor | Rival, Xavier | -
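
Note: the abstract above describes how autodiff systems still return a gradient for functions such as ReLU, which is non-differentiable at 0. The following is a minimal illustrative sketch, not part of this record or the paper; it assumes PyTorch is installed, and the exact value returned at the non-differentiable point depends on the framework's convention.

    # Minimal sketch: what a reverse-mode autodiff system returns for ReLU,
    # including at its non-differentiable point x = 0.
    import torch

    for x0 in [-1.0, 0.0, 1.0]:
        x = torch.tensor(x0, requires_grad=True)
        y = torch.relu(x)   # ReLU has no classical derivative at 0
        y.backward()        # reverse-mode automatic differentiation
        print(f"x = {x0:+.1f}  ->  autodiff result = {x.grad.item()}")

    # For x != 0 the result matches the classical derivative (0 or 1).
    # At x = 0 a value is still returned even though no classical derivative
    # exists; the paper's intensional derivatives formalize why such a value
    # is legitimate and why it coincides with the standard derivative for
    # almost all inputs.
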
Appears in Collection
CS-Conference Papers (conference papers)
Files in This Item
There are no files associated with this item.
