Stochastic Chebyshev Gradient Descent for Spectral Optimization

DC Field | Value | Language
dc.contributor.author | Han, Insu | ko
dc.contributor.author | Avron, Haim | ko
dc.contributor.author | Shin, Jinwoo | ko
dc.date.accessioned | 2020-04-22T05:21:14Z | -
dc.date.available | 2020-04-22T05:21:14Z | -
dc.date.created | 2018-12-17 | -
dc.date.issued | 2018-12-06 | -
dc.identifier.citation | 32nd Conference on Neural Information Processing Systems (NIPS) | -
dc.identifier.uri | http://hdl.handle.net/10203/273986 | -
dc.description.abstract | A large class of machine learning techniques requires the solution of optimization problems involving spectral functions of parametric matrices, e.g. the log-determinant and the nuclear norm. Unfortunately, computing the gradient of a spectral function generally has cubic complexity, so gradient descent methods are rather expensive for optimizing objectives involving spectral functions. Thus, one naturally turns to stochastic gradient methods in the hope that they will provide a way to reduce or altogether avoid the computation of full gradients. However, a new challenge appears here: there is no straightforward way to compute unbiased stochastic gradients for spectral functions. In this paper, we develop unbiased stochastic gradients for spectral-sums, an important subclass of spectral functions. Our unbiased stochastic gradients are based on combining randomized trace estimators with stochastic truncation of Chebyshev expansions. A careful design of the truncation distribution allows us to offer distributions that are variance-optimal, which is crucial for the fast and stable convergence of stochastic gradient methods. We further leverage our proposed stochastic gradients to devise stochastic methods for objective functions involving spectral-sums, and rigorously analyze their convergence rates. The utility of our methods is demonstrated in numerical experiments. | -
dc.language | English | -
dc.publisher | Neural Information Processing Systems Foundation | -
dc.title | Stochastic Chebyshev Gradient Descent for Spectral Optimization | -
dc.type | Conference | -
dc.identifier.wosid | 000461852001090 | -
dc.identifier.scopusid | 2-s2.0-85064842697 | -
dc.type.rims | CONF | -
dc.citation.publicationname | 32nd Conference on Neural Information Processing Systems (NIPS) | -
dc.identifier.conferencecountry | CA | -
dc.identifier.conferencelocation | Montreal Convention Centre | -
dc.contributor.localauthor | Shin, Jinwoo | -
dc.contributor.nonIdAuthor | Avron, Haim | -
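
The abstract above describes the paper's core construction: combine a randomized (Hutchinson-type) trace estimator with a Chebyshev expansion whose truncation degree is drawn at random, reweighting each term so the truncated series remains unbiased. The sketch below is a minimal NumPy illustration of that idea, not the authors' implementation: it uses a simple geometric truncation distribution where the paper derives a variance-optimal one, and the function and parameter names (stochastic_spectral_sum, max_deg, p) are this sketch's own.

    import numpy as np

    def chebyshev_coeffs(f, deg):
        """Chebyshev interpolation coefficients c_0..c_deg of f on [-1, 1],
        computed at the Chebyshev-Gauss nodes."""
        k = np.arange(deg + 1)
        nodes = np.cos(np.pi * (k + 0.5) / (deg + 1))
        fvals = f(nodes)
        c = np.array([np.sum(fvals * np.cos(np.pi * j * (k + 0.5) / (deg + 1)))
                      for j in range(deg + 1)]) * 2.0 / (deg + 1)
        c[0] /= 2.0
        return c

    def stochastic_spectral_sum(A, f, max_deg=50, p=0.9, rng=None):
        """One-sample estimate of the spectral-sum tr(f(A)) = sum_i f(lambda_i)
        for a symmetric A whose eigenvalues lie in [-1, 1].

        Hutchinson probe (Rademacher v) + Chebyshev three-term recurrence,
        truncated at a random degree N and reweighted by 1 / P(N >= j), so the
        estimate is unbiased for the degree-max_deg Chebyshev interpolant of f.
        A geometric truncation distribution is used here for simplicity; the
        paper instead derives a variance-optimal choice.
        """
        rng = rng or np.random.default_rng()
        c = chebyshev_coeffs(f, max_deg)
        N = min(rng.geometric(1.0 - p) - 1, max_deg)  # P(N >= j) = p**j for j <= max_deg
        v = rng.choice([-1.0, 1.0], size=A.shape[0])  # Rademacher probe vector
        w_prev, w = v, A @ v                          # T_0(A) v and T_1(A) v
        est = (c[0] / p**0) * (v @ v)
        if N >= 1:
            est += (c[1] / p**1) * (v @ w)
        for j in range(2, N + 1):
            w_prev, w = w, 2.0 * (A @ w) - w_prev     # T_j(A) v via the recurrence
            est += (c[j] / p**j) * (v @ w)
        return est

    # Usage sketch: estimate tr(exp(A)) and compare with the exact value.
    rng = np.random.default_rng(0)
    B = rng.standard_normal((200, 200))
    A = (B + B.T) / np.linalg.norm(B + B.T, 2)        # symmetric, spectrum in [-1, 1]
    est = np.mean([stochastic_spectral_sum(A, np.exp, rng=rng) for _ in range(3000)])
    exact = np.sum(np.exp(np.linalg.eigvalsh(A)))
    print(f"estimate {est:.2f} vs exact {exact:.2f}")

The 1 / P(N >= j) reweighting is what keeps the randomly truncated series unbiased; the paper's contribution is choosing the truncation distribution to minimize the variance of exactly this kind of estimator, which governs how fast and stably the resulting stochastic gradient method converges.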
Appears in Collection: RIMS Conference Papers
Files in This Item: There are no files associated with this item.
