On the Convergence Analysis of the Optimized Gradient Method

Cited 18 times in Web of Science; cited 0 times in Scopus
DC Field / Value / Language
dc.contributor.author: Kim, Donghwan (ko)
dc.contributor.author: Fessler, Jeffrey A. (ko)
dc.date.accessioned: 2018-09-18T05:53:37Z
dc.date.available: 2018-09-18T05:53:37Z
dc.date.created: 2018-08-21
dc.date.issued: 2017-01
dc.identifier.citation: JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, v.172, no.1, pp.187 - 205
dc.identifier.issn: 0022-3239
dc.identifier.uri: http://hdl.handle.net/10203/245448
dc.description.abstract: This paper considers the problem of unconstrained minimization of smooth convex functions having Lipschitz continuous gradients with a known Lipschitz constant. We recently proposed the optimized gradient method for this problem and showed that its worst-case convergence bound for the cost function decrease is half that of Nesterov's fast gradient method, while having a similarly efficient practical implementation. Drori recently showed that the optimized gradient method has optimal complexity for the cost function decrease over the general class of first-order methods. This optimality makes it important to study the convergence properties of the optimized gradient method fully. The previous worst-case convergence bound for the optimized gradient method was derived only for the last iterate of a secondary sequence. This paper provides an analytic convergence bound for the primary sequence generated by the optimized gradient method. We then discuss additional convergence properties of the optimized gradient method, including the interesting fact that it has two types of worst-case functions: a piecewise affine-quadratic function and a quadratic function. These results help complete the theory of an optimal first-order method for smooth convex minimization.
dc.language: English
dc.publisher: SPRINGER/PLENUM PUBLISHERS
dc.subject: MINIMIZATION
dc.title: On the Convergence Analysis of the Optimized Gradient Method
dc.type: Article
dc.identifier.wosid: 000392331600011
dc.identifier.scopusid: 2-s2.0-84990831625
dc.type.rims: ART
dc.citation.volume: 172
dc.citation.issue: 1
dc.citation.beginningpage: 187
dc.citation.endingpage: 205
dc.citation.publicationname: JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS
dc.identifier.doi: 10.1007/s10957-016-1018-7
dc.contributor.localauthor: Kim, Donghwan
dc.contributor.nonIdAuthor: Fessler, Jeffrey A.
dc.description.isOpenAccess: N
dc.type.journalArticle: Article
dc.subject.keywordAuthor: First-order algorithms
dc.subject.keywordAuthor: Optimized gradient method
dc.subject.keywordAuthor: Convergence bound
dc.subject.keywordAuthor: Smooth convex minimization
dc.subject.keywordAuthor: Worst-case performance analysis
dc.subject.keywordPlus: MINIMIZATION
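
The abstract above concerns the optimized gradient method (OGM) for minimizing a smooth convex function f whose gradient is Lipschitz continuous with known constant L. As a rough illustration of the kind of first-order iteration involved, below is a minimal Python sketch of an OGM-style recursion: a gradient step of length 1/L followed by a momentum correction driven by factors theta_i, with a modified factor on the final iteration. The function names, the labeling of x_i as the primary sequence and y_i as the secondary sequence, and the toy quadratic test problem are assumptions made for this sketch, not details taken from the paper.

    import numpy as np

    def ogm_sketch(grad, x0, L, N):
        """Sketch of an OGM-style iteration (assumed form, for illustration only).

        grad : callable returning the gradient of the smooth convex cost f
        x0   : starting point (1-D NumPy array)
        L    : known Lipschitz constant of the gradient
        N    : number of iterations (the final momentum factor depends on N)
        """
        x = np.asarray(x0, dtype=float).copy()   # assumed "primary" iterate x_i
        y = x.copy()                             # assumed "secondary" iterate y_i
        theta = 1.0
        for i in range(N):
            y_next = x - grad(x) / L             # plain gradient step, step size 1/L
            if i < N - 1:
                theta_next = (1.0 + np.sqrt(1.0 + 4.0 * theta ** 2)) / 2.0
            else:
                # larger momentum factor used only on the last iteration
                theta_next = (1.0 + np.sqrt(1.0 + 8.0 * theta ** 2)) / 2.0
            # momentum step combining the current and previous iterates
            x = (y_next
                 + (theta - 1.0) / theta_next * (y_next - y)
                 + theta / theta_next * (y_next - x))
            y, theta = y_next, theta_next
        return x, y

    # Toy usage on a convex quadratic f(x) = 0.5 * x.T @ A @ x, whose gradient
    # is A @ x and whose Lipschitz constant is the largest eigenvalue of A.
    if __name__ == "__main__":
        A = np.diag([1.0, 10.0, 100.0])
        x_final, y_final = ogm_sketch(lambda x: A @ x, np.ones(3), L=100.0, N=50)
        print(x_final)

The 1/L step size and the theta_i recursion mirror Nesterov-type accelerated methods; the extra theta/theta_next * (y_next - x) term is, roughly, what distinguishes an OGM-style update and yields the improved worst-case bound discussed in the abstract.
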
Appears in Collection
MA-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.