GENERALIZING THE OPTIMIZED GRADIENT METHOD FOR SMOOTH CONVEX MINIMIZATION

Cited 4 times in Web of Science; cited 0 times in Scopus
DC Field: Value (Language)
dc.contributor.author: Kim, Donghwan (ko)
dc.contributor.author: Fessler, Jeffrey A. (ko)
dc.date.accessioned: 2018-09-18T05:53:26Z
dc.date.available: 2018-09-18T05:53:26Z
dc.date.created: 2018-08-21
dc.date.issued: 2018
dc.identifier.citation: SIAM JOURNAL ON OPTIMIZATION, v.28, no.2, pp.1920 - 1950
dc.identifier.issn: 1052-6234
dc.identifier.uri: http://hdl.handle.net/10203/245441
dc.description.abstract: This paper generalizes the optimized gradient method (OGM) [Y. Drori and M. Teboulle, Math. Program., 145 (2014), pp. 451-482], [D. Kim and J. A. Fessler, Math. Program., 159 (2016), pp. 81-107], [D. Kim and J. A. Fessler, J. Optim. Theory Appl., 172 (2017), pp. 187-205], which achieves the optimal worst-case cost function bound of first-order methods for smooth convex minimization [Y. Drori, J. Complexity, 39 (2017), pp. 1-16]. Specifically, this paper studies a generalized formulation of OGM and analyzes its worst-case rates in terms of both the function value and the norm of the function gradient. This paper also develops a new algorithm, called OGM-OG, that belongs to the generalized family of OGM and has the best known analytical worst-case bound, with rate O(1/N^1.5), on the decrease of the gradient norm among fixed-step first-order methods. This paper also proves that Nesterov's fast gradient method [Y. Nesterov, Dokl. Akad. Nauk USSR, 269 (1983), pp. 543-547], [Y. Nesterov, Math. Program., 103 (2005), pp. 127-152] has an O(1/N^1.5) worst-case gradient norm rate, but with a larger constant than OGM-OG. The proof is based on the worst-case analysis called the Performance Estimation Problem in [Y. Drori and M. Teboulle, Math. Program., 145 (2014), pp. 451-482].
dc.language: English
dc.publisher: SIAM PUBLICATIONS
dc.subject: ITERATIVE SHRINKAGE/THRESHOLDING ALGORITHM
dc.subject: WORST-CASE PERFORMANCE
dc.subject: 1ST-ORDER METHODS
dc.subject: CONVERGENCE
dc.title: GENERALIZING THE OPTIMIZED GRADIENT METHOD FOR SMOOTH CONVEX MINIMIZATION
dc.type: Article
dc.identifier.wosid: 000436991600038
dc.identifier.scopusid: 2-s2.0-85049693776
dc.type.rims: ART
dc.citation.volume: 28
dc.citation.issue: 2
dc.citation.beginningpage: 1920
dc.citation.endingpage: 1950
dc.citation.publicationname: SIAM JOURNAL ON OPTIMIZATION
dc.identifier.doi: 10.1137/17M112124X
dc.contributor.localauthor: Kim, Donghwan
dc.contributor.nonIdAuthor: Fessler, Jeffrey A.
dc.description.isOpenAccess: N
dc.type.journalArticle: Article
dc.subject.keywordAuthor: first-order algorithms
dc.subject.keywordAuthor: gradient methods
dc.subject.keywordAuthor: smooth convex minimization
dc.subject.keywordAuthor: worst-case performance analysis
dc.subject.keywordPlus: ITERATIVE SHRINKAGE/THRESHOLDING ALGORITHM
dc.subject.keywordPlus: WORST-CASE PERFORMANCE
dc.subject.keywordPlus: 1ST-ORDER METHODS
dc.subject.keywordPlus: CONVERGENCE
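The abstract describes OGM, a fixed-step first-order method for smooth convex minimization. For context, here is a minimal Python sketch of the standard OGM iteration from the cited Kim and Fessler (2016) paper, assuming an L-smooth convex objective accessed through a gradient oracle; the quadratic test problem and all parameter choices below are illustrative, not taken from this paper.

```python
import numpy as np

def ogm(grad, L, x0, N):
    """Sketch of the Optimized Gradient Method (Kim & Fessler, 2016).

    grad -- gradient oracle of an L-smooth convex function f
    L    -- Lipschitz constant of grad
    x0   -- starting point
    N    -- number of iterations
    """
    x = y = np.asarray(x0, dtype=float)
    theta = 1.0
    for k in range(N):
        y_new = x - grad(x) / L                      # plain gradient step
        if k < N - 1:
            theta_new = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:                                        # modified final step factor
            theta_new = (1 + np.sqrt(1 + 8 * theta**2)) / 2
        # Momentum uses both the previous y and the previous x
        # (the extra x-term is what distinguishes OGM from Nesterov's FGM).
        x = (y_new
             + (theta - 1) / theta_new * (y_new - y)
             + theta / theta_new * (y_new - x))
        y, theta = y_new, theta_new
    return x

# Illustrative example: minimize f(x) = 0.5 * x^T A x, whose minimum is 0 at x = 0.
A = np.diag([1.0, 10.0, 100.0])
L = 100.0                                            # largest eigenvalue of A
x = ogm(lambda v: A @ v, L, np.array([1.0, 1.0, 1.0]), 50)
```

Dropping the final `theta / theta_new * (y_new - x)` term recovers Nesterov's fast gradient method, which the abstract compares against; both achieve the accelerated O(1/N^2) function-value rate, while OGM improves the constant.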
Appears in Collection
MA-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
