DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, Donghwan | ko |
dc.contributor.author | Fessler, Jeffrey A. | ko |
dc.date.accessioned | 2018-09-18T05:53:26Z | - |
dc.date.available | 2018-09-18T05:53:26Z | - |
dc.date.created | 2018-08-21 | - |
dc.date.issued | 2018 | - |
dc.identifier.citation | SIAM JOURNAL ON OPTIMIZATION, v.28, no.2, pp.1920 - 1950 | - |
dc.identifier.issn | 1052-6234 | - |
dc.identifier.uri | http://hdl.handle.net/10203/245441 | - |
dc.description.abstract | This paper generalizes the optimized gradient method (OGM) [Y. Drori and M. Teboulle, Math. Program., 145 (2014), pp. 451-482], [D. Kim and J. A. Fessler, Math. Program., 159 (2016), pp. 81-107], [D. Kim and J. A. Fessler, J. Optim. Theory Appl., 172 (2017), pp. 187-205] that achieves the optimal worst-case cost function bound of first-order methods for smooth convex minimization [Y. Drori, J. Complexity, 39 (2017), pp. 1-16]. Specifically, this paper studies a generalized formulation of OGM and analyzes its worst-case rates in terms of both the function value and the norm of the function gradient. This paper also develops a new algorithm called OGM-OG that is in the generalized family of OGM and that has the best known analytical worst-case bound with rate O(1/N^1.5) on the decrease of the gradient norm among fixed-step first-order methods. This paper also proves that Nesterov's fast gradient method [Y. Nesterov, Dokl. Akad. Nauk. USSR, 269 (1983), pp. 543-547], [Y. Nesterov, Math. Program., 103 (2005), pp. 127-152] has an O(1/N^1.5) worst-case gradient norm rate but with constant larger than OGM-OG. The proof is based on the worst-case analysis called the Performance Estimation Problem in [Y. Drori and M. Teboulle, Math. Program., 145 (2014), pp. 451-482]. | - |
dc.language | English | - |
dc.publisher | SIAM PUBLICATIONS | - |
dc.subject | ITERATIVE SHRINKAGE/THRESHOLDING ALGORITHM | - |
dc.subject | WORST-CASE PERFORMANCE | - |
dc.subject | 1ST-ORDER METHODS | - |
dc.subject | CONVERGENCE | - |
dc.title | GENERALIZING THE OPTIMIZED GRADIENT METHOD FOR SMOOTH CONVEX MINIMIZATION | - |
dc.type | Article | - |
dc.identifier.wosid | 000436991600038 | - |
dc.identifier.scopusid | 2-s2.0-85049693776 | - |
dc.type.rims | ART | - |
dc.citation.volume | 28 | - |
dc.citation.issue | 2 | - |
dc.citation.beginningpage | 1920 | - |
dc.citation.endingpage | 1950 | - |
dc.citation.publicationname | SIAM JOURNAL ON OPTIMIZATION | - |
dc.identifier.doi | 10.1137/17M112124X | - |
dc.contributor.localauthor | Kim, Donghwan | - |
dc.contributor.nonIdAuthor | Fessler, Jeffrey A. | - |
dc.description.isOpenAccess | N | - |
dc.type.journalArticle | Article | - |
dc.subject.keywordAuthor | first-order algorithms | - |
dc.subject.keywordAuthor | gradient methods | - |
dc.subject.keywordAuthor | smooth convex minimization | - |
dc.subject.keywordAuthor | worst-case performance analysis | - |
dc.subject.keywordPlus | ITERATIVE SHRINKAGE/THRESHOLDING ALGORITHM | - |
dc.subject.keywordPlus | WORST-CASE PERFORMANCE | - |
dc.subject.keywordPlus | 1ST-ORDER METHODS | - |
dc.subject.keywordPlus | CONVERGENCE | - |
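
The abstract above refers to the optimized gradient method (OGM) of Kim and Fessler (2016). As context, here is a minimal sketch of the standard (non-generalized) OGM iteration for an L-smooth convex function; the function name `ogm`, the quadratic test problem, and all parameter choices are illustrative assumptions, and the sketch does not reproduce the generalized OGM family or the OGM-OG variant developed in the paper itself.

```python
import numpy as np

def ogm(grad, x0, L, N):
    """Sketch of the standard OGM iteration (Kim & Fessler, 2016) for an
    L-smooth convex f, run for a fixed number of iterations N.

    grad : callable returning the gradient of f
    x0   : starting point
    L    : Lipschitz constant of the gradient of f
    N    : number of iterations
    """
    x = y = np.asarray(x0, dtype=float)
    theta = 1.0
    for k in range(N):
        y_next = x - grad(x) / L                        # plain gradient step
        if k < N - 1:
            theta_next = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:                                           # larger factor used on the final OGM step
            theta_next = (1 + np.sqrt(1 + 8 * theta**2)) / 2
        # OGM momentum: Nesterov-style term plus an additional correction term
        x = (y_next
             + (theta - 1) / theta_next * (y_next - y)
             + theta / theta_next * (y_next - x))
        y, theta = y_next, theta_next
    return y

# Illustrative usage on a small convex quadratic f(x) = 0.5 x'Ax - b'x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x_hat = ogm(lambda x: A @ x - b, x0=np.zeros(2), L=np.linalg.eigvalsh(A).max(), N=100)
print(x_hat, np.linalg.solve(A, b))  # the iterate should be close to the exact minimizer
```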