Nonlinear conjugate gradient for smooth convex functions

The method of nonlinear conjugate gradients (NCG) is widely used in practice for unconstrained optimization, but it satisfies weak complexity bounds at best when applied to smooth convex functions. In contrast, Nesterov's accelerated gradient (AG) method is optimal up to constant factors for this class. However, when specialized to quadratic functions, conjugate gradient is optimal in a strong sense among function-gradient methods. Therefore, there is seemingly a gap in the menu of available algorithms: NCG, the optimal algorithm for quadratic functions that also exhibits good practical performance for general functions, has poor complexity bounds compared to AG. We propose an NCG method called C+AG ("conjugate plus accelerated gradient") to close this gap; that is, it is optimal for quadratic functions and still satisfies the best possible complexity bound for more general smooth convex functions. It takes conjugate gradient steps until insufficient progress is made, at which time it switches to accelerated gradient steps, and later retries conjugate gradient. The proposed method has the following theoretical properties: (i) it is identical to linear conjugate gradient (and hence terminates finitely) if the objective function is quadratic; (ii) its running-time bound is O(ε^(-1/2)) gradient evaluations for an L-smooth convex function, where ε is the desired residual reduction; (iii) its running-time bound is O(√(L/ℓ) ln(1/ε)) if the function is both L-smooth and ℓ-strongly convex. We also conjecture and outline a proof that a variant of the method has the property: (iv) it is n-step quadratically convergent for a function whose second derivative is smooth and invertible at the optimizer. Note that the bounds in (ii) and (iii) match AG and are the best possible, i.e., they match lower bounds up to constant factors for the classes of functions under consideration. On the other hand, (i) and (iv) match NCG. In computational tests, the function-gradient evaluation count for the C+AG method typically behaves as whichever is better of AG or classical NCG; in some test cases it outperforms both.
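The switching scheme described in the abstract can be sketched in a few lines of Python. This is an illustration under stated assumptions only: the sufficient-progress test, the backtracking line search, the Polak-Ribière+ update, and the retry schedule below are hypothetical stand-ins, not the criteria actually used by Karimi and Vavasis.

import numpy as np

def c_plus_ag_sketch(f, grad, x0, L, max_iter=500, tol=1e-8):
    # Hedged sketch of the C+AG idea: run nonlinear CG until a
    # (hypothetical) sufficient-progress test fails, fall back to
    # Nesterov accelerated-gradient (AG) steps, and later retry CG.
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g                    # initial CG direction = steepest descent
    y = x.copy()              # AG auxiliary point
    t = 1.0                   # AG momentum parameter
    mode = "cg"
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if mode == "cg":
            if g @ d >= 0:    # safeguard: restart if d is not a descent direction
                d = -g
            # Backtracking line search (assumption; the paper uses its own search).
            alpha, fx = 1.0 / L, f(x)
            while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d):
                alpha *= 0.5
            x_new = x + alpha * d
            # Hypothetical progress test: the CG step must achieve at least the
            # decrease guaranteed by a plain 1/L gradient step, else switch to AG.
            if f(x_new) <= fx - 0.5 / L * (g @ g):
                g_new = grad(x_new)
                beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # Polak-Ribiere+
                d = -g_new + beta * d
                x, g = x_new, g_new
            else:
                mode, y, t = "ag", x.copy(), 1.0
        else:
            # One standard Nesterov AG step for an L-smooth convex function.
            # (The extra grad(x) below, used only for the stopping test, is
            # wasteful; a careful implementation would avoid it.)
            x_new = y - grad(y) / L
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + (t - 1.0) / t_new * (x_new - x)
            x, t = x_new, t_new
            g = grad(x)
            if (k + 1) % 10 == 0:   # periodically retry CG (arbitrary schedule)
                mode, d = "cg", -g
    return x

# Example on a convex quadratic; per property (i), a CG branch with exact
# line search would reduce to linear CG here, so the CG mode should dominate.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = c_plus_ag_sketch(f, grad, np.zeros(2), L=np.linalg.eigvalsh(A).max())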

Metadata
Author: Sahar Karimi, Stephen A. Vavasis
DOI: https://doi.org/10.1007/s12532-024-00254-y
ISSN: 1867-2949
Parent Title (English): Mathematical Programming Computation
Publisher: Springer Science and Business Media LLC
Document Type: Article
Language: English
Year of Completion: 2024
Volume: 16
Issue: 2
Number of Pages: 26
First Page: 229
Last Page: 254