
The gradient projection method: Is the Polyak adaptive stepsize rule optimal?

under review
  • Not always! This is our answer to the question of whether the Polyak adaptive stepsize rule in the gradient projection method is optimal. The answer is based on revisiting the subgradient projection method of Polyak [USSR Computational Mathematics and Mathematical Physics 9 (1969)] for smooth convex minimization problems whose objective function possesses a geometric property called flatness. Our results show that the method can be more flexible (the effective range of the parameter controlling the stepsize can be wider) and admits sharper convergence rates. Applications to split feasibility/equality problems are presented, deriving for the first time the O(1/k) rate of convergence for the adaptive CQ method. A theoretical guarantee of linear convergence of the gradient descent method with adaptive stepsizes for Google PageRank is also provided. In addition, numerical experiments are designed to spot the "optimal" stepsize and to compare with other basic gradient methods.
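
For readers unfamiliar with the method discussed in the abstract, the following is a minimal illustrative sketch (not the authors' code) of the gradient projection method with the classical Polyak adaptive stepsize t_k = (f(x_k) - f*)/||∇f(x_k)||², applied to a hypothetical toy least-squares problem over the nonnegative orthant; the function names and test data are assumptions made for illustration only.

```python
# Sketch: projected gradient method with the classical Polyak stepsize.
# Assumes the optimal value f_star is known, as the Polyak rule requires.
import numpy as np

def gradient_projection_polyak(f, grad, project, x0, f_star, max_iter=500, tol=1e-10):
    """Minimize f over a convex set via projected gradient steps with Polyak stepsizes."""
    x = project(x0)
    for _ in range(max_iter):
        g = grad(x)
        gap = f(x) - f_star
        if gap <= tol or np.dot(g, g) == 0.0:
            break
        t = gap / np.dot(g, g)      # Polyak adaptive stepsize
        x = project(x - t * g)      # gradient step followed by projection
    return x

if __name__ == "__main__":
    # Hypothetical toy problem: min 0.5*||A x - b||^2 subject to x >= 0.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = A @ np.abs(rng.standard_normal(5))
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad = lambda x: A.T @ (A @ x - b)
    project = lambda x: np.maximum(x, 0.0)   # projection onto the nonnegative orthant
    x_hat = gradient_projection_polyak(f, grad, project, np.zeros(5), f_star=0.0)
    print("objective at computed solution:", f(x_hat))
```

The paper itself studies when and how this stepsize rule can be relaxed; the sketch above only shows the baseline rule, not the authors' proposed variants.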
Metadata
Author: Thi Huong Vu, Thorsten Koch, Hong-Kun Xu
Document Type: Other
Year of first publication: 2024