CONVERGENCE RATES OF INERTIAL FORWARD-BACKWARD ALGORITHMS

Type of publication: Journal Article
Year of publication: 2018
Authors: Attouch H, Cabot A
Journal: SIAM Journal on Optimization
Volume: 28
Pagination: 849-874
Type of article: Article
ISSN: 1052-6234
Keywords: accelerated gradient method, FISTA, inertial forward-backward algorithms, Nesterov method, proximal-based methods, structured convex optimization, vanishing damping
Abstract

In a Hilbert space $H$, given a general sequence $(\alpha_k)$ of nonnegative numbers, we analyze the convergence properties of the inertial forward-backward algorithm (IFB)
$$\begin{cases} y_k = x_k + \alpha_k (x_k - x_{k-1}), \\ x_{k+1} = \operatorname{prox}_{s\Psi}\bigl(y_k - s\nabla\Phi(y_k)\bigr), \end{cases}$$
where $\Psi : H \to \mathbb{R} \cup \{+\infty\}$ is a proper lower-semicontinuous convex function and $\Phi : H \to \mathbb{R}$ is a differentiable convex function whose gradient is Lipschitz continuous. Various options for the sequence $(\alpha_k)$ have been considered in the literature. Among them, the Nesterov choice leads to the FISTA algorithm and accelerates the convergence of the values from $O(1/k)$ to $O(1/k^2)$. Several variants are used to guarantee the convergence of the iterates or to improve the rate of convergence for the values. For the design of fast optimization methods, the tuning of the sequence $(\alpha_k)$ is a subtle issue, which we address in this paper in a general setting. We show that the convergence rate of the algorithm can be obtained simply by analyzing the sequence of positive real numbers $(\alpha_k)$. In addition to the case $\alpha_k = 1 - \alpha/k$ with $\alpha \ge 3$, our results apply equally well to $\alpha_k = 1 - \alpha/k^r$ with an exponent $0 < r < 1$, and to Polyak's heavy ball method. Thus, we unify most of the existing results based on Nesterov's accelerated gradient method; in the process, we improve some of them and discover new ones.
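To make the scheme concrete, the following is a minimal Python sketch of the IFB iteration from the abstract, applied to a hypothetical LASSO instance in which the smooth part Phi is a least-squares term and the nonsmooth part Psi is an l1 penalty. The helper names prox_l1 and ifb, the data (A, b, lam), the iteration budget, and the step size s = 1/L are illustrative assumptions, not taken from the paper.

import numpy as np

# Sketch of the IFB iteration
#   y_k     = x_k + alpha_k * (x_k - x_{k-1})
#   x_{k+1} = prox_{s Psi}(y_k - s * grad Phi(y_k))
# on an assumed LASSO instance: Phi(x) = 0.5*||A x - b||^2 (gradient Lipschitz
# with constant L = ||A||_2^2) and Psi(x) = lam*||x||_1 (prox = soft thresholding).

def prox_l1(v, t):
    """Proximal map of t*||.||_1 (componentwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ifb(A, b, lam, alpha, n_iter=500):
    """Run the inertial forward-backward iteration for a given sequence alpha(k)."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of grad Phi
    s = 1.0 / L                        # step size s <= 1/L
    x_prev = x = np.zeros(A.shape[1])
    for k in range(1, n_iter + 1):
        y = x + alpha(k) * (x - x_prev)                # inertial extrapolation
        grad = A.T @ (A @ y - b)                       # grad Phi(y)
        x_prev, x = x, prox_l1(y - s * grad, s * lam)  # forward-backward step
    return x

# Three damping sequences (alpha_k) mentioned in the abstract, clipped at 0 so
# that the extrapolation coefficients stay nonnegative for small k:
nesterov_like = lambda k, a=3.0: max(0.0, 1.0 - a / k)            # alpha_k = 1 - alpha/k
subcritical   = lambda k, a=3.0, r=0.5: max(0.0, 1.0 - a / k**r)  # alpha_k = 1 - alpha/k^r
heavy_ball    = lambda k, a=0.8: a                                # constant alpha (Polyak)

# Example usage on random data:
# rng = np.random.default_rng(0)
# A, b = rng.standard_normal((50, 100)), rng.standard_normal(50)
# x_star = ifb(A, b, lam=0.1, alpha=nesterov_like)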

DOI: 10.1137/17M1114739