Convergence rate of a relaxed inertial proximal algorithm for convex minimization

Title: Convergence rate of a relaxed inertial proximal algorithm for convex minimization
Publication type: Journal Article
Year of publication: 2020
Authors: Attouch H, Cabot A
Journal: OPTIMIZATION
Volume: 69
Pages: 1281-1312
Date published: 2 June 2020
Article type: Article
ISSN: 0233-1934
Keywords: Inertial proximal method, Lyapunov analysis, maximally monotone operators, nonsmooth convex minimization, relaxation
Abstract

In a Hilbert space setting, the authors recently introduced a general class of relaxed inertial proximal algorithms for solving monotone inclusions. In this paper, we specialize this study to the case of nonsmooth convex minimization problems. We obtain convergence rates for the function values that are comparable to those obtained with the Nesterov accelerated gradient method. The joint adjustment of the inertial, relaxation and proximal terms plays a central role. In doing so, we highlight inertial proximal algorithms that converge for general monotone inclusions and that, in the case of convex minimization, yield fast worst-case convergence rates for the values.
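To make the type of scheme concrete, below is a minimal Python sketch of a relaxed inertial proximal iteration of the general form discussed above, applied to the l1-norm as a toy objective. The constant inertial coefficient alpha, relaxation parameter rho, step size lam, the soft-thresholding prox and all function names are illustrative assumptions for this sketch, not the parameter choices or the precise scheme analysed in the paper.

    import numpy as np

    def prox_l1(v, lam):
        """Proximal operator of lam * ||.||_1 (soft-thresholding), used as a toy example."""
        return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

    def relaxed_inertial_proximal(prox, x0, n_iter=200, alpha=0.3, rho=1.0, lam=1.0):
        """
        Illustrative sketch of a relaxed inertial proximal iteration:
            y_k     = x_k + alpha_k * (x_k - x_{k-1})                 (inertial step)
            x_{k+1} = (1 - rho_k) * y_k + rho_k * prox_{lam_k f}(y_k) (relaxed proximal step)
        Constant alpha, rho, lam are used here purely for illustration; the paper's
        analysis concerns their joint, possibly iteration-dependent, tuning.
        """
        x_prev = np.copy(x0)
        x = np.copy(x0)
        for _ in range(n_iter):
            y = x + alpha * (x - x_prev)                       # inertia (extrapolation)
            x_prev, x = x, (1 - rho) * y + rho * prox(y, lam)  # relaxation of the prox step
        return x

    # Example: minimize f(x) = ||x||_1 from a random start (the minimizer is 0).
    print(relaxed_inertial_proximal(prox_l1, np.random.randn(5)))

With rho = 1 the scheme reduces to a purely inertial proximal point iteration; rho < 1 under-relaxes the proximal step, and the abstract's point is that inertia, relaxation and the proximal step size must be tuned jointly to retain fast convergence of the values.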

DOI: 10.1080/02331934.2019.1696337