An accelerated forward-backward algorithm with a new linesearch for convex minimization problems and its applications

Bibliographic Details
Main Authors: Adisak Hanjing, Pachara Jailoka, Suthep Suantai
Format: Article
Language: English
Published: AIMS Press 2021-04-01
Series: AIMS Mathematics
Online Access: https://www.aimspress.com/article/doi/10.3934/math.2021363?viewType=HTML
Description
Summary: We study a convex minimization problem for the sum of two convex functions in the setting of a Hilbert space. Many optimization methods have been developed under the assumption that the gradient of the smooth function is Lipschitz continuous, with stepsizes that depend on the Lipschitz constant; however, computing this constant is generally not easy in practice. In this work, using a new modification of the linesearches of Cruz and Nghia [7] and Kankam et al. [14] together with an inertial technique, we introduce an accelerated algorithm that requires no Lipschitz continuity assumption on the gradient. We then establish a weak convergence result for the proposed method. As applications, we analyze our method on an image restoration problem and a regression problem. Numerical experiments show that our method is more efficient than well-known methods in the literature.
ISSN: 2473-6988
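
The summary above describes an inertial forward-backward method whose stepsize is chosen by a backtracking linesearch rather than from a known Lipschitz constant of the gradient. The Python sketch below illustrates that general idea only; it is not the authors' exact algorithm from the paper, and the linesearch condition, the inertial sequence, and the LASSO-type test problem are assumptions chosen for the example.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (used here for g = lam * ||x||_1).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linesearch_step(y, grad_f, prox_g, sigma=1.0, theta=0.5, delta=0.49):
    # Backtracking linesearch in the spirit of Cruz and Nghia: shrink gamma
    # until  gamma * ||grad_f(z) - grad_f(y)|| <= delta * ||z - y||,
    # where z = prox_{gamma g}(y - gamma * grad_f(y)).
    # No Lipschitz constant of grad_f is required.
    gamma = sigma
    gy = grad_f(y)
    while True:
        z = prox_g(y - gamma * gy, gamma)
        if gamma * np.linalg.norm(grad_f(z) - gy) <= delta * np.linalg.norm(z - y):
            return z
        gamma *= theta

def inertial_forward_backward(grad_f, prox_g, x0, n_iter=200):
    # Inertial forward-backward iteration: extrapolate from the previous two
    # iterates, then take one linesearch-based forward-backward step.
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, n_iter + 1):
        alpha = (k - 1.0) / (k + 2.0)    # illustrative inertial parameter
        y = x + alpha * (x - x_prev)     # inertial extrapolation
        x_prev, x = x, linesearch_step(y, grad_f, prox_g)
    return x

# Usage on a LASSO-type regression problem: minimize 0.5*||Ax - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
b = rng.standard_normal(50)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, g: soft_threshold(v, lam * g)
x_hat = inertial_forward_backward(grad_f, prox_g, np.zeros(100))

Because the stepsize is found by trial at every iteration, the scheme runs without any estimate of the Lipschitz constant of grad_f, which is the practical point emphasized in the summary.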