An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property


Bibliographic Details
Main Authors: Zabidin Salleh, Ahmad Alhawarat
Format: Article
Language: English
Published: SpringerOpen, 2016-04-01
Series: Journal of Inequalities and Applications
Subjects:
Online Access: http://link.springer.com/article/10.1186/s13660-016-1049-5
Description
Summary: The conjugate gradient (CG) method is one of the most popular methods for solving nonlinear unconstrained optimization problems. The Hestenes-Stiefel (HS) CG formula is considered one of the most efficient formulas developed in this century. Moreover, the HS coefficient is related to the conjugacy condition regardless of the line search method used. However, the HS parameter may fail to yield global convergence of the CG method under the Wolfe-Powell line search if the descent condition is not satisfied. In this paper, we use the original HS CG formula with a mild condition to construct a CG method that restarts using the negative gradient direction. The convergence and descent properties under the strong Wolfe-Powell (SWP) and weak Wolfe-Powell (WWP) line searches are established. Under this condition, the HS formula is guaranteed to be non-negative, its value is bounded, and the number of restarts is not excessive. Numerical computations with the SWP line search on standard optimization test problems demonstrate the robustness and efficiency of the new CG parameter in comparison with recent and classical CG formulas. An example illustrates the benefit of using different initial points to obtain different solutions for multimodal optimization functions.
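To make the idea in the abstract concrete, the following is a minimal sketch (not the authors' exact method) of a nonlinear CG iteration using the Hestenes-Stiefel coefficient with a negative-gradient restart. It assumes a simple backtracking Armijo line search in place of the strong Wolfe-Powell search analyzed in the paper; the function names `hs_cg_restart`, `f`, and `grad` are illustrative.

```python
import numpy as np

def hs_cg_restart(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Hestenes-Stiefel coefficient and a
    negative-gradient restart whenever the direction fails to descend."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction is the steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (a stand-in for the
        # strong Wolfe-Powell search used in the paper).
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g          # gradient difference y_k = g_{k+1} - g_k
        denom = d @ y
        # Hestenes-Stiefel coefficient, kept non-negative; the paper's
        # mild condition similarly keeps beta non-negative and bounded.
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        beta = max(beta, 0.0)
        d = -g_new + beta * d
        if g_new @ d >= 0:     # not a descent direction: restart with -g
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer is the solution of A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = hs_cg_restart(f, grad, np.zeros(2))
```

The restart with the negative gradient is the safeguard the abstract refers to: whenever the HS update would produce a non-descent direction, the method falls back to steepest descent, which keeps the descent property under the weaker line search conditions.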
ISSN:1029-242X