A New Preconditioned Inexact Line-Search Technique for Unconstrained Optimization

Bibliographic Details
Main Authors: Abbas Al-Bayati, Ivan Latif
Format: Article
Language: Arabic
Published: Mosul University, 2012-12-01
Series: Al-Rafidain Journal of Computer Sciences and Mathematics
Online Access: https://csmj.mosuljournals.com/article_163698_dc3a6ed4fb8e83e125d9c8b352b1d5ec.pdf
Description
Summary: In this paper, we study the global convergence properties of a new class of preconditioned conjugate gradient descent algorithms when applied to convex, non-linear, unconstrained optimization functions. We assume that a new inexact line-search rule, similar to the Armijo line-search rule, is used; it provides an estimation formula for choosing a large step-size at each iteration, and the same formula is used to determine the search direction. A new preconditioned conjugate gradient search direction replaces the conjugate gradient descent direction of the ZIR-algorithm. Numerical results on twenty-five well-known test functions with various dimensions show that the new inexact line-search and the new preconditioned conjugate gradient search directions are efficient for solving unconstrained non-linear optimization problems in many situations.
ISSN: 1815-4816, 2311-7990
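
The summary above describes combining an Armijo-like inexact line search with a preconditioned conjugate gradient direction. The following is only a minimal illustrative sketch of that general structure, using a standard Armijo backtracking rule, a Fletcher-Reeves-style beta, and a diagonal (Jacobi) preconditioner; the paper's own step-size estimation formula and ZIR-based direction are not reproduced here, and all function and parameter names are assumptions for illustration.

```python
# Hypothetical sketch: preconditioned nonlinear conjugate gradient with an
# Armijo-type backtracking line search. Not the paper's exact algorithm.
import numpy as np

def armijo_step(f, x, d, g, alpha0=1.0, rho=0.5, c1=1e-4, max_backtracks=30):
    """Backtrack from an initial trial step until the Armijo condition
    f(x + alpha*d) <= f(x) + c1*alpha*(g.T d) holds."""
    alpha, fx, slope = alpha0, f(x), float(g @ d)
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + c1 * alpha * slope:
            return alpha
        alpha *= rho
    return alpha

def precond_cg(f, grad, x0, M_inv, tol=1e-6, max_iter=500):
    """Preconditioned CG descent: directions use the preconditioned gradient
    M_inv @ g and a Fletcher-Reeves-type beta (illustrative choice only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    z = M_inv @ g                 # preconditioned gradient
    d = -z
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo_step(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        z_new = M_inv @ g_new
        beta = (g_new @ z_new) / (g @ z)   # FR-style formula with preconditioning
        d = -z_new + beta * d
        x, g, z = x_new, g_new, z_new
    return x

# Example: minimize a convex quadratic with a diagonal preconditioner.
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
M_inv = np.diag(1.0 / np.diag(A))
x_star = precond_cg(f, grad, np.array([1.0, 1.0, 1.0]), M_inv)
print(x_star)
```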