
Keywords

Preconditioned CG
Unconstrained optimization
Self-scaling VM-update
Inexact line-search

Abstract

In this paper, we study the global convergence properties of a new class of preconditioned conjugate gradient descent algorithms when applied to convex, nonlinear, unconstrained optimization problems. We assume that a new inexact line-search rule, similar to the Armijo line-search rule, is used. The rule provides an estimation formula for choosing a large step-size at each iteration, and the same formula is also used to determine the search direction. A new preconditioned conjugate gradient search direction replaces the conjugate gradient descent direction of the ZIR algorithm. Numerical results on twenty-five well-known test functions of various dimensions show that the new inexact line search and the new preconditioned conjugate gradient search directions are efficient for solving unconstrained nonlinear optimization problems in many situations.
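For context, the sketch below shows a generic preconditioned nonlinear conjugate gradient iteration with a classical Armijo backtracking line search. It is not the paper's specific rule (the proposed step-size estimation formula, the new direction update, and the ZIR algorithm are not reproduced here); the preconditioner `M_inv` and the Fletcher–Reeves-type beta formula are assumptions chosen for illustration.

```python
import numpy as np

def armijo_backtracking(f, grad_fk, xk, dk, alpha0=1.0, c=1e-4, rho=0.5, max_iter=50):
    """Classical Armijo backtracking: shrink alpha until sufficient decrease holds.
    (The paper proposes a modified rule that estimates a larger step-size; this is
    the standard rule it is compared against.)"""
    fk = f(xk)
    slope = grad_fk @ dk              # directional derivative; negative for a descent direction
    alpha = alpha0
    for _ in range(max_iter):
        if f(xk + alpha * dk) <= fk + c * alpha * slope:
            return alpha
        alpha *= rho
    return alpha

def preconditioned_cg_descent(f, grad, x0, M_inv, tol=1e-6, max_iter=1000):
    """Generic preconditioned nonlinear CG with a Fletcher-Reeves-type beta,
    shown only for concreteness; the paper replaces this direction update
    with its own preconditioned search direction."""
    x = x0.copy()
    g = grad(x)
    d = -M_inv @ g                    # preconditioned steepest-descent start
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo_backtracking(f, g, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ (M_inv @ g_new)) / (g @ (M_inv @ g))  # FR-type formula
        d = -M_inv @ g_new + beta * d
        g = g_new
    return x

# Example: minimize a convex quadratic f(x) = 0.5 x'Ax - b'x with a diagonal preconditioner.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
M_inv = np.diag(1.0 / np.diag(A))
x_star = preconditioned_cg_descent(f, grad, np.zeros(3), M_inv)
```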
https://doi.org/10.33899/csmj.2012.163698