Abstract
In this paper, three efficient Scaled Nonlinear Conjugate Gradient (CG) methods for solving unconstrained optimization problems are proposed. These algorithms are implemented with inexact line searches (ILS). The Powell restarting criterion is applied to all of these algorithms and yields a dramatic saving in computational effort. The global convergence of these algorithms is established under the strong Wolfe line search conditions. Numerical results show that the proposed CG algorithms are efficient and robust in comparison with the standard Fletcher-Reeves (FR) and Polak-Ribiere (PR) CG algorithms on 35 nonlinear test functions.
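To make the ingredients named above concrete, the following is a minimal sketch of a standard nonlinear CG iteration combining a Fletcher-Reeves update, a strong Wolfe inexact line search, and the Powell restarting criterion. It is an illustration of the general framework only, not the scaled CG variants proposed in the paper; the function name cg_fr_powell, the tolerance, the fallback step, and the line search parameters c1 and c2 are assumptions made for this example.

```python
# Sketch of nonlinear CG with a strong Wolfe line search and Powell restarts.
# NOTE: illustrative only; this is the classical FR/PR framework, not the
# paper's scaled CG methods.
import numpy as np
from scipy.optimize import line_search


def cg_fr_powell(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Fletcher-Reeves beta, a strong Wolfe
    inexact line search, and the Powell restarting criterion."""
    x = x0.astype(float).copy()
    g = grad(x)
    d = -g                                   # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the strong Wolfe conditions;
        # c1, c2 are assumed values typical for CG methods.
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:                    # line search failed: take a small safeguarded step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves update; the Polak-Ribiere choice would be
        # beta = g_new @ (g_new - g) / (g @ g).
        beta = (g_new @ g_new) / (g @ g)
        # Powell restart: drop conjugacy when successive gradients
        # are far from orthogonal (standard constant 0.2).
        if abs(g_new @ g) >= 0.2 * (g_new @ g_new):
            beta = 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Quick check on the Rosenbrock function, a classical nonlinear test problem.
    from scipy.optimize import rosen, rosen_der
    print(cg_fr_powell(rosen, rosen_der, np.array([-1.2, 1.0])))
```

A comparison such as the one reported in the paper would run this kind of iteration, with the different beta choices or scaled directions, over the full set of test functions and record iteration and function-evaluation counts.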