A Globally Convergent Spectral Conjugate Gradient Method for Solving Unconstrained Optimization Problems

In this paper, a modified spectral conjugate gradient method for solving unconstrained optimization problems is studied, which has a sufficient descent direction and global convergence under an inexact line search. The Fletcher-Reeves restarting criterion was applied to both the standard and the new versions and yielded substantial savings in computational time. The numerical results show that the proposed method is effective in comparison with the FR method.

Let g_k = ∇f(x_k) denote the gradient of f at x_k, and let x_0 be an arbitrary initial approximation to the solution of (1). Then, in a standard FR conjugate gradient algorithm, the search direction is determined by

d_0 = -g_0,   d_{k+1} = -g_{k+1} + β_k^{FR} d_k,   β_k^{FR} = ‖g_{k+1}‖² / ‖g_k‖².

Hence, a sequence of solutions is generated by

x_{k+1} = x_k + α_k d_k,

where α_k is the step length and g_k is the gradient of f evaluated at the current iterate x_k [1][2][3][4]. In [5], a spectral conjugate gradient (MLV) method was proposed. In this paper, we develop a new conjugate gradient (CG) algorithm. The search direction generated by the method at each iteration satisfies the sufficient descent condition. We also establish the global convergence of the proposed algorithm under a Wolfe-type line search.
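To make the standard scheme concrete, the FR iteration above can be sketched in Python. The quadratic test function, the tolerance, and the simple Armijo backtracking used here are illustrative choices of ours, not taken from the paper:

```python
import numpy as np

def fr_conjugate_gradient(f, grad, x0, tol=1e-8, max_iter=1000):
    """Standard Fletcher-Reeves CG: d_{k+1} = -g_{k+1} + beta_k^FR d_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple backtracking (Armijo) line search, standing in for the
        # inexact line searches discussed in the text.
        alpha, c1, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # beta_k^FR = ||g_{k+1}||^2 / ||g_k||^2
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = x'Ax/2 - b'x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = fr_conjugate_gradient(f, grad, np.zeros(2))
```

On a convex quadratic the returned point should satisfy the optimality system A x = b up to the gradient tolerance.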

A New Conjugate Gradient Algorithm
If an exact line search is used, the new method is identical to the MLV method. The new conjugate gradient parameter is defined by (7) and involves a parameter. We call the method given by (1) and (7) with β_k = β_k^{BMLV} the BMLV method. We now present the concrete algorithm as follows:

The algorithm has the following steps:

Step 0: Given parameters … .
Step 1: Compute … ; if the stopping test is satisfied, stop; else continue.
Step 2: Set … .
Step 3: Set … .
Step 4: Set x_{k+1} = x_k + α_k d_k (use the strong Wolfe line search technique to compute the parameter α_k).
Step 5: Compute … .
Step 6: If k = n, go to Step 2 with the new values of x_{k+1} and g_{k+1}; if not, continue.
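Step 4 relies on a strong Wolfe line search. The following is a minimal sketch of such a search; the constants c1, c2 and the bracket-and-bisect refinement are our own illustrative choices, not the paper's actual parameter settings:

```python
import numpy as np

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=50):
    """Return a step alpha satisfying the strong Wolfe conditions:
       f(x + a d) <= f(x) + c1 * a * g.d     (sufficient decrease)
       |g(x + a d).d| <= c2 * |g(x).d|       (curvature)
    via a simple bracket-and-bisect scheme."""
    phi0 = f(x)
    dphi0 = grad(x).dot(d)      # must be negative: d is a descent direction
    lo, hi = 0.0, np.inf
    alpha = 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) > phi0 + c1 * alpha * dphi0:
            hi = alpha          # sufficient decrease fails: step too long
        else:
            dphi = grad(x + alpha * d).dot(d)
            if abs(dphi) <= c2 * abs(dphi0):
                return alpha    # both strong Wolfe conditions hold
            if dphi < 0:
                lo = alpha      # still descending: step too short
            else:
                hi = alpha      # slope already positive: step too long
        alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
    return alpha                # best effort if no exact point was found

# Demo on a convex quadratic with descent direction d = -g(x0).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x0 = np.zeros(2)
d = -grad(x0)
alpha = strong_wolfe(f, grad, x0, d)
```

The accepted step then satisfies both the sufficient-decrease and the curvature condition, which is what the convergence analysis of Section 3 requires of α_k.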

Global Convergence
In this section, we study the global convergence of Algorithm (2.1). To this end, we first verify that Algorithm (2.1) is well defined. For the proof of global convergence, the following assumptions are needed.
ii- In some neighborhood U of the level set, f is continuously differentiable and its gradient is Lipschitz continuous; namely, there exists a constant L > 0 such that ‖g(x) − g(y)‖ ≤ L‖x − y‖ for all x, y ∈ U.

Proof.
Firstly, for

Proof :
If our conclusion does not hold, then there exists a real number ε_1 > 0 such that ‖g_k‖ ≥ ε_1 for all k, which contradicts Theorem (3.2). The proof is complete.

Numerical Results
In this section, we report some numerical results obtained with an implementation of the new algorithm on a set of unconstrained optimization test problems. We have selected (9) test problems. The programs were written in Fortran 90. The test functions are commonly used unconstrained test problems with standard starting points, and a summary of the results for these test functions is given in Table (3.1). For comparison of the algorithms, we tabulate the number of function evaluations (NOF) and the number of iterations (NOI).

Conclusions and Discussions
In this paper, we have proposed a modified spectral CG method for solving unconstrained minimization problems. The computational experiments show that the new approach given in this paper is successful.