Two Modified QN-Algorithms for Solving Unconstrained Optimization Problems

This paper presents two modified Quasi-Newton algorithms designed for solving nonlinear unconstrained optimization problems. The algorithms are based on two different techniques, namely Quasi-Newton conditions on quadratic and non-quadratic objective functions. Experimental results indicate that the new proposed algorithms are more efficient than the Yuan and Biggs algorithms.


Introduction.
Variable Metric or, more precisely, Quasi-Newton (QN) algorithms form a class of numerical methods for solving the unconstrained optimization problem

$$\min_{x\in\mathbb{R}^n} f(x), \qquad (1)$$

where f is a smooth function of n variables [10]. We recall that these methods are iterative. Starting from an initial point \(x_1 \in \mathbb{R}^n\), they generate a sequence of iterates

$$x_{k+1} = x_k + \alpha_k d_k, \qquad k = 1, 2, \ldots, \qquad (2)$$

where \(d_k\) is the search direction and the step length \(\alpha_k > 0\) is obtained from a line search satisfying

$$f(x_k + \alpha_k d_k) \le f(x_k) + c_1\,\alpha_k\, g_k^T d_k, \qquad (3)$$
$$g(x_k + \alpha_k d_k)^T d_k \ge c_2\, g_k^T d_k, \qquad (4)$$

with \(0 < c_1 < c_2 < 1\); here \(g_k\) denotes the gradient of f evaluated at the current iterate \(x_k\) [9]. The search direction is calculated by

$$d_k = -B_k^{-1} g_k, \qquad (5)$$

where \(B_k\) is a symmetric positive definite matrix satisfying the QN-equation

$$B_k\,\delta_{k-1} = \gamma_{k-1}, \qquad \delta_{k-1} = x_k - x_{k-1}, \quad \gamma_{k-1} = g_k - g_{k-1}, \qquad (6)$$

see [9,10]. The search direction \(d_k\) in (5) is the solution of the quadratic subproblem

$$\min_{d\in\mathbb{R}^n} \varphi_k(d) = f(x_k) + g_k^T d + \tfrac{1}{2}\, d^T B_k d, \qquad (7)$$

which is an approximation to problem (1) near the current iterate \(x_k\). In fact, the definition of \(\varphi_k(\cdot)\) together with condition (6) is equivalent to requiring that the model reproduces the gradient at the previous iterate, \(\nabla\varphi_k(x_{k-1} - x_k) = g_{k-1}\). Davidon [4] introduced 'conic models', in which the quadratic model (7) is replaced by a non-quadratic one; more details can be found in [10]. In Section 2, modified versions of Biggs's [1,2] update and Yuan's [10] update, based on the simple idea of approximating the objective function by different techniques, are introduced. Finally, in Section 3 numerical results with a brief discussion are presented.
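As a small illustration of (5) and (7), the following Python/NumPy sketch (not part of the original paper; the matrix B and gradient g are arbitrary example data) computes the direction \(d_k = -B_k^{-1}g_k\) and checks that it is the stationary point of the quadratic model \(\varphi_k\):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = A @ A.T + 4.0 * np.eye(4)        # a symmetric positive definite B_k
g = rng.standard_normal(4)           # gradient g_k at the current iterate

d = np.linalg.solve(B, -g)           # search direction (5): B_k d_k = -g_k

# The gradient of the quadratic model (7) is g_k + B_k d; it vanishes at d_k,
# so d_k is the unique minimizer of the model.
model_grad = g + B @ d
print(np.allclose(model_grad, 0.0))  # True
```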

Two Modified QN-Methods.
The BFGS algorithm for the unconstrained optimization problem (1) uses the search direction (5), and the matrices \(B_k\) are updated by the BFGS formula

$$B_{k+1} = B_k - \frac{B_k\delta_k\delta_k^T B_k}{\delta_k^T B_k\delta_k} + \frac{\gamma_k\gamma_k^T}{\delta_k^T\gamma_k}, \qquad (8)$$

where \(\delta_k = x_{k+1} - x_k\) and \(\gamma_k = g_{k+1} - g_k\), which satisfies the QN-equation (6).
The BFGS method is one of the most efficient methods for solving the unconstrained optimization problem (1). More details can be found in Fletcher [5].
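A direct transcription of the update (8) into code might read as follows (a minimal NumPy sketch, not taken from the paper; the helper name bfgs_update is hypothetical); the final assertion checks the QN-equation \(B_{k+1}\delta_k = \gamma_k\):

```python
import numpy as np

def bfgs_update(B, delta, gamma):
    """BFGS formula (8): rank-two correction of B_k built from
    delta_k = x_{k+1} - x_k and gamma_k = g_{k+1} - g_k."""
    Bd = B @ delta
    return (B
            - np.outer(Bd, Bd) / (delta @ Bd)
            + np.outer(gamma, gamma) / (delta @ gamma))

# The updated matrix satisfies the QN-equation (6): B_{k+1} delta_k = gamma_k.
B = np.eye(3)
delta = np.array([1.0, 0.5, -0.2])
gamma = np.array([0.8, 0.9, 0.1])
assert np.allclose(bfgs_update(B, delta, gamma) @ delta, gamma)
```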
The approximate function \(\varphi_k(d)\) in (7) satisfies the interpolation conditions

$$\varphi_k(0) = f(x_k), \qquad \nabla\varphi_k(0) = g_k, \qquad (9)$$

and, by the QN-equation (6),

$$\nabla\varphi_k(x_{k-1} - x_k) = g_{k-1}. \qquad (10)$$

In [7] and [10], the approximate function \(\varphi_k(d)\) is required to satisfy the interpolation condition

$$\varphi_k(x_{k-1} - x_k) = f(x_{k-1}) \qquad (11)$$

instead of (10). This change was inspired by the fact that, for one-dimensional problems, using (11) gives slightly faster local convergence if we assume \(\alpha_k = 1\) for all k.
Equation (11) can be rewritten as

$$f(x_k) - g_k^T\delta_{k-1} + \tfrac{1}{2}\,\delta_{k-1}^T B_k\,\delta_{k-1} = f(x_{k-1}), \qquad (12)$$

that is,

$$\delta_{k-1}^T B_k\,\delta_{k-1} = 2\left[f(x_{k-1}) - f(x_k) + g_k^T\delta_{k-1}\right]. \qquad (13)$$

Shifting the index forward, the matrix \(B_{k+1}\) is therefore required to satisfy

$$\delta_k^T B_{k+1}\,\delta_k = 2\left[f(x_k) - f(x_{k+1}) + g_{k+1}^T\delta_k\right]. \qquad (14)$$

In order to satisfy (14), the QN-equation (6) is replaced by the modified equation

$$B_{k+1}\delta_k = t_k\gamma_k, \qquad (15)$$

where

$$t_k = \frac{2\left[f(x_k) - f(x_{k+1}) + g_{k+1}^T\delta_k\right]}{\delta_k^T\gamma_k}, \qquad (16)$$

and the BFGS formula (8) is modified accordingly to

$$B_{k+1} = B_k - \frac{B_k\delta_k\delta_k^T B_k}{\delta_k^T B_k\delta_k} + t_k\,\frac{\gamma_k\gamma_k^T}{\delta_k^T\gamma_k}, \qquad (17)$$

which satisfies (15) and hence, with \(t_k\) given by (16), condition (14). In [6] and [7] it is noted that, if the objective function f is cubic along the line segment between \(x_k\) and \(x_{k+1}\), then, writing

$$\psi(\tau) = f(x_k + \tau\delta_k), \qquad (18)$$

one has

$$\psi''(1) = 6\left[\psi(0) - \psi(1)\right] + 2\psi'(0) + 4\psi'(1), \qquad (19)$$

so that, instead of (14), it is natural to require

$$\delta_k^T B_{k+1}\,\delta_k = 6\left[f(x_k) - f(x_{k+1})\right] + 2\, g_k^T\delta_k + 4\, g_{k+1}^T\delta_k. \qquad (20)$$

Biggs [1,2] gives the update (17) with the value of \(t_k\) chosen so that (20) holds; the required value is

$$t_k = \frac{6\left[f(x_k) - f(x_{k+1}) + g_{k+1}^T\delta_k\right]}{\delta_k^T\gamma_k} - 2. \qquad (21)$$

Thus another modified parameter can be obtained from (21) by considering the relation (22).
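To make the two parameter choices concrete, the following NumPy sketch (not part of the original paper; the helper names are hypothetical) evaluates Yuan's value (16) and Biggs' value (21) and applies the modified update (17):

```python
import numpy as np

def t_yuan(f_k, f_k1, g_k1, delta, gamma):
    """Yuan's parameter (16): 2(f_k - f_{k+1} + g_{k+1}^T delta_k) / (delta_k^T gamma_k)."""
    return 2.0 * (f_k - f_k1 + g_k1 @ delta) / (delta @ gamma)

def t_biggs(f_k, f_k1, g_k1, delta, gamma):
    """Biggs' parameter (21): 6(f_k - f_{k+1} + g_{k+1}^T delta_k) / (delta_k^T gamma_k) - 2."""
    return 6.0 * (f_k - f_k1 + g_k1 @ delta) / (delta @ gamma) - 2.0

def modified_bfgs_update(B, delta, gamma, t):
    """Modified update (17): the gamma-term of the BFGS formula (8) is scaled by t_k,
    so that B_{k+1} delta_k = t_k gamma_k, i.e. the modified QN-equation (15)."""
    Bd = B @ delta
    return (B
            - np.outer(Bd, Bd) / (delta @ Bd)
            + t * np.outer(gamma, gamma) / (delta @ gamma))
```

With \(t_k\) taken from (16), the product \(\delta_k^T B_{k+1}\delta_k\) equals \(t_k\,\delta_k^T\gamma_k = 2[f(x_k) - f(x_{k+1}) + g_{k+1}^T\delta_k]\), which is exactly condition (14); for a quadratic f both (16) and (21) reduce to 1, so (17) reduces to the standard BFGS formula (8).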

Two Modified QN-Algorithms
The outline of the modified QN-algorithm is as follows:
Step 0 : Choose an initial point \(x_1 \in \mathbb{R}^n\) and an initial symmetric positive definite matrix \(B_1\) (for example \(B_1 = I\)); set \(k = 1\).
Step 1 : If \(\|g_k\| \le \varepsilon\), stop; otherwise compute the search direction \(d_k\) from (5).
Step 2 : Find a step length \(\alpha_k\) satisfying the line search conditions (3) and (4), and set \(x_{k+1} = x_k + \alpha_k d_k\).
Step 3 : Compute the parameter \(t_k\) (by (16), (21), or one of the modified parameters of Section 2) and update \(B_{k+1}\) by formula (17).
Step 4 : Set \(k := k + 1\) and go to Step 1.
Since \(t_k \to 1\) as the iterates approach the solution, the modified update (17) approaches the standard BFGS update (8); thus it is reasonable to hope that the local superlinear convergence of the BFGS algorithm can be extended to the modified algorithm in which updating formula (17) is used. Details of the local analysis of the BFGS algorithm can be found in Dennis and Moré [3].
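One possible realization of this outline is sketched below (a simplified reading, with hypothetical names; the backtracking loop enforces only a sufficient-decrease test rather than the full conditions (3) and (4), and the clamping of \(t_k\) is an added safeguard, not taken from the paper):

```python
import numpy as np

def modified_qn(f, grad, x0, t_rule, eps=1e-6, max_iter=500):
    """Modified QN algorithm (Steps 0-4) with the parametrized update (17).
    t_rule(f_k, f_k1, g_k1, delta, gamma) returns t_k, e.g. Yuan's (16) or Biggs' (21)."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                               # Step 0: initial point and B_1 = I
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:                 # Step 1: stopping test
            break
        d = np.linalg.solve(B, -g)                   #         search direction (5)
        fx, slope, alpha = f(x), g @ d, 1.0          # Step 2: backtracking step length
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-10:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        delta, gamma = x_new - x, g_new - g
        if delta @ gamma > 0:                        # Step 3: update B_k by (17)
            t = t_rule(fx, f(x_new), g_new, delta, gamma)
            t = min(max(t, 0.01), 100.0)             # safeguard (an assumption) to keep B positive definite
            Bd = B @ delta
            B = (B - np.outer(Bd, Bd) / (delta @ Bd)
                   + t * np.outer(gamma, gamma) / (delta @ gamma))
        x, g = x_new, g_new                          # Step 4: next iteration
    return x

# Example: minimize the extended Rosenbrock function with Yuan's parameter (16).
def rosen(x):
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def rosen_grad(x):
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

t_yuan = lambda f_k, f_k1, g_k1, delta, gamma: 2.0 * (f_k - f_k1 + g_k1 @ delta) / (delta @ gamma)
x_star = modified_qn(rosen, rosen_grad, np.full(10, -1.2), t_yuan)
```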

Numerical Results.
In this paper, we have proposed two versions of a modified VM-method for solving nonlinear unconstrained minimization problems. Eight large-scale unconstrained optimization problems in extended or generalized form were selected; for each test function, numerical experiments were carried out with the number of variables n = 100, 500 and 1000. The programs were written in Fortran 90. The same line search was employed in each algorithm, namely a cubic interpolation technique satisfying conditions (3) and (4). Performance is measured by the number of iterations (NOI) and the number of function evaluations (NOF). The computational experiments show that the modified approaches given in this paper are successful, and the two modified algorithms perform better than the Yuan and Biggs methods. Representative rows of the results table and the totals over all problems are:

Problem     n      NOI     NOF     NOI     NOF
          500        6      21       6      21
         1000        6      21       6      21
   8      100       90     181      90     181
   8      500      117     235     181     237
   8     1000      131     263     130     261
 Total            5106   13710    5097   13467