A New Preconditioned Inexact Line-Search Technique for Unconstrained Optimization

In this paper, we study the global convergence properties of a new class of preconditioned conjugate gradient descent algorithms applied to convex nonlinear unconstrained optimization problems. We assume that a new inexact line-search rule, similar to the Armijo line-search rule, is used. The rule provides an estimation formula for choosing a large step-size at each iteration, and the same formula is used to construct the search direction. A new preconditioned conjugate gradient search direction replaces the conjugate gradient descent direction of the Zir algorithm. Numerical results on twenty-five well-known test functions of various dimensions show that the new inexact line search and the new preconditioned conjugate gradient search directions are efficient for solving unconstrained nonlinear optimization problems in many situations.


Introduction
Some important global convergence results for various methods using line-search procedures have been given in [1], [4]; the line-search methods mentioned there are monotone descent methods for unconstrained optimization [10], [11]. Nonmonotone line searches have also been investigated by many authors; see [6], [9]. The Barzilai-Borwein method [2], [8] is a nonmonotone descent method that is efficient for solving some special problems. Zirilli [12] extended the Armijo line-search rule and analyzed the global convergence of the corresponding method.
In this paper, we extend the Armijo line-search rule to design a new inexact line-search technique, and we choose the search directions from the Al-Bayati self-scaling variable metric update [3], which is based on a two-parameter family of rank-two updating formulae. The new algorithm enables us to choose a large step-size at each iteration and reduces the number of function evaluations. Numerical results show that the new algorithm is efficient for solving unconstrained optimization problems.
We consider the following unconstrained optimization problem of n variables:

    min f(x),  x ∈ R^n,                                                 (1)

where f is twice continuously differentiable and its gradient g is available. We consider iterations of the form

    x_{k+1} = x_k + α_k d_k,                                            (2)

where d_k is a search direction and α_k is the step-length obtained by means of a one-dimensional search. Beyond the conjugate gradient method, whose directions are conjugate when the function is quadratic and the line search is exact, another broad class of methods may be defined by the search direction

    d_k = -H_k g_k,                                                     (3)

where H_k is a nonsingular symmetric matrix. Important special cases are given by Variable Metric (VM) methods, which are also of the form (3); in this case H_k is not only a function of x_k but depends also on the previous iterates. For the conjugate gradient method, obtaining a descent direction is not easy and requires a careful choice of the properties of the line search. The quality of a method can be studied by measuring the goodness of the search direction and by considering the length of the step. The goodness of the direction is measured by the descent condition

    g_k^T d_k < 0,                                                      (4)

and, more precisely, by the angle between the steepest descent direction -g_k and the search direction d_k, for which we can define the angle property

    -g_k^T d_k ≥ τ ||g_k|| ||d_k||,   τ > 0.                            (5)

The length of the step is determined by the line-search iteration. A strategy that will play a central role in this paper is to set scalars s_k > 0, β ∈ (0, 1), L > 0 and 0 < σ < 1, and to take the largest step-size of the form α = β^m s_k, m = 0, 1, 2, …, with

    f(x_k + α d_k) ≤ f(x_k) + σ α g_k^T d_k.                            (6)

This inequality ensures that the function is reduced sufficiently; we will call this relation the Armijo condition.
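The Armijo strategy above can be illustrated with a short backtracking routine: try α = s, βs, β²s, … and accept the first trial satisfying condition (6). This is a minimal sketch with illustrative parameter values (the function name and defaults are ours, not the paper's):

```python
import numpy as np

def armijo_backtracking(f, g, x, d, s=1.0, beta=0.5, sigma=1e-4, max_halvings=50):
    """Accept the first alpha in {s, beta*s, beta^2*s, ...} with
    f(x + alpha*d) <= f(x) + sigma*alpha*g(x)^T d.
    Parameter values are illustrative choices, not the paper's."""
    fx, slope = f(x), g(x) @ d   # slope = g^T d, negative for a descent direction
    alpha = s
    for _ in range(max_halvings):
        if f(x + alpha * d) <= fx + sigma * alpha * slope:
            return alpha
        alpha *= beta            # shrink the trial step
    return alpha

# Quadratic example: f(x) = 0.5*||x||^2 with steepest-descent direction d = -g(x)
f = lambda x: 0.5 * float(x @ x)
g = lambda x: x
x0 = np.array([3.0, -4.0])
alpha = armijo_backtracking(f, g, x0, -g(x0))
```

For this quadratic the full step α = 1 already satisfies the sufficient-decrease test, so no backtracking occurs.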
We make the following assumptions:
(H1) f has a lower bound on the level set L_0 = {x ∈ R^n : f(x) ≤ f(x_0)}, where x_0 is given.
(H2) g is Lipschitz continuous in an open convex set B that contains L_0; i.e., there exists L > 0 such that

    ||g(x) - g(y)|| ≤ L ||x - y||   for all x, y ∈ B.

The modified Armijo line-search rule, as in [1], takes α_k = β^{m_k} s_k with the smallest nonnegative integer m_k such that

    f(x_k + α_k d_k) ≤ f(x_k) + σ α_k [ g_k^T d_k - (1/2) α_k L ||d_k||^2 ],

where s_k = -g_k^T d_k / (L ||d_k||^2).

Outlines of the Zir Algorithm:
The implementable inexact line-search algorithm is stated as follows [12]:
Step 1: Given the parameters of the line-search rule, where ε > 0 is a small tolerance.
Step 2: If ||g_k|| ≤ ε then stop. Else go to Step 3.
Step 3: Choose d_k to satisfy the angle property (5).
Step 4: Compute the step-size α_k by the modified Armijo line-search rule.
Step 5: Set x_{k+1} = x_k + α_k d_k.
Step 6: Set k = k + 1 and go to Step 2.
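The steps above can be sketched as a generic descent loop. Since the Zir step-size formula is not reproduced here, plain Armijo backtracking and a steepest-descent direction stand in for Steps 3-4 (both are hypothetical simplifications; only the loop structure follows the outline):

```python
import numpy as np

def descent_loop(f, g, x0, eps=1e-6, max_iter=500):
    """Generic loop for Steps 1-6: stop when ||g_k|| <= eps (Step 2),
    take a descent direction (Step 3) and an Armijo-accepted step
    (Steps 4-5). Steepest descent is a stand-in for the paper's
    direction and step-size formulas."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        gx = g(x)
        if np.linalg.norm(gx) <= eps:          # Step 2: stopping test
            return x, k
        d = -gx                                # Step 3: angle property holds trivially
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * (gx @ d):
            alpha *= 0.5                       # Step 4: backtrack until Armijo holds
        x = x + alpha * d                      # Step 5: update the iterate
    return x, max_iter                         # Step 6 is the loop itself

xstar, noi = descent_loop(lambda x: float(x @ x), lambda x: 2 * x, np.array([1.0, 2.0]))
```

On the quadratic ||x||^2 the loop reaches the minimizer after a single halved step.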

Some Properties of the Zir Algorithm:
Theorem 2.2.1: Assume that (H1) and (H2) hold, that the search direction d_k satisfies (4), and that α_k is determined by the modified Armijo line-search rule, where the index m_k chosen by the rule is a positive integer. Then the Zir algorithm is globally convergent; for the details of the proof see [12].
In fact, Assumption (H2) can be replaced by the following weaker assumption.
(H2') g is uniformly continuous on an open convex set B that contains L_0; see [9].

A New Proposed Preconditioned Inexact Line-Search Algorithm (New):
In this section we propose a new algorithm which implements the step-size k  with inexact line search rule. This formula is implemented with AL-Bayati self-scaling [3] variable metric update.

Outlines of the New Algorithm:
The outlines of the new proposed algorithm are stated as follows:
Step 1: Given the parameters of the new line-search rule, where ε > 0 is a small tolerance.
Step 2: If ||g_k|| ≤ ε then stop. Else go to Step 3.
Step 3: Choose d_k to satisfy the angle property (5) and the new search-direction formula.
Step 4: Compute the step-size α_k by the new inexact line-search rule.
Step 5: Set x_{k+1} = x_k + α_k d_k.
Step 6: Update H_k by the Al-Bayati self-scaling VM formula [3].
Step 7: If the available storage is exceeded, then employ a restart option, either with k = n or when the orthogonality condition g_{k+1}^T g_k ≈ 0 is not satisfied; see [7].
Step 8: Set k = k + 1 and go to Step 2.
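The structure of Steps 1-8 can be sketched as follows. A standard BFGS inverse update stands in for the Al-Bayati self-scaling formula of Step 6, and a Powell-style orthogonality test with an assumed 0.2 threshold stands in for the restart check of Step 7, so this is an illustrative skeleton rather than the paper's exact method:

```python
import numpy as np

def preconditioned_descent(f, g, x0, eps=1e-6, max_iter=500):
    """Skeleton of Steps 1-8: d_k = -H_k g_k, Armijo backtracking step,
    quasi-Newton update of H_k, periodic restart. BFGS replaces the
    Al-Bayati self-scaling update (Step 6); a Powell-style test with an
    assumed 0.2 threshold replaces Step 7's restart check."""
    n = x0.size
    x, H = np.asarray(x0, dtype=float), np.eye(n)
    gx = g(x)
    for k in range(max_iter):
        if np.linalg.norm(gx) <= eps:                   # Step 2
            return x, k
        d = -H @ gx                                     # Step 3: preconditioned direction
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * (gx @ d):
            alpha *= 0.5                                # Steps 4-5: Armijo step
        x_new = x + alpha * d
        g_new = g(x_new)
        s, y = x_new - x, g_new - gx
        if abs(g_new @ gx) >= 0.2 * (g_new @ g_new) or (k + 1) % n == 0:
            H = np.eye(n)                               # Step 7: restart (orthogonality lost or k = n)
        elif s @ y > 1e-12:
            rho = 1.0 / (s @ y)                         # Step 6: BFGS stand-in for the VM update
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, gx = x_new, g_new                            # Step 8
    return x, max_iter

# Ill-conditioned quadratic test: f(x) = x1^2 + 10*x2^2
x_opt, noi = preconditioned_descent(
    lambda x: x[0] ** 2 + 10 * x[1] ** 2,
    lambda x: np.array([2 * x[0], 20 * x[1]]),
    np.array([1.0, 1.0]))
```

Keeping H_k positive definite (here via the s^T y > 0 guard and identity restarts) ensures d_k = -H_k g_k is always a descent direction, so the backtracking loop terminates.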

Some Theoretical Properties of the New Algorithm:
We analyze the global convergence of the proposed new inexact line-search algorithm. For the proof of convergence we adopt the assumptions (H1) and (H2') on the function f, which are commonly used, and we suppose that {H_k} is a sequence of positive definite matrices; assume also that the parameters are chosen so that H_k remains positive definite, as in the Al-Bayati VM update [3]. We analyze the conjugate gradient algorithm that uses the modified line-search formula with the scalars set as above.

Proof:
By the mean value theorem the required inequality follows; this finishes our proof. #

Numerical Results:
In this section, we compare the numerical behavior of the new algorithm with that of the Zir algorithm for different dimensions of the test functions. Comparative tests were performed with twenty-five well-known test functions (specified in Appendices 1 and 2); see [5]. All the results were obtained with newly programmed FORTRAN routines in double precision. We solve each of these test functions by:
1- The Zirilli algorithm (Zir).
2- The new algorithm (New).
For each algorithm we used the stopping criterion ||g_k|| ≤ ε. All the numerical results are summarized in Table (1), Table (2) and Table (3). Tables (1) and (2) present the number of iterations (NOI) versus the number of function evaluations (NOF) needed to reach the stopping condition, while Table (3) gives the percentage performance of the new algorithm, based on both NOI and NOF, against the original Zir algorithm. The important point is that the new algorithm solves every particular problem, as measured by NOI and NOF respectively, while the other algorithm may fail in some cases. Moreover, the new proposed algorithm always performs more stably and efficiently.
Namely, there are improvements of about (50-52)% in NOI for all dimensions, and improvements of (63-78)% in NOF for all test functions.
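The NOI/NOF bookkeeping behind Tables (1)-(3) can be reproduced with a small counting wrapper and a percentage-improvement helper; the names and structure here are illustrative and are not taken from the paper's FORTRAN code:

```python
class CountingFunction:
    """Wrap an objective so the number of function evaluations (NOF)
    performed by any solver can be read off afterwards."""
    def __init__(self, f):
        self.f, self.nof = f, 0
    def __call__(self, x):
        self.nof += 1
        return self.f(x)

def improvement(old, new):
    """Percentage improvement of the new count over the old one,
    in the sense of the comparisons reported in Table (3)."""
    return 100.0 * (old - new) / old

# Illustrative use: wrap a test function, run any solver on cf, read cf.nof.
cf = CountingFunction(lambda t: t * t)
cf(3.0), cf(4.0)          # two evaluations
```

For example, a drop from 100 to 37 function evaluations corresponds to a 63% improvement, matching the lower end of the NOF range quoted above.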

Conclusions:
In this paper, a new PCG algorithm with a self-scaling VM update and a new search-direction formula is proposed. A modified formula of an inexact line search is implemented to solve large-scale unconstrained optimization test functions. Our numerical results support our claims and indicate that the new algorithm sufficiently decreases the function values and the number of iterations, and that the extra line-search conditions need to be satisfied only near a stationary point of the proposed line-search procedure.