An Inexact CG-Algorithm for Unconstrained Optimization Problems

An algorithm for unconstrained minimization is proposed which is invariant to a nonlinear scaling of a strictly convex quadratic function and which generates mutually conjugate directions for extended quadratic functions. It is derived for inexact line searches and is designed for general use. It compares favorably in numerical tests, over eight test functions with dimensions ranging from 2 to 100, with the H/S, DX, F/R, P/R, and A/B algorithms on which this new algorithm is based.

Keywords: CG algorithm, inexact CG method.

N.H. AL-Assady, Maysoon M. Aziz and B. A. Metras
College of Computer Sciences and Mathematics, University of Mosul
Received: 13/05/2002; Accepted: 01/09/2002


Introduction:
Research on the conjugate gradient (CG) method has moved toward inexact line searches; that is, the iteration along a search direction does not continue until the line minimum is found, but only until it is located to some predetermined (small) tolerance, in order to reduce the number of function evaluations (NOF).
Several well-known methods that improve the local rate of convergence and the efficiency of the traditional CG method are discussed later, namely those of Dixon (1975) and Nazareth (1977).
Algorithms of this type have the quadratic termination property, obtained by using an error vector, even if inexact searches are used; Sloboda (1982), in contrast, presented an algorithm which retains the quadratic termination property without using an error vector.
In this paper we develop a new general form of the CG algorithm with inexact line searches. The new algorithm is similar to that derived by Dixon (1975) for the CG method with inexact line searches, but it does not require the correction term as extra vector storage, and a different formula is obtained.
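As background for the methods discussed below, the following is a minimal sketch of a CG iteration with an inexact (backtracking Armijo) line search on a quadratic, counting NOI and NOF as in the numerical results later in the paper. The function name, the Armijo constant, and the steepest-descent restart safeguard are illustrative assumptions, not part of the paper's algorithm.

```python
import numpy as np

def cg_inexact(G, b, x, tol=1e-5, max_iter=200):
    """Hestenes-Stiefel CG on f(x) = 0.5 x^T G x - b^T x with a
    backtracking (inexact) Armijo line search; returns x, NOI, NOF."""
    f = lambda z: 0.5 * z @ G @ z - b @ z
    g = G @ x - b
    d = -g
    noi = nof = 0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                 # safeguard: restart with steepest descent
            d = -g
        alpha, fx = 1.0, f(x)
        nof += 1
        # Backtrack until sufficient decrease (inexact: no line minimum sought).
        while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d):
            nof += 1
            alpha *= 0.5
        nof += 1                       # the accepted trial evaluation
        x = x + alpha * d
        g_new = G @ x - b
        y = g_new - g
        beta = (g_new @ y) / (d @ y)   # Hestenes-Stiefel formula
        d = -g_new + beta * d
        g = g_new
        noi += 1
    return x, noi, nof
```

Because the step length only satisfies a sufficient-decrease condition, the conjugacy of the directions degrades; recovering it under such inexact steps is exactly what the methods reviewed next address.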

General Background:
In this section we present a brief description of the Dixon, Sloboda, and Nazareth-Nocedal algorithms, and then discuss the properties of these algorithms.

The Dixon's Method (1975):
In Dixon's method the idea is to determine directions parallel to the CG directions (for a quadratic) without exact line searches, and in this way a conjugate set is developed. Explicitly, the search directions d_k, k = 1, 2, …, are given by

d_1 = -g_1,    d_{k+1} = -ĝ_{k+1} + β_k d_k,

where the ĝ_k are estimated gradient values defined below. The new iterate is defined as

x_{k+1} = x_k + α_k d_k,

where α_k is chosen simply to satisfy the conditions imposed in the line-search subprogram. The vector ĝ_k is an estimate of the gradient at the point x_k that would have been reached if exact line searches had been performed along the d_k; for a quadratic, these gradients can be evaluated exactly from g_{k+1} and y_k = g_{k+1} - g_k. After n steps an error vector is used to find the minimum of the quadratic.
Dixon's method moves along directions parallel to the CG directions, and so retains the quadratic termination property (Dixon, 1975).

The Sloboda Method (1982):
Sloboda developed an algorithm which generates conjugate directions with imperfect searches and has the quadratic termination property without using an error vector. For a general function the algorithm is as follows.

Outline of the SLO algorithm:
Steps (1)-(3): …
Step (4): If k = n+1, set k = 0 and go to step (1); else compute the new direction, set k = k+1, and go to step (3).

The Multi-Step Method, Nazareth and Nocedal (1978):
Nazareth and Nocedal showed that, with inexact line searches, a natural extension of the conjugate gradient method can be obtained; the resulting algorithm is called NAZ-NOC.
This algorithm can be viewed as a modification of the Gram-Schmidt orthogonalization process: Nazareth and Nocedal showed that not all of the coefficients of the Gram-Schmidt process must be computed at every iteration, and suggested the following algorithm:
Step (1): Set x_1, d_1 = -g_1, e_1 = 0.
Step (2): …
Step (3): Check for convergence: if ||g_{k+1}|| ≤ ε then stop, else go to step (4).
Step (4): If k < n, set k = k+1, compute the new direction, and go to step (2); else set k = 1 and go to step (1).
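The Gram-Schmidt view mentioned above can be illustrated by conjugating a set of vectors against a symmetric positive definite matrix G, i.e. Gram-Schmidt with the G-inner product. This sketch (the function name and the choice of input vectors are assumptions for illustration, not the NAZ-NOC code) produces directions that are mutually G-conjugate:

```python
import numpy as np

def gram_schmidt_conjugate(vectors, G):
    """G-conjugate a list of vectors: each output d_i satisfies
    d_i^T G d_j = 0 for i != j (Gram-Schmidt in the G-inner product)."""
    ds = []
    for v in vectors:
        d = v.copy()
        for dj in ds:
            # Subtract the G-projection of v onto each earlier direction.
            d -= (v @ G @ dj) / (dj @ G @ dj) * dj
        ds.append(d)
    return ds
```

The NAZ-NOC observation is precisely that, for CG-type iterations, most of these projection coefficients vanish and need not be computed at every step.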

The algorithm retains the quadratic termination property by using the error term e_{k+1} in the extra step, even if it is implemented without exact line searches.

The New CG -Method with Inexact Line Searches:
In this section a new general form of the conjugate-gradient-type methods is presented. The new approach has the quadratic termination property even if the line search is not exact, and the extra correction term is not essential. Now let g_k be the gradient of the quadratic function and define

g*_{k+1} = g_k - (g_k^T d_k / y_k^T d_k) y_k,    (1)

where y_k = g_{k+1} - g_k. Then the following lemma holds.

Lemma(1):
For a quadratic function, the term g*_{k+1} defined in eq.(1) is equal to the gradient g_{k+1} obtained by the Hestenes and Stiefel method (1952).

Proof:
Define g_k = Gx_k - b and d_0 = -g_0, where g_k is the gradient of the quadratic function and G is a symmetric positive definite matrix. The biorthogonalization process of Hestenes and Stiefel, for exact line searches, is

g_{k+1} = g_k + α_k G d_k,    α_k = -g_k^T d_k / d_k^T G d_k.    (2)

In order to prove that the g_{k+1} obtained by the Hestenes and Stiefel exact-line-search algorithm is identical to the term g*_{k+1} defined in eq.(1), we proceed as follows. Rewriting eq.(2), we get

g_{k+1} = g_k - (g_k^T d_k / d_k^T G d_k) G d_k.    (3)

Multiplying and dividing the second term of eq.(3) by α_k, it becomes

g_{k+1} = g_k - (g_k^T d_k / α_k d_k^T G d_k) α_k G d_k.

From the definition we have y_k = g_{k+1} - g_k = α_k G d_k; replacing α_k G d_k by y_k in the above equation, we get

g_{k+1} = g_k - (g_k^T d_k / y_k^T d_k) y_k,

which is identical to the term defined in eq.(1). Thus the set of vectors g_0, g_1, …, g_n is orthogonal, as in the Hestenes and Stiefel method for a quadratic function. From the above argument we have the following two corollaries.

Corollary (1):
The term g*_{k+1} defined in eq.(1) can also be written in the following form: …

Corollary (2):
The search direction defined in eq.(4) is parallel to the search direction given by the Hestenes and Stiefel algorithm for a quadratic function.

For a general function the formula reduces to the P/R and A/B algorithms, and inexact searches can be used, as shown in Corollary (2).

Numerical Results:
Several standard test functions were minimized to compare the new algorithm with standard CG algorithms. The same line search, based on cubic interpolation, was employed in each of the algorithms, and the algorithms were terminated when the norm of the gradient fell below 1×10⁻⁵. For all the algorithms we tabulate the number of function evaluations (NOF) and the number of iterations (NOI); overall totals of NOF and NOI are also given for each algorithm. In table (…) the new algorithm is better in (4) out of (12) cases, and in (5) cases the algorithms are comparable; in table (…) it is better in (6) out of (12) cases.
In table (4) we compare the new algorithm, using the (P/R) formula, with the standard CG (P/R) algorithm. It is obvious that the new algorithm improves on the standard (P/R) algorithm by about (80.2%) in NOI and (79.6%) in NOF.
In table (5) we present a numerical example showing that the suggested algorithm is quick and has better performance, since it requires less time to execute.