Keywords: unconstrained optimization


A New Formula for Conjugate Gradient in Unconstrained Optimization

Hussein A. Wali; Khalil K. Abbo

AL-Rafidain Journal of Computer Sciences and Mathematics, 2020, Volume 14, Issue 1, Pages 41-52
DOI: 10.33899/csmj.2020.164798

The conjugate gradient method is an important class of unconstrained optimization methods with global convergence properties. In this research, a new formula for the conjugacy coefficient is derived based on a linear structure. The new method satisfies the descent condition. In addition, the global convergence of the new method is established under the Wolfe line search conditions. Finally, numerical results are presented that show the effectiveness of the proposed method.
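
Since the paper's coefficient formula is not reproduced in the abstract, the sketch below shows the generic nonlinear CG iteration under a Wolfe line search, with the classical Fletcher-Reeves coefficient as a stand-in; beta_fn is where a new conjugacy formula would plug in:

```python
import numpy as np
from scipy.optimize import line_search  # Wolfe-condition line search

def cg_minimize(f, grad, x0, beta_fn, tol=1e-6, max_iter=500):
    """Generic nonlinear CG; beta_fn(g_new, g_old, d) supplies the
    conjugacy coefficient (the paper derives its own formula)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                     # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]  # step satisfying the Wolfe conditions
        if alpha is None:                      # line search failed: restart
            d, alpha = -g, 1e-4                # small fixed step as a fallback
        x_new = x + alpha * d
        g_new = grad(x_new)
        d = -g_new + beta_fn(g_new, g, d) * d  # CG direction update
        x, g = x_new, g_new
    return x

# Classical Fletcher-Reeves coefficient as a placeholder:
beta_fr = lambda g_new, g_old, d: (g_new @ g_new) / (g_old @ g_old)
```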
 

New Scaled Proposed Formulas for Conjugate Gradient Methods in Unconstrained Optimization

Abbas Y. Al-Bayati; Marwan S. Jameel

AL-Rafidain Journal of Computer Sciences and Mathematics, 2014, Volume 11, Issue 2, Pages 25-46
DOI: 10.33899/csmj.2014.163748

In this paper, three efficient Scaled Nonlinear Conjugate Gradient (CG) methods for solving unconstrained optimization problems are proposed. These algorithms are implemented with inexact line searches (ILS). The Powell restarting criterion is applied to all these algorithms and yields dramatic savings in computational cost. The global convergence results of these algorithms are established under the strong Wolfe line search condition. Numerical results show that our proposed CG-algorithms are efficient and robust compared with the standard Fletcher-Reeves (FR) and Polak-Ribière (PR) CG-algorithms on 35 nonlinear test functions.
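
The Powell restarting criterion applied here is standard: restart with the steepest-descent direction whenever successive gradients are far from orthogonal. A minimal sketch (the 0.2 threshold is the conventional choice, not necessarily the authors'):

```python
import numpy as np

def powell_restart(g_new, g_old, threshold=0.2):
    """Powell's criterion: consecutive gradients should be nearly
    orthogonal; if not, discard the old direction and restart."""
    return abs(g_new @ g_old) >= threshold * (g_new @ g_new)

# Inside a CG loop:
#   d = -g_new if powell_restart(g_new, g) else -g_new + beta * d
```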
 

New Conjugacy Coefficient for Conjugate Gradient Method for Unconstrained Optimization

Hamsa TH. Chilmeran; Huda Y. Najm

AL-Rafidain Journal of Computer Sciences and Mathematics, 2013, Volume 10, Issue 2, Pages 33-46
DOI: 10.33899/csmj.2013.163473

In this paper, we derive a new conjugacy coefficient for the conjugate gradient method based on a non-linear function, using inexact line searches. The method satisfies the sufficient descent condition, and global convergence is established. The numerical results indicate that the new approach is very effective in terms of the number of iterations and the number of function evaluations.
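
The sufficient descent condition referred to is the standard requirement g_k^T d_k <= -c ||g_k||^2 for some constant c > 0; a minimal check (function name and the value of c illustrative):

```python
import numpy as np

def sufficient_descent(g, d, c=0.1):
    """Check the sufficient descent condition g^T d <= -c * ||g||^2."""
    return g @ d <= -c * (g @ g)
```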
 

A New Preconditioned Inexact Line-Search Technique for Unconstrained Optimization

Abbas Y. Al-Bayati; Ivan S. Latif

AL-Rafidain Journal of Computer Sciences and Mathematics, 2012, Volume 9, Issue 2, Pages 25-39
DOI: 10.33899/csmj.2012.163698

In this paper, we study the global convergence properties of a new class of preconditioned conjugate gradient descent algorithms when applied to convex non-linear unconstrained optimization problems.
We assume that a new inexact line search rule, similar to the Armijo rule, is used. It provides an estimation formula for choosing a large step-size at each iteration, and the same formula is used to determine the search direction. A new preconditioned conjugate gradient search direction replaces the conjugate gradient descent direction of the ZIR-algorithm. Numerical results on twenty-five well-known test functions with various dimensions show that the new inexact line search and the new preconditioned conjugate gradient search directions are efficient for solving unconstrained nonlinear optimization problems in many situations.
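
For reference, the classical Armijo backtracking rule that the new rule resembles looks as follows; the paper's estimation formula for choosing a larger step-size is not reproduced here:

```python
import numpy as np

def armijo_step(f, g, x, d, alpha0=1.0, rho=0.5, c1=1e-4):
    """Classical Armijo backtracking: shrink alpha until the
    sufficient-decrease condition f(x+ad) <= f(x) + c1*a*g^T d holds."""
    fx, slope = f(x), g @ d    # slope must be negative (descent direction)
    alpha = alpha0
    while f(x + alpha * d) > fx + c1 * alpha * slope:
        alpha *= rho
    return alpha
```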
 

New Variable Metric Algorithm by Means of a 2nd-Order Quasi-Newton Condition

Abbas Y. Al-Bayati; Runak M. Abdullah

AL-Rafidain Journal of Computer Sciences and Mathematics, 2011, Volume 8, Issue 2, Pages 35-41
DOI: 10.33899/csmj.2011.163639

In this paper, a new class of quasi-Newton updates for solving unconstrained nonlinear optimization problems is proposed. We suggest a new formula for the variable metric update, with a new quasi-Newton condition used for the symmetric rank-two formula.
Finally, a numerical study is reported in which the performance of this new algorithm is compared to that of various members of the unmodified family. Numerical experiments indicate that the new algorithm is effective and superior to the standard BFGS and DFP algorithms with respect to the number of function evaluations (NOF) and the number of iterations (NOI).
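
For context, the classical symmetric rank-two (BFGS) update that enforces the standard quasi-Newton (secant) condition is sketched below; the paper's second-order condition and modified formula are not reproduced:

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard rank-two BFGS update of the Hessian approximation B,
    enforcing the secant (quasi-Newton) condition B_new @ s = y,
    where s = x_new - x_old and y = g_new - g_old."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
```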
 

A New Symmetric Rank One Algorithm for Unconstrained Optimization

Abbas Y. Al-Bayati; Salah G. Shareef

AL-Rafidain Journal of Computer Sciences and Mathematics, 2011, Volume 8, Issue 2, Pages 13-19
DOI: 10.33899/csmj.2011.163637

In this paper, a new symmetric rank-one algorithm for unconstrained optimization problems is presented. The update matrix it generates is symmetric and positive definite, produces descent directions, and satisfies a QN-like condition. The new method is tested numerically on seven nonlinear test functions and compared with the standard BFGS algorithm.
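
The classical symmetric rank-one (SR1) update, with the standard denominator safeguard, is sketched below as a baseline; the paper's new variant is not reproduced:

```python
import numpy as np

def sr1_update(B, s, y, eps=1e-8):
    """Symmetric rank-one update of the Hessian approximation B;
    skipped when the denominator is too small, the standard SR1 safeguard."""
    r = y - B @ s
    denom = r @ s
    if abs(denom) < eps * np.linalg.norm(r) * np.linalg.norm(s):
        return B                        # skip the update to avoid instability
    return B + np.outer(r, r) / denom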
 

Parallel Direct Search Methods

Bashir M. Khalaf; Mohammed W. Al-Neama

AL-Rafidain Journal of Computer Sciences and Mathematics, 2010, Volume 7, Issue 3, Pages 51-60
DOI: 10.33899/csmj.2010.163909

Mostly, minimization or maximization of a function is very expensive, since each evaluation of the objective function requires considerable time. Hence, our objective in this work is the development of parallel algorithms for minimizing objective functions whose evaluation takes a long computing time. The basis of the developed parallel algorithms is the evaluation of the objective function at various points at the same time (i.e., simultaneously).
We consider in this work the parallelization of direct search methods, as these methods are insensitive to noise and globally convergent. We have developed two algorithms; they depend mainly on the Hooke and Jeeves method in unconstrained optimization.
The developed parallel algorithms are suitable for running on MIMD machines, which consist of several processors operating independently; each processor has its own memory and communicates with the others through a suitable network.
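
The core idea, evaluating the objective at several trial points simultaneously, maps naturally onto the exploratory moves of the Hooke and Jeeves pattern search. A minimal sketch using a process pool (the function and helper names here are illustrative, not the authors'):

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def exploratory_points(x, h):
    """Trial points of a Hooke-Jeeves exploratory move: x and x +/- h*e_i."""
    pts = [x]
    for i in range(len(x)):
        for sign in (+1.0, -1.0):
            p = x.copy()
            p[i] += sign * h
            pts.append(p)
    return pts

def parallel_best(f, x, h, workers=4):
    """Evaluate all trial points simultaneously (MIMD-style) and keep the best.
    Note: f must be a top-level (picklable) function for a process pool."""
    pts = exploratory_points(np.asarray(x, dtype=float), h)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        vals = list(pool.map(f, pts))
    return pts[int(np.argmin(vals))]
```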
 
 

An Efficient Line Search Algorithm for Large Scale Optimization

Abbas Y. Al-Bayati; Ivan S. Latif

AL-Rafidain Journal of Computer Sciences and Mathematics, 2010, Volume 7, Issue 1, Pages 35-49
DOI: 10.33899/csmj.2010.163845

In this work we present a new algorithm of gradient descent type, in which the stepsize is computed by means of a simple approximation of the Hessian matrix, to solve nonlinear unconstrained optimization problems. The proposed algorithm considers a new approximation of the Hessian based on the function values and gradients at two successive points along the iterations, one of which uses Biggs' modified formula to locate the new points. The corresponding algorithm belongs to the same class of superlinearly convergent descent algorithms, and it has been newly programmed to obtain numerical results for a selected class of nonlinear test functions with various dimensions. Numerical experiments show that the new choice of step-length requires less computational work and greatly speeds up the convergence of the gradient algorithm, especially for large-scale unconstrained optimization problems.
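
A scalar secant approximation built from two successive iterates is the simplest instance of this idea (the Barzilai-Borwein step); the Biggs-based variant the paper proposes is not reproduced here. A sketch:

```python
import numpy as np

def secant_stepsize(x, x_prev, g, g_prev, fallback=1e-3):
    """Scalar Hessian approximation from two successive points:
    h ~ s^T y / s^T s, giving the stepsize alpha = 1/h = s^T s / s^T y
    (a Barzilai-Borwein-type step)."""
    s, y = x - x_prev, g - g_prev
    sy = s @ y
    return (s @ s) / sy if sy > 0 else fallback
```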
 
 

New Conjugacy Condition with Pair-Conjugate Gradient Methods for Unconstrained Optimization

Abbas Y. Al-Bayati; Huda I. Ahmed

AL-Rafidain Journal of Computer Sciences and Mathematics, 2009, Volume 6, Issue 3, Pages 21-35
DOI: 10.33899/csmj.2009.163818

Conjugate gradient methods are widely used for unconstrained optimization, especially when the dimension is large. In this paper we propose a new kind of nonlinear conjugate gradient method which builds on the study of Dai and Liao (2001); the new idea is to use a pair conjugate gradient method with this study's new conjugacy condition, which considers an inexact line search scheme but reduces to the old one when the line search is exact. Convergence analysis for this new method is provided. Our numerical results show that the new method is very efficient on the given ten test functions compared with other methods.
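
The Dai-Liao (2001) conjugacy condition d_{k+1}^T y_k = -t g_{k+1}^T s_k leads to the coefficient sketched below, which reduces to the Hestenes-Stiefel coefficient when t = 0; the pair-CG extension proposed here is not reproduced:

```python
import numpy as np

def beta_dai_liao(g_new, y, s, d, t=0.1):
    """Dai-Liao coefficient: beta = g_new^T (y - t*s) / (d^T y),
    with s = x_new - x_old and y = g_new - g_old."""
    return g_new @ (y - t * s) / (d @ y)
```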
 

A New Restarting Criterion for FR-CG Method with Exact and Inexact Line Searches

Maha S. Younis

AL-Rafidain Journal of Computer Sciences and Mathematics, 2008, Volume 5, Issue 2, Pages 95-110
DOI: 10.33899/csmj.2008.163975

A new restarting criterion for the FR-CG method is derived and investigated in this paper. This criterion is globally convergent whenever the line search fulfills the Wolfe conditions. Our numerical tests and comparisons with the standard FR-CG method for large-scale unconstrained optimization are given, showing significant improvements.
 

New Secant Hyperbolic Model for Conjugate Gradient Method

Abbas Y. Al-Bayati; Ban Ahmed Mitras

AL-Rafidain Journal of Computer Sciences and Mathematics, 2008, Volume 5, Issue 2, Pages 11-18
DOI: 10.33899/csmj.2008.163967

A new hyperbolic model, different from the quadratic ones, is proposed for solving unconstrained optimization problems; it modifies the classical conjugate gradient method. This new model was compared with established methods over a variety of standard non-linear test functions. The numerical results show that the use of a non-quadratic model is beneficial in most of the problems considered, especially when the dimensionality of the problems increases.
 

A New Family of Spectral CG-Algorithms

Abbas Y. Al-Bayati; Runak M. Abdullah

AL-Rafidain Journal of Computer Sciences and Mathematics, 2008, Volume 5, Issue 1, Pages 69-80
DOI: 10.33899/csmj.2008.163950

A new family of CG-algorithms for large-scale unconstrained optimization is introduced in this paper using spectral scaling for the search directions, which is a generalization of the spectral gradient method proposed by Raydan [14].
Two modifications of the method are presented: one uses the Barzilai line search, and the other takes a prescribed scaling at each iteration expressed in terms of the step-size. In both cases the Wolfe conditions are tested; eleven test problems with different dimensions are used to compare these algorithms against the well-known Fletcher-Reeves CG-method, obtaining robust numerical results.
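
Raydan's spectral gradient scales the gradient term by the Barzilai-Borwein parameter theta = s^T s / s^T y; in a spectral CG direction this scaling enters as sketched below (a generic form, not necessarily the exact family proposed here):

```python
import numpy as np

def spectral_direction(g_new, s, y, beta, d):
    """Spectral CG direction: the gradient term is scaled by the
    Barzilai-Borwein parameter theta = s^T s / s^T y (Raydan)."""
    theta = (s @ s) / (s @ y)
    return -theta * g_new + beta * d
```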
 

Investigation on Scaled CG-Type Algorithms for Unconstrained Optimization

Abbas Y. Al-Bayati; Khalil K. Abbo; Salah G. Shareef

AL-Rafidain Journal of Computer Sciences and Mathematics, 2007, Volume 4, Issue 2, Pages 11-23
DOI: 10.33899/csmj.2007.164012

In this paper, we describe two new algorithms which are modifications of the Hestenes-Stiefel CG-method. The first is a scaled CG-method which improves the search direction by multiplying it by a scalar obtained from the function values and gradients at two successive points along the iterations. The second is a preconditioned CG-method which uses an approximation of the Hessian of the minimized function. These algorithms are not sensitive to the line searches. Numerical experiments indicate that these new algorithms are effective and superior, especially for increasing dimensionality.
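
For reference, the Hestenes-Stiefel coefficient being modified is the standard one below (a baseline sketch, not the authors' scaled variant):

```python
import numpy as np

def beta_hestenes_stiefel(g_new, g_old, d):
    """Hestenes-Stiefel coefficient: beta = g_new^T y / (d^T y),
    with y = g_new - g_old."""
    y = g_new - g_old
    return g_new @ y / (d @ y)
```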
 

On Self-Scaling Variable-Metric Algorithms

Abbas Y. Al-Bayati; Mardin Sh. Taher

AL-Rafidain Journal of Computer Sciences and Mathematics, 2007, Volume 4, Issue 1, Pages 11-18
DOI: 10.33899/csmj.2007.163992

In this paper, we have developed a new self-scaling VM-method for solving unconstrained nonlinear optimization problems. The numerical and theoretical results demonstrate the general effectiveness of the new self-scaling VM-method when compared with the Phua and Zeng algorithm; we have tested these algorithms on several high-dimension test functions with promising numerical results.
 

Enriched Algorithms for Large Scale Unconstrained Optimization

Abbas Y. Al-Bayati; Omar B. Mohammad

AL-Rafidain Journal of Computer Sciences and Mathematics, 2007, Volume 4, Issue 1, Pages 11-37
DOI: 10.33899/csmj.2007.164001

A new method for solving large-scale unconstrained optimization problems, based on the BFGS method, is proposed in this research.
A limited-memory version of the BFGS method is used: the BFGS matrix is multiplied by a vector so that only vectors, rather than matrices, are computed, and only two vector pairs need to be stored, modifying the algorithm given by Nocedal J. (1999).
The purpose of this algorithm is to enable us to solve large-scale problems, since a computer can store millions of vectors whereas its capacity for storing matrices is limited.
The method is applied to seven nonlinear functions in order to evaluate its efficiency in terms of the number of iterations (NOI), the number of function evaluations (NOF), and the function value, compared with the standard BFGS method after updating.
The method has been applied to functions with up to 1,000,000 variables and more.
Comparing the results, we find that this algorithm performs best.
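
The matrix-free mechanism described, multiplying the BFGS matrix by a vector while storing only a few vector pairs, is what the standard L-BFGS two-loop recursion of Nocedal implements. A sketch (call it with the two most recent (s, y) pairs to match the memory-2 scheme described):

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Two-loop recursion: computes -H*g using only the stored
    (s, y) vector pairs, never forming the BFGS matrix (Nocedal)."""
    q, alphas = g.copy(), []
    for s, y in zip(reversed(s_list), reversed(y_list)):  # newest pair first
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if s_list:                          # initial scaling H0 = (s^T y / y^T y) I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest first
        b = (y @ q) / (y @ s)
        q += (a - b) * s
    return -q                           # search direction -H*g
```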
 

Modifying the CG-Algorithm for Unconstrained Non-Linear Optimization by Using Oren's Update

Abbas Y. Al-Bayati; Abdulghafor M. Al-Rozbayani

AL-Rafidain Journal of Computer Sciences and Mathematics, 2005, Volume 2, Issue 2, Pages 11-19
DOI: 10.33899/csmj.2005.164078

In this paper we have developed a new extended generalized conjugate gradient algorithm with self-scaling variable metric updates for unconstrained optimization. The new proposed algorithm is based on inexact line searches and is examined using different non-linear test functions with various dimensions.
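
Oren's update rescales the variable metric before applying the quasi-Newton correction. The standard Oren-Luenberger factor is sketched below as an assumption of what the self-scaling refers to; the paper's exact extended variant is not reproduced:

```python
import numpy as np

def oren_scale(H, s, y):
    """Oren-Luenberger self-scaling: multiply the inverse-Hessian
    approximation H by tau = s^T y / (y^T H y) before the VM update."""
    tau = (s @ y) / (y @ H @ y)
    return tau * H
```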
 

Investigation on Self-Scaling of Mixed Interior-Exterior Method for Non-Linear Optimization

Ghada M. Al-Naemi

AL-Rafidain Journal of Computer Sciences and Mathematics, 2005, Volume 2, Issue 2, Pages 73-85
DOI: 10.33899/csmj.2005.164085

In this paper, we have investigated self-scaling sequential unconstrained minimization techniques (SUMT). Our new modified version of the CG-method and QN-method proves very effective when compared with other established algorithms for solving standard constrained optimization problems.
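
A SUMT formulation replaces the constrained problem with a sequence of unconstrained ones. A minimal sketch of a mixed interior-exterior function, with f and the constraint functions hypothetical, minimized repeatedly as r decreases toward zero:

```python
def sumt_objective(f, ineqs_in, ineqs_out, r):
    """Mixed SUMT function: a barrier term 1/g for constraints held
    strictly feasible (interior, g_i(x) > 0) and a squared-violation
    penalty for the remaining constraints (exterior)."""
    def phi(x):
        barrier = sum(1.0 / gi(x) for gi in ineqs_in)
        penalty = sum(min(gj(x), 0.0) ** 2 for gj in ineqs_out)
        return f(x) + r * barrier + penalty / r
    return phi

# Solve a sequence of unconstrained minimizations of phi with r -> 0.
```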
 

New Hybrid CG Algorithm Based on PR and FR Steps

Abbas Y. Al-Bayati; Khalil K. Abbo; Asma M. Abdalah

AL-Rafidain Journal of Computer Sciences and Mathematics, 2005, Volume 2, Issue 1, Pages 27-38
DOI: 10.33899/csmj.2005.164065

In this paper, a new hybrid conjugate gradient algorithm is proposed for unconstrained optimization. This algorithm combines the desirable computation aspects of Polak-Ribière steps and useful theoretical features of Fletcher-Reeves CG-steps. Computational results for this algorithm are given and compared with those of the Fletcher and Polak standard CG methods showing a considerable improvement over the latter two methods.