Three Proposed Hybrid Genetic Algorithms

Genetic algorithms have been hybridized with classical optimization methods. The hybridization has been carried out in three ways: first, by using the conjugate gradient algorithm of Fletcher and Reeves; second, by using the steepest descent method; and third, by creating the initial population of the genetic algorithm from a conjugate gradient method. The numerical results were encouraging.


Introduction:
Optimization is an important tool for any problem involving decision making in many disciplines, such as engineering, mathematics, statistics, economics, and computer science. Now, more than ever, it is increasingly vital to have a firm grasp of the topic due to the rapid progress in computer technology, including the development and availability of high-speed and parallel processors and networks.
Mathematically speaking, optimization is the minimization or maximization of a function. If the function is subject to no constraints, it is called unconstrained optimization, with the mathematical formulation

min f(x), x in Rn,

where x in Rn is a real vector with n ≥ 1 and f : Rn → R is a smooth function. If the function is subject to constraints on its variables, it is called constrained optimization [9].

Genetic Algorithms
Genetic algorithms are search and optimization tools inspired by evolutionary processes. Their strength lies in their ability to evolve near-optimal solutions to complex problems.
Genetic algorithms, pioneered by John Holland about 40 years ago, combine selection, crossover, and mutation operators with the goal of finding the best solution to a problem. Genetic algorithms search for this optimal solution until a specified termination criterion is met. A genetic algorithm creates an initial population (a collection of chromosomes), evaluates this population, and then evolves the population through multiple generations in the search for a good solution to the problem at hand. Genetic algorithms follow the process shown in Figure (1) [4] and [6].

Figure (1). Structure of a Simple Genetic Algorithm
Genetic algorithms have recently been used in solving complicated scientific problems; some hybrid genetic algorithms are developed by combining genetic algorithms with gradient-based heuristic search strategies such as conjugate gradient [10].

Previous Works:
In 1998 Okamoto et al. proposed a hybrid genetic algorithm incorporating the modified Powell method for non-linear numerical optimization [8].
In 2003 Chelouah and Patrick proposed a new continuous hybrid algorithm combining a genetic algorithm and the Nelder-Mead simplex algorithm for continuous multimodal optimization problems [3].
In 2006 Kumar and Debroy used the conjugate gradient method and a hybrid optimization scheme involving the conjugate gradient method and a genetic algorithm to calculate the weights in a neural network model. The hybrid optimization scheme helped in finding optimal weights through a global search [5].
In 2007 Tantar et al. proposed a parallel hybrid genetic algorithm (GA) for solving the structure prediction problem. A conjugate gradient-based hill-climbing local search is combined with the GA in order to deal with the problem efficiently by using the computational grid [12].
In 2007 Tahk et al. proposed a hybrid optimization algorithm which combines an evolutionary algorithm and the gradient search technique for optimization with continuous parameters [11].
In 2008 Zhou-Shun et al. proposed an algorithm which combines the local search ability of the conjugate gradient method with the global search ability of the GA [14].
In 2009 Li et al. proposed a hybrid genetic algorithm which combines the conjugate gradient method with a genetic algorithm to improve the performance of the genetic algorithm for cable force optimization [7].
In 2010 Xie and Liu proposed a hybrid genetic algorithm for geophysical inversion. According to the properties of the genetic algorithm and the conjugate gradient algorithm, the method has the attributes of the global convergence of the genetic algorithm and the fast convergence of the conjugate gradient [13].

Conjugate Gradient Algorithm [1]:
Conjugate gradient (CG) methods represent an important class of unconstrained optimization algorithms. The main advantages of CG methods are their low memory requirements, their convergence speed, and their quadratic termination property, by which the method is able to locate the minimum of a quadratic function in a known finite number of iterations.
A non-linear conjugate gradient method generates a sequence {xk}, where k ≥ 0 is an integer. Starting from an initial point x0, the value of xk is calculated by the following equation:

xk+1 = xk + αk dk, (2)

where the positive step size αk > 0 is obtained by a line search, and the directions dk are generated as:

dk+1 = -gk+1 + βk dk, d0 = -g0, (3)

where gk is the gradient of f at xk, ||.|| is the Euclidean norm, and the value of βk, known as the conjugate gradient parameter, is determined according to the particular conjugate gradient (CG) algorithm. The termination conditions for the conjugate gradient line search are often based on some version of the Wolfe conditions. The standard Wolfe conditions are as follows:

f(xk + αk dk) ≤ f(xk) + δ αk gkT dk,
g(xk + αk dk)T dk ≥ σ gkT dk,

where 0 < δ < σ < 1. For the Fletcher-Reeves (FR) method, βk is defined by the formula:

βk = ||gk+1||2 / ||gk||2. (7)

Step5: Compute the new search direction defined by dk+1 = -gk+1 + βk dk, where βk is the conjugacy coefficient. If the termination condition is satisfied then stop; otherwise set k = k + 1 and go back to step 3.
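As a concrete illustration of eq. (2), eq. (3), and the FR formula, the following Python sketch implements the Fletcher-Reeves method. The backtracking (Armijo) line search, the steepest-descent restart safeguard, and the quadratic test function are assumptions for illustration, not the paper's MATLAB implementation.

```python
import numpy as np

def fr_conjugate_gradient(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimize f from x0 with Fletcher-Reeves CG:
    x_{k+1} = x_k + a_k d_k, d_{k+1} = -g_{k+1} + b_k d_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:          # termination condition
            break
        if g.dot(d) >= 0:                    # safeguard: restart if d is
            d = -g                           # not a descent direction
        alpha, c1 = 1.0, 1e-4                # backtracking (Armijo) search
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d                # eq. (2)
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # FR: ||g_{k+1}||^2 / ||g_k||^2
        d = -g_new + beta * d                # eq. (3)
        x, g = x_new, g_new
    return x

# Convex quadratic test function with its minimum at the origin
f = lambda x: x[0] ** 2 + 10 * x[1] ** 2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
x_min = fr_conjugate_gradient(f, grad, [3.0, -1.0])
```

The restart safeguard is one common practical choice; in exact arithmetic with an exact line search it is never triggered on a quadratic.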

Steepest Descent (SD) Method:
Cauchy's steepest descent method is one of the oldest, most widely known, and simplest methods for solving unconstrained minimization problems. The method of steepest descent is a fundamental first-order method. One advantage of the steepest descent method is that it requires only calculation of the first derivatives (the gradient), not second derivatives. However, it can be excruciatingly slow on difficult problems.
The SD method is the simplest method for solving non-linear optimization problems. It is a line search method that moves along the negative gradient direction at every step.
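A minimal Python sketch of the steepest descent method follows; the backtracking line search and the shifted sphere test function are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=5000):
    """Minimize f by moving along the negative gradient at every step,
    with a backtracking line search; only first derivatives are needed."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g                               # steepest descent direction
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x = x + alpha * d
    return x

# Shifted sphere function with its minimum at (1, -2)
f = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 2 * (x[1] + 2)])
x_min = steepest_descent(f, grad, [0.0, 0.0])
```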

Three Proposed Hybrid Genetic Algorithms:
This section contains three new hybrid genetic algorithms with conjugate gradient methods:

First Proposed Hybrid Genetic Algorithm: 1-Create Initial Population:
The population consists of a number of individuals determined by the algorithm designer according to the nature of the problem. In the first hybrid algorithm, the population is set to 12 individuals, which are generated randomly. Each chromosome consists of 12 decimal values.
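The population initialization above can be sketched as follows; the sampling range [-5, 5] is an assumption for illustration, since the paper does not state the bounds.

```python
import random

POP_SIZE = 12    # individuals, as in the first hybrid algorithm
CHROM_LEN = 12   # decimal values per chromosome

def create_initial_population(low=-5.0, high=5.0, seed=None):
    """Randomly generate POP_SIZE chromosomes of CHROM_LEN decimal values.
    The range [low, high] is an assumed sampling interval."""
    rng = random.Random(seed)
    return [[rng.uniform(low, high) for _ in range(CHROM_LEN)]
            for _ in range(POP_SIZE)]

population = create_initial_population(seed=1)
```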

2-Fitness Evaluation:
The fitness of a solution is a measure that can be used to compare solutions and determine which is better. The fitness functions in this work are the test functions defined in section 6.
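Fitness evaluation amounts to applying the objective to every chromosome; the sphere function below is only a stand-in for the actual test functions used in the paper.

```python
def evaluate_population(population, objective):
    """Return the objective value of every chromosome in the population.
    Lower is better here, since the test functions are minimized."""
    return [objective(chrom) for chrom in population]

sphere = lambda x: sum(v * v for v in x)   # stand-in test function
values = evaluate_population([[0, 0], [1, 2], [3, 0]], sphere)
```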

3-Selection:
In this phase, we have to create a new population from the current generation. The selection operation determines which parent chromosomes participate in producing offspring for the next generation. The most common way is to set the selection probability equal to

pi = f(xit) / Σj f(xjt),

where f(x) is the fitness value of the chromosome, i is the chromosome number, and t is the population number.
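This fitness-proportional (roulette wheel) rule can be sketched as below. Note the sketch assumes positive, higher-is-better fitness values; for the minimization problems in this paper the raw function value would first need to be transformed (e.g., inverted), a detail the paper does not specify.

```python
import random

def roulette_select(population, fitnesses, rng=random):
    """Pick one parent with probability p_i = f_i / sum_j f_j.
    Assumes all fitness values are positive (higher = fitter)."""
    total = sum(fitnesses)
    r = rng.uniform(0.0, total)
    cumulative = 0.0
    for chromosome, fit in zip(population, fitnesses):
        cumulative += fit
        if r <= cumulative:
            return chromosome
    return population[-1]   # guard against floating-point round-off

pop = [[0.0], [1.0], [2.0]]
fits = [1.0, 2.0, 7.0]
parent = roulette_select(pop, fits, rng=random.Random(0))
```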

4-Crossover Operator:
The crossover operator is applied to selected pairs of parents. Single-point crossover is the most basic crossover operator: a crossover point is selected randomly, and the two parent chromosomes are interchanged at this point.
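Single-point crossover can be sketched as follows; the chromosomes are treated as plain Python lists for illustration.

```python
import random

def single_point_crossover(parent1, parent2, rng=random):
    """Exchange the tails of two parents at a random cut point."""
    point = rng.randint(1, len(parent1) - 1)   # cut strictly inside
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

a, b = [1, 1, 1, 1], [2, 2, 2, 2]
c1, c2 = single_point_crossover(a, b)
```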

5-Mutation:
The most common way of implementing mutation is the order-changing mutation (which is used in this work): two positions are chosen randomly in the chromosome and their values are exchanged, with a very low probability (equal to 0.08 in this work).
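The order-changing mutation just described can be sketched as:

```python
import random

def order_changing_mutation(chromosome, p_mut=0.08, rng=random):
    """With probability p_mut, swap the values at two random positions
    (order-changing mutation; p_mut = 0.08 as in this work)."""
    child = list(chromosome)
    if rng.random() < p_mut:
        i, j = rng.sample(range(len(child)), 2)
        child[i], child[j] = child[j], child[i]
    return child

mutated = order_changing_mutation([1, 2, 3, 4], p_mut=1.0)  # force a swap
```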
After mutation, a new population has been created. Each chromosome in the new population is an initial point for the conjugate gradient method, and then the hybridization with conjugate gradient starts. The steps of conjugate gradient, eq. (2) and (3), are executed to generate a new population to start the genetic algorithm again, until the stopping criterion is satisfied.
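The hybridization step just described, in which every chromosome of the new population serves as a conjugate gradient starting point, can be sketched as follows. The fixed step size, the number of CG steps, and the sphere objective are illustrative assumptions.

```python
import numpy as np

def cg_refine(f, grad, x0, n_steps=3, alpha=0.01):
    """Refine one chromosome with a few Fletcher-Reeves CG steps
    (fixed step size for simplicity; a line search could be used)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(n_steps):
        x = x + alpha * d                    # eq. (2)
        g_new = grad(x)
        beta = g_new.dot(g_new) / g.dot(g)   # FR parameter
        d = -g_new + beta * d                # eq. (3)
        g = g_new
    return x

def hybrid_generation(population, f, grad):
    """One hybridization pass: every chromosome of the GA population
    becomes an initial point for conjugate gradient refinement."""
    return [cg_refine(f, grad, chrom) for chrom in population]

f = lambda x: float(np.sum(x ** 2))
grad = lambda x: 2 * x
refined = hybrid_generation([np.ones(4), 2 * np.ones(4)], f, grad)
```

The refined vectors then re-enter the GA loop as the next generation's population.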

6-Stop Criterion:
The stopping criterion in the three hybrid genetic algorithms decides whether the algorithm continues searching or stops. The stop criterion here depends on two approaches: the number of generations or the minimum function value.

• Outlines of the Second Proposed Hybrid Algorithm:
Step5: If k = 1 then d1 = -g1, where g is the gradient vector; else continue.
Step6: xk+1 = xk + α dk, where α is a step size equal to 0.001.
Step7: Calculate the fitness function for each chromosome in the population, then find the minimum function value.
Step8: If k = n or the minimum function value is very small, then stop and print the minimum function value; otherwise continue.
Step9: dk+1 = -gk+1 + βk dk, where βk is the FR formula as defined in eq. (7); go to step 3.

Figure (3). Flow Chart for the Second Hybrid Algorithm

Numerical Results and Discussion:
In order to assess the performance of the three new hybrid algorithms, some generalized, well-known test functions are selected: the Powell function, the Osp (Oren and Spedicato) function, and the Diagonal4 function.
All programs are written in MATLAB version R2007a. The comparative performances of all these hybrid algorithms are evaluated by considering the minimum function value or the number of generations.
The initial population is randomly chosen in the first and second hybrid algorithms, so the first stopping criterion, the number of generations, is used to determine the best minimum for each test function. After determining the best minimum for each function, the second stopping criterion, the best minimum, is used to determine the number of generations needed to reach the best minimum.
Fifty trials are carried out for each test function in each hybrid algorithm, and the following numerical results for each hybrid algorithm and each test function are concluded from these trials.
A comparison between the three new hybrid genetic algorithms and the genetic algorithm is shown in Table (1) with respect to the minimum value. In Table (1), HGA1, HGA2, and HGA3 refer to the first, second, and third proposed hybrid genetic algorithms.

2-Osp (Oren and Spedicato) Function:
The first stop criterion is a number of generations equal to 5000 generations. The second stop criterion is the minimum value for this function. Using βk as defined in eq. (7), the results reach the minimum.

3-Diagonal4 Function:
The first stop criterion is a number of generations equal to 5000 generations, and the minimum values for this function out of 50 trials are recorded. The second stop criterion is when the minimum value for this function is min ≤ 1E-12. Using βk as defined in eq. (7), one trial reaches the minimum value (1E-08); these trials reached this minimum value in 5000 generations. The results of the rest of the trials (35 trials) reached the minimum.
The minimized value vectors in these different dimensions, plus the initial point for this test function (3, -1, 0, 1, …), are used to create the initial population (12 chromosomes) for the third hybrid method.
After two iterations, the third hybrid method gave the minimum (7.540316246596e-009) for this test function.

2-Osp (Oren and Spedicato) Function:
The conjugate gradient method with this test function has been executed in different dimensions. The minimized value for this function in all these dimensions is about (1.0E-06).
The minimized value vectors in these different dimensions, plus the initial point for this test function (1, …), are used to create the initial population (12 chromosomes) for the third hybrid method.
After two iterations, the third hybrid method gave the minimum (4.836750314048e-024) for this test function.
3-Diagonal4 Function:
The conjugate gradient method with this test function has been executed in different dimensions. The minimized value for this function in all these dimensions is about (1.0E-07 to 1.0E-09).
The minimized value vectors in these different dimensions, plus the initial point for this test function (1, 1, …), are used to create the initial population (12 chromosomes) for the third hybrid method.
After one iteration, the third hybrid method gave the minimum (1.780114834221e-011) for this test function.
• Outlines of the First Proposed Hybrid Algorithm:
Step1: Create initial population randomly; k = 0, 1, 2, …, n.
Step2: Calculate the fitness function for each chromosome in the population.
Step3: [New population] Create a new population by repeating the following steps until the new population is complete:
• (Selection) Select two parent chromosomes from the population according to their fitness.
• (Crossover/recombination) Cross over the parents to form new offspring (children).
• (Mutation) Mutate the new offspring by using order change at a random position.
Step4: Let xk = new population; set k = k + 1.
The minimum values for this function out of 50 trials are: six trials out of 50 failed; two trials reach the minimum.

Osp (Oren and Spedicato) Function:
The minimum values for this function in 5000 generations are the same as those of the first hybrid algorithm; in 50 trials, twenty-six trials reach the minimum.

Numerical Results for the Third Proposed Hybrid Algorithm: 1-Powell Function:
The conjugate gradient method with this test function has been executed in 11 different dimensions. The minimized values for this function, in all these dimensions, are about (1.0E-06 to 1.0E-07).

Table (1). Comparison Between Three Proposed Hybrid Genetic Algorithms and Original Genetic Algorithm.