Composite Differential Search Algorithm

Differential search algorithm (DS) is a relatively new evolutionary algorithm inspired by the Brownian-like random-walk movement which is used by an organism to migrate. It has been verified to be more effective than ABC, JDE, JADE, SADE, EPSDE, GSA, PSO2011, and CMA-ES. In this paper, we propose four improved solution search algorithms, namely "DS/rand/1," "DS/rand/2," "DS/current to rand/1," and "DS/current to rand/2," to search the new space and enhance the convergence rate for the global optimization problem. In order to verify the performance of the different solution search methods, 23 benchmark functions are employed. Experimental results indicate that the proposed algorithms perform better than, or at least comparably to, the original algorithm when considering the quality of the solution obtained. However, these schemes still cannot achieve the best solution for all functions. In order to further enhance the convergence rate and the diversity of the algorithm, a composite differential search algorithm (CDS) is proposed in this paper. This new algorithm combines three of the proposed search schemes, "DS/rand/1," "DS/rand/2," and "DS/current to rand/1," with three control parameters, using a random method to generate the offspring. Experimental results show that CDS has a faster convergence rate and better search ability on the 23 benchmark functions.

Differential search algorithm (DS), developed by Civicioglu [7], is a population-based heuristic evolutionary algorithm inspired by the Brownian-like random-walk movement which is used by an organism to migrate. This algorithm has been used to find the optimal solution in numerous practical navigational, geodetic, and astro-geodetic problems. In [7], the statistical tests carried out to compare performances indicate that the problem-solving success of the DS algorithm on global optimization problems is better than that of ABC [14], JDE [15], JADE [16], SADE [17], EPSDE [18], GSA [19], PSO2011 [20], and CMA-ES [21]. However, there are still some limitations in this algorithm. It is good at exploring the search space and locating the region of the global minimum, but it is slow at exploiting the solution. Therefore, its convergence rate is also a problem in some cases. Accelerating the convergence rate and enhancing the exploitation ability of the algorithm have thus become two important goals in research on the algorithm. However, this field of study is still in its early days, and a good deal of future research is necessary in order to develop effective algorithms for optimization problems. In particular, to our knowledge, there is almost no work concerning an improved heuristic method for the DS algorithm.
In this paper, inspired by the mutation operation of the DE algorithm, we propose four improved solution search schemes to search the new space and enhance the convergence rate of the original algorithm. However, in some cases, these four schemes are trapped in local optima and cannot find the best solutions. In order to balance the exploration and exploitation of the original algorithm, this paper proposes a high-efficiency composite DS algorithm (CDS). The new algorithm combines three of the proposed search schemes with three control parameters in a random method to generate the offspring. Experiments have been conducted on 23 benchmark functions chosen from the previous literature. Experimental results indicate that our approach is effective and efficient. Compared with the individual search schemes, CDS performs better, or at least comparably, in terms of the quality of the final solutions and the convergence rate.
The rest of this paper is organized as follows. In Section 2 we review the basic DS. The proposed method is presented in Section 3. Benchmark problems and the corresponding experimental results are given in Section 4. In the last section we conclude this paper and point out some future research directions.

Differential Search Algorithm
Differential search algorithm (DS), developed by Civicioglu [7], is one of the most competitive evolutionary algorithms. The differential search algorithm is inspired by the migration of living beings, which constitute superorganisms, during climate changes of the year. In the DS algorithm, the search space is simulated as the food areas, and each point in the search space corresponds to an artificial-superorganism migration. The goal of this migration is to find the global optimal solution of the problem. During this process, the artificial-superorganism checks whether randomly selected positions can be retained temporarily. If such a tested position is suitable to be retained for some time, the artificial-superorganism uses this migration model to settle at the discovered position and then continues its migration from this position. The main steps of the DS algorithm are listed below.
After initialization, stopover vectors s_{i,G} are generated between the artificial-organisms; the movement can be described by a Brownian-like random walk model. In order to calculate the stopover vectors, the algorithm creates a stopover vector corresponding to each population individual (target vector) in the current population:

s_{i,G} = x_{i,G} + scale × (x_{r1,G} − x_{i,G}),

where r1 ∈ {1, ..., NP} is a randomly chosen integer with r1 ≠ i. scale controls the size of the change in the positions of the individuals of the artificial-organisms. Note that the value of scale is generated by a gamma random number generator controlled by a uniform random number between 0 and 1. The search of the stopover site is then carried out coordinate-wise by the individuals of the artificial-organisms of the superorganism:

s'_{i,j,G} = s_{i,j,G} if r_{i,j} = 1, and s'_{i,j,G} = x_{i,j,G} if r_{i,j} = 0,

where j = 1, ..., D; r_{i,j} is an integer that is either 1 or 0; and s'_{i,j,G} denotes the trial vector of the ith individual in the jth dimension at the Gth iteration.
The selection operation is used to choose the next population (i.e., G = G + 1) between the stopover-site population and the artificial-organism population:

x_{i,G+1} = s'_{i,G} if f(s'_{i,G}) ≤ f(x_{i,G}), and x_{i,G+1} = x_{i,G} otherwise.

The standard differential search algorithm is described in Procedure 1.
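The steps above can be sketched in Python as a single DS generation. This is an illustrative reading of the algorithm, not the authors' code: the participation map r is simplified to one uniform threshold (the full Procedure 1 branches on p1 and p2), and MATLAB's randg(2*rand)*(rand-rand) is approximated with NumPy's gamma sampler.

```python
import numpy as np

def ds_step(pop, fitness, func, low, high, p1=0.3):
    """One generation of basic DS (illustrative sketch, minimization)."""
    NP, D = pop.shape
    # Brownian-like step size, mirroring randg(2*rand)*(rand-rand)
    scale = np.random.gamma(2 * np.random.rand() + 1e-12) * (
        np.random.rand() - np.random.rand())
    # donor indices: a permutation with no fixed point, so donor != self
    idx = np.random.permutation(NP)
    while np.any(idx == np.arange(NP)):
        idx = np.random.permutation(NP)
    stopover = pop + scale * (pop[idx] - pop)
    # simplified participation map r: 1 = take stopover coordinate, 0 = keep old
    r = np.random.rand(NP, D) < p1
    trial = np.clip(np.where(r, stopover, pop), low, high)
    # greedy selection between stopover site and current position
    trial_fit = np.array([func(x) for x in trial])
    improved = trial_fit <= fitness
    pop[improved] = trial[improved]
    fitness[improved] = trial_fit[improved]
    return pop, fitness
```

Because the selection is greedy, the fitness of every individual is non-increasing from one generation to the next, which is what the selection equation above guarantees.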

Improved Approach
(1) begin
(2) Set the generation counter G = 0; randomly initialize a population of NP × D individuals P. Initialize the parameters p1, p2
(3) Evaluate the fitness of each individual in P
(4) while the stopping criterion is not satisfied do
(5)   scale = randg(2 * rand) * (rand − rand)
(6)   for i = 1 to NP do
(7)     randomly select r ≠ i
(8)     s_i = x_i + scale × (x_r − x_i)
(9)   end
(10)  r = rand(NP, D);
(11)  if rand < rand then
(12)    if rand < p1 then
(13)      for i = 1 to NP do
(14)        r(i,:) = r(i,:) < rand
(15)      end
(16)    else
(17)      for i = 1 to NP do
(18)        r(i, randi(D)) = 0
(19)      end
(20)    end
(21)  else
(22)    for i = 1 to NP do
(23)      ...

Similar to DE, four mutation schemes, "DS/rand/1," "DS/rand/2," "DS/current to rand/1," and "DS/current to rand/2," are proposed in this paper, where scale controls the size of the change in the positions of the individuals of the artificial-organisms. The search methods "DS/rand/1" and "DS/rand/2" are two strategies with stronger exploration capabilities that can effectively maintain population diversity. Compared with the other strategies, the search schemes "DS/current to rand/1" and "DS/current to rand/2" benefit from fast convergence by guiding the evolutionary search with a random target. However, these two strategies may lose diversity and global exploration ability. Compared with "DS/original/1," we can see the advantages of these four strategies: "DS/rand/1" and "DS/rand/2" are random enough for exploration, while "DS/current to rand/1" and "DS/current to rand/2" can guide the search in a random direction. In the experiment section, we use different functions to test these five schemes in order to show the effectiveness and efficiency of these strategies.
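The paper's own update equations for the four schemes are not reproduced in this excerpt; the sketch below writes them in the standard DE-style form that the scheme names follow. This is an assumption based on the naming convention, not the authors' verbatim formulas.

```python
import numpy as np

def mutate(pop, i, scale, scheme="DS/rand/1"):
    """Generate a stopover site for individual i under one of the four
    proposed schemes. Formulas follow DE naming conventions (assumed);
    requires a population of at least 6 individuals."""
    NP = len(pop)
    # five distinct random indices, all different from i
    idx = np.random.choice([k for k in range(NP) if k != i], 5, replace=False)
    r1, r2, r3, r4, r5 = (pop[j] for j in idx)
    x = pop[i]
    if scheme == "DS/rand/1":
        return r1 + scale * (r2 - r3)
    if scheme == "DS/rand/2":
        return r1 + scale * (r2 - r3) + scale * (r4 - r5)
    if scheme == "DS/current to rand/1":
        return x + scale * (r1 - x) + scale * (r2 - r3)
    if scheme == "DS/current to rand/2":
        return x + scale * (r1 - x) + scale * (r2 - r3) + scale * (r4 - r5)
    raise ValueError("unknown scheme: " + scheme)
```

The exploration/exploitation trade-off described above is visible in the formulas: the "rand" variants start from a random base vector r1, while the "current to rand" variants pull the current individual x toward a random target.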

Composite DS.
For successful application to optimization problems, a population-based optimizer should not only find the global optimum but also have a fast convergence speed. Based on the experimental results for the five search schemes, we find that the effectiveness of the differential search algorithm in solving global numerical problems depends on the selected search scheme and its parameters. However, different problems need different search schemes and different parameter values. From the experimental results in Section 4, we can see that the five search schemes show different advantages in various respects, such as diversity and convergence rate.
In order to achieve these goals and combine the advantages of the different schemes, a composite differential search algorithm (CDS) is proposed in this paper, which randomly combines several search schemes and the corresponding parameters to produce the new offspring. The flowchart of the CDS algorithm is shown in Figure 1. In this paper, we use three search schemes and three control parameters to construct the new algorithm.
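A sketch of the composite generation step, assuming (consistently with the description above and Figure 1) that each offspring draws one of the three schemes and one of three candidate control parameters uniformly at random; the `mutate` callable and the 0.3 * rand parameter values are taken from the text, the interface itself is hypothetical:

```python
import numpy as np

SCHEMES = ["DS/rand/1", "DS/rand/2", "DS/current to rand/1"]

def composite_offspring(pop, mutate, scales=None):
    """For every individual, pick a search scheme and a control parameter
    at random, then produce its offspring. `mutate` is a callable
    (pop, i, scale, scheme) -> candidate vector (hypothetical interface);
    `scales` holds three candidate control parameters, each 0.3 * rand
    by default, following the parameter setting in Section 4."""
    if scales is None:
        scales = [0.3 * np.random.rand() for _ in range(3)]
    offspring = np.empty_like(pop)
    for i in range(len(pop)):
        scheme = SCHEMES[np.random.randint(3)]
        scale = scales[np.random.randint(3)]
        offspring[i] = mutate(pop, i, scale, scheme)
    return offspring
```

Because the scheme and parameter are re-drawn per individual, no single strategy dominates a generation, which is how the composite method mixes the exploration of the "rand" schemes with the faster convergence of "DS/current to rand/1".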

Experimental Results
To evaluate the performance of our algorithm, we applied it to the 23 standard benchmark functions in [23]. These functions have been widely used in the literature. Since we do not make any modification to these functions, they are given in Table 1. The first seven functions are unimodal. Function f06 is the step function, which has one minimum and is discontinuous. Function f07 is a noisy quadratic function.
The next seven functions are multimodal test functions; for these, the number of local minima increases exponentially with the problem dimension. Finally, ten multimodal test functions with fixed dimension, which have only a few local minima, are used in our experimental study. Tables 1 and 2 give the details of these functions. These problems have been widely used as benchmarks by many researchers studying different methods. The algorithm is coded in MATLAB 7.9, and the experiments were run on a Pentium 3.0 GHz processor with 4.0 GB of memory.
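For illustration, three of the classical benchmarks in their conventional forms from the benchmark literature (a unimodal, a discontinuous, and a multimodal example); the paper's Table 1 should be treated as the authoritative definitions:

```python
import numpy as np

def f01_sphere(x):
    """Unimodal sphere function: sum of squares, global minimum 0 at the origin."""
    return float(np.sum(np.asarray(x, dtype=float) ** 2))

def f06_step(x):
    """Step function: discontinuous, flat plateaus, minimum value 0."""
    return float(np.sum(np.floor(np.asarray(x, dtype=float) + 0.5) ** 2))

def f09_rastrigin(x):
    """Rastrigin function: many local minima, global minimum 0 at the origin."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x) + 10))
```

The contrast between these shapes is exactly what the experiments probe: the sphere rewards fast exploitation, while Rastrigin's many local minima test the ability to escape poor local optima.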
In this experiment, we set the number of particles to 100, and we set p1 and p2 to 0.3 * rand. In this strategy, all vectors for the update rule are selected from the population at random; thus, there is no bias toward any special search direction, and new search directions are chosen in a random manner. The maximum number of fitness function evaluations is 100000, 300000, and 500000 for f01-f13 with 10, 30, and 50 dimensions, respectively, and 10000 for f14-f23. For all test functions, the algorithms carry out 30 independent runs, each starting from a random population with a different random seed. For the multimodal functions f08-f13, DS/rand/1 and DS/rand/2 can also find the optimal solution on these complex functions. DS/current to rand/1 and DS/current to rand/2 can provide near-optimal solutions on the multimodal functions; however, they perform a little worse than DS/rand/1 and DS/rand/2. For the 50-dimensional problems, the experimental results are shown in Table 5.

Sensitivities to Population Size.
The performance of DS is always sensitive to the selected population size. If the population is too small, the diversity of possible movements is poor, and the algorithm may easily be trapped in a local optimum. On the other hand, if the population size is too large, DS exhausts the fitness evaluations very quickly without being able to locate the optimum. Therefore, the choice of the best population size for DS is always critical for different problems.
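The sensitivity study that follows amounts to re-running the optimizer under several population sizes and comparing the final errors; a sketch of such a harness, where `run_ds(func, dim, NP, max_evals, seed)` is a hypothetical handle to the optimizer that returns the best fitness found:

```python
import numpy as np

def sensitivity_study(run_ds, func, dim, pop_sizes=(50, 100, 150),
                      runs=30, max_evals=300000):
    """Re-run the optimizer for each population size and report the mean
    and standard deviation of the final error over independent runs.
    `run_ds` is a hypothetical interface, not part of the paper."""
    report = {}
    for NP in pop_sizes:
        errors = [run_ds(func, dim, NP, max_evals, seed) for seed in range(runs)]
        report[NP] = (float(np.mean(errors)), float(np.std(errors)))
    return report
```

Holding the evaluation budget fixed while varying NP is the key design choice: a larger population spends the same budget on fewer generations, which is exactly the trade-off described above.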
To investigate the sensitivity of the proposed algorithm to variations in population size, some experiments were repeated for NP = 50 and NP = 150. The experimental results are given in Tables 6 and 7 for the five search schemes at dimension D = 30. For NP = 50, the performances of DS/rand/1 and DS/rand/2 are significantly superior to those of the other algorithms according to the experimental results; in particular, DS/rand/1 and DS/rand/2 are faster than DS/current to rand/1 and DS/current to rand/2 on these functions. For NP = 150, as shown in Table 7, DS/rand/1 and DS/rand/2 obtain significantly better performance than the other schemes on 11 functions.

Comparison of CDS with Enhanced Differential Search Algorithm

The performance of CDS is compared with that of the original DS and of DS with "DS/rand/1". In CDS, the population size is 40. The maximum number of fitness function evaluations is 100000, 300000, and 500000 for f01-f13 with 10, 30, and 50 dimensions, respectively. However, for f05, "DS/rand/1" can obtain a better solution than the other algorithms. Therefore, it is concluded that CDS is more effective than DS and "DS/rand/1" for high-dimensional classical benchmark functions. In particular, CDS exhibits an overall higher convergence speed and better robustness than the two competitors under some conditions. We can also conclude that the combination operator of these methods has the ability to accelerate DS, especially at higher dimensionality. In addition, the convergence graphs show that CDS improves the convergence characteristics of the original algorithm, regardless of dimension.

Comparison of CDS with Enhanced Differential Search Algorithm in Fixed Dimension

In this section, we compare our algorithm with the enhanced differential search algorithm on the fixed-dimension functions. The experimental results are listed in Table 9. As can be seen in Table 9, f14, f16, and f17 have only a few local minima, and the dimension of these functions is also small. In this case, it is hard to judge the performances of the individual algorithms; all algorithms were able to find optimal solutions for these functions. For f15, f18, f19, and f20, CDS can provide better solutions than DS and "DS/rand/1". For f21-f23, CDS provides the better solution in every case; the algorithm performs significantly better than DS and DS with "DS/rand/1". The graphs in Figure 13 show the convergence progress of the different search schemes and CDS for f15.

Conclusions
In this paper, we propose four different search schemes. Although these new schemes fail to find better solutions than the original algorithm on only a few functions, they have a faster convergence rate and better diversity than the original algorithm. In order to further enhance the exploitation ability of the algorithm, we combined three of the new schemes with three control parameters in a random method to construct a new algorithm (CDS). To verify the performance of CDS, 23 benchmark functions chosen from the literature are employed. The results show that the proposed CDS algorithm clearly outperforms the basic DS and the newly proposed schemes. In this paper, we only consider global optimization; the algorithm can be extended to solve other problems such as constrained optimization.

Figure 1: Flowchart of the CDS algorithm.

Figure 2: Comparison of performance of six algorithms for minimization of f01 with dimension 30.

Figure 3: Comparison of performance of six algorithms for minimization of f02 with dimension 30.

Figure 4: Comparison of performance of six algorithms for minimization of f04 with dimension 30.

Figure 7: Comparison of performance of six algorithms for minimization of f08 with dimension 30.

Figure 8: Comparison of performance of six algorithms for minimization of f09 with dimension 30.

Figure 9: Comparison of performance of six algorithms for minimization of f10 with dimension 30.

Figure 10: Comparison of performance of six algorithms for minimization of f11 with dimension 30.

Figure 11: Comparison of performance of six algorithms for minimization of f12 with dimension 30.

Figure 12: Comparison of performance of six algorithms for minimization of f13 with dimension 30.

Table 1: Benchmark functions used in our experimental study (high-dimensional functions).

Table 2: Benchmark functions used in our experimental study (fixed-dimension functions).

Search Schemes. To investigate the effect of the different search schemes on the effectiveness of the differential search algorithm, five search schemes, namely, DS/original/1, DS/rand/1, DS/rand/2, DS/current to rand/1, and DS/current to rand/2, are used in our experiments. The functions were studied at D = 10, D = 30, and D = 50. Representative convergence graphs for f01-f13 are shown in Figures 2-13. As can be seen in Table 3, for the 10-dimensional problems, it is interesting to note that DS/rand/1 outperforms DS/original/1 on the thirteen functions f01-f13. DS/rand/1 can find the global optimum on seven functions (f06, f08, f09, f10, f11, f12, and f13). On three functions (f01, f02, and f07), DS/rand/1 can find a near-global-optimum solution. For the rest of the problems, DS/rand/1 cannot find the best solutions within the maximum number of function evaluations. Compared with DS/original/1, DS/rand/2 gives a better solution on all functions; this scheme can also find the global optimum on the seven functions (f06, f08, f09, f10, f11, f12, and f13), but it cannot beat DS/rand/1. For f05, the DS/rand/2 search scheme provides a better solution than DS/rand/1. For f01, f02, f03, f04, and f06, DS/rand/1 has better search performance than DS/rand/2. The DS/current to rand/1 and DS/current to rand/2 schemes outperform the other search schemes in convergence speed. The experimental results for D = 30 are shown in Table 4; as can be seen there, DS/rand/1 provides the highest accuracy on functions f01, f02, f04, and f05. For f03, DS/current to rand/2 can obtain better solutions. For f06, all search schemes can find the optimal solution. DS/current to rand/1 performs better on function f07.

Table 3: Comparisons of different search schemes for 10 dimensions.

Table 4: Comparisons of different schemes for 30 dimensions.

Figure 6: Comparison of performance of six algorithms for minimization of f07 with dimension 30.

As shown in Table 5, when solving the unimodal optimization problems, DS/rand/1 gives a better solution than the other schemes for functions f01, f02, and f05. For f03 and f07, DS/current to rand/2 outperforms the other algorithms, but the results are still a little far from the global optimum. For f04, DS/rand/2 has a better solution. For the multimodal functions f08-f13 with many local minima, the final results are more important, because these functions reflect the algorithm's ability to escape from poor local optima and obtain the near-global optimum. DS/rand/1 and DS/rand/2 provide better solutions than the other algorithms except on f09. As can be seen in Tables 3-5, the results show that DS/rand/1 and DS/rand/2 perform much better than the other schemes in most cases.

As shown in Table 6, DS/rand/1 and DS/rand/2 are better than the other algorithms except on f03 and f07. For f12 and f13, all algorithms can locate the near-global optimum over all runs. When the population increases to NP = 100, DS/rand/1 and DS/rand/2 obtain better values than with NP = 50.

Table 5: Comparisons of different schemes for 50 dimensions.

Table 6: Comparisons of different schemes with population size 50.

Table 7: Comparisons of different schemes with population size 150.

Table 8: Continued.

Figure 13: Comparison of performance of six algorithms for minimization of f15 with fixed dimension.

The maximum number of function evaluations is 100000, 300000, and 500000 for f01-f13 with 10, 30, and 50 dimensions, respectively. The parameters p1 and p2 are set to 0.3 * rand. CDS inherits the strong points of the three search schemes. The mean and standard deviation results of CDS and the other algorithms are shown in Table 8. As can be seen in Table 8, CDS obtains much better results than the original DS and DS with "DS/rand/1" for all benchmarks with 10 dimensions. There is no dispute that more precise exploitation can enhance the performance of the algorithms. For the 30-dimensional problems, CDS has a very fast convergence rate and gives a better solution than the original DS and DS with "DS/rand/1" on 12 functions, all except f05; for f05, "DS/rand/1" provides better solutions than the original DS and CDS. For the 50-dimensional problems, both CDS and "DS/rand/1" could find the optimal solution on some functions (f06, f08, f09, f11, f12, and f13). In addition, CDS has much better performance than DS and DS with "DS/rand/1" on functions f01, f02, f03, f04, and f07.

Table 9: The mean and convergence iteration of the functions in Table 2.