In the differential evolution (DE) algorithm, depending on the characteristics of the problem at hand and the available computational resources, different strategies combined with different parameter settings may be effective. In addition, a single, well-tuned combination of strategies and parameters may not guarantee optimal performance, because different strategies combined with different parameter settings can be appropriate during different stages of the evolution. Therefore, various adaptive and self-adaptive techniques have been proposed to adapt the DE strategies and parameters during the course of evolution. In this paper, we propose a new parameter adaptation technique for DE based on an ensemble approach and the harmony search (HS) algorithm. In the proposed method, an ensemble of parameters is randomly sampled to form the initial harmony memory. The parameter ensemble evolves during the course of the optimization process through the HS algorithm. Each parameter combination in the harmony memory is evaluated by testing it on the DE population. The performance of the proposed adaptation method is evaluated using two recently proposed strategies (DE/current-to-pbest and DE/current-to-gr_best).
During the last decade, evolutionary algorithms (EAs) inspired by the Darwinian theory of evolution have become increasingly popular because of their ability to handle nonlinear and complex optimization problems. Unlike conventional numerical optimization methods, EAs are population-based metaheuristic algorithms that require only objective function values, while properties such as differentiability and continuity are not necessary. However, EA performance depends on the encoding scheme, evolutionary operators, and parameter settings such as population size, mutation scale factor, and crossover rate. In addition, appropriate parameter selection is problem dependent and requires a time-consuming trial-and-error parameter tuning process. Trial-and-error parameter selection is ineffective if the optimization must run in an automated environment or if the user has no experience in the fine art of control parameter tuning. To overcome this, different parameter adaptation schemes have been presented [
Differential evolution (DE) [
In [
Harmony search (HS) is also a population-based metaheuristic optimization algorithm, one which mimics the music improvisation process. Recently, HS has been gaining significance as an efficient optimization algorithm and is used in a variety of applications. In HS, the generation of a new vector or solution is based on the consideration of all the existing vectors, rather than only two vectors as in DE (parent and mutant vector) [
During the past decade, hybridization of EAs has gained significance due to the ability of hybrid algorithms to complement each other’s strengths and overcome the drawbacks of the individual algorithms. To enhance the exploitation ability in HS [
In this paper, we propose a DE parameter adaptation technique based on the HS algorithm. In the proposed adaptation method, a group of DE control parameter combinations is randomly initialized. These randomly initialized combinations form the initial harmony memory (HM) of the HS algorithm. Each parameter combination present in the HM is evaluated by testing it on the DE population during the evolution. Based on the effectiveness of the DE parameter combinations present in the HM, the HS algorithm evolves the parameter combinations. At any given point during the evolution of the DE population, the HM thus contains an ensemble of DE parameters suited to the current stage of the evolution process.
The rest of the paper is organized as follows. Section
Differential evolution (DE) is a simple real-parameter optimization algorithm that belongs to the class of evolutionary algorithms (EAs) and involves the continuous application of mutation, crossover, and selection operators. DE starts with a randomly initialized population of parameter vectors.
The initial population is uniformly sampled from the search space constrained by the prescribed minimum and maximum parameter bounds
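This uniform initialization can be sketched as follows (a minimal NumPy sketch; the function name, bounds, and population size are illustrative assumptions, not values from the paper):

```python
import numpy as np

def initialize_population(NP, lb, ub, seed=0):
    """Uniformly sample NP vectors within the bounds [lb, ub]."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    # Each component j of each vector is drawn from U(lb[j], ub[j]).
    return lb + rng.random((NP, lb.size)) * (ub - lb)

pop = initialize_population(NP=5, lb=[-5.0, -5.0], ub=[5.0, 5.0])
```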
After initialization, corresponding to each target vector
“DE/rand/1” [
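The “DE/rand/1” mutation builds a mutant for each target vector i from three mutually distinct population members, none equal to i, scaled by the factor F. A minimal sketch (helper names and settings are illustrative):

```python
import numpy as np

def de_rand_1(pop, i, F, rng):
    """DE/rand/1: v = x_r1 + F * (x_r2 - x_r3), with r1, r2, r3 distinct and != i."""
    NP = len(pop)
    candidates = [j for j in range(NP) if j != i]
    r1, r2, r3 = rng.choice(candidates, size=3, replace=False)
    return pop[r1] + F * (pop[r2] - pop[r3])

rng = np.random.default_rng(1)
pop = rng.random((6, 3))
mutant = de_rand_1(pop, i=0, F=0.5, rng=rng)
```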
After the mutation, a crossover operation is applied to each pair of the target vector and its corresponding mutant vector to generate a trial vector.
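A minimal sketch of the commonly used binomial crossover, assuming the usual jrand rule that forces at least one component to come from the mutant (names are illustrative):

```python
import numpy as np

def binomial_crossover(target, mutant, CR, rng):
    """Mix target and mutant component-wise with crossover rate CR;
    index jrand guarantees at least one component comes from the mutant."""
    D = target.size
    jrand = rng.integers(D)
    mask = rng.random(D) < CR
    mask[jrand] = True
    return np.where(mask, mutant, target)

rng = np.random.default_rng(2)
trial = binomial_crossover(np.zeros(4), np.ones(4), CR=0.9, rng=rng)
```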
After crossover, the generated trial vectors are evaluated using the objective function and a selection operation is performed as shown in
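The greedy selection step can be sketched as follows for a minimization problem (the helper names and the sphere objective are illustrative assumptions):

```python
import numpy as np

def select(target, trial, f):
    """Greedy selection: keep the trial vector if it is no worse (minimization)."""
    return trial if f(trial) <= f(target) else target

sphere = lambda x: float(np.sum(x ** 2))
# The trial vector has the lower objective value here, so it survives.
winner = select(np.array([2.0, 2.0]), np.array([1.0, 1.0]), sphere)
```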
As mentioned above, the mutation, crossover, and selection steps are repeated generation after generation until a termination criterion (reaching the preset maximum number of function evaluations) is satisfied. The algorithmic description of DE is summarized in Algorithm
[Algorithm 1: initialize the population of parameter vectors, set the generation number, and repeat the mutation, crossover, and selection steps until the termination criterion is met; the full pseudocode could not be recovered from the source.]
Although DE has attracted much attention recently as a global optimizer over continuous spaces [
To some extent, such guidelines are useful for selecting the individual parameters of DE. However, the performance of DE is more sensitive to the combination of the mutation strategy and its associated parameters. Consequently, several adaptive DE variants have been proposed that adapt this combination, including SaDE, jDE, JADE, EPSDE, and MDE_pBX.
Unlike most EAs, which simulate natural selection and biological evolution, HS is a population-based metaheuristic optimization algorithm which mimics the music improvisation process where musicians improvise their instruments’ pitch by searching for a perfect state of harmony. Some of the characteristics of HS that distinguish it from other metaheuristics such as DE are as follows [
In HS, the improvisation operators (memory consideration, pitch adjustment, and random consideration) play a major role in achieving the desired balance between exploitation and exploration during the optimization process [
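A minimal sketch of one HS improvisation step using these three operators (the parameter values HMCR, PAR, and bw, and all names, are illustrative assumptions):

```python
import numpy as np

def improvise(HM, HMCR, PAR, bw, lb, ub, rng):
    """Create one new harmony: memory consideration with probability HMCR,
    pitch adjustment with probability PAR, otherwise random consideration."""
    HMS, D = HM.shape
    new = np.empty(D)
    for j in range(D):
        if rng.random() < HMCR:                      # memory consideration
            new[j] = HM[rng.integers(HMS), j]
            if rng.random() < PAR:                   # pitch adjustment
                new[j] += bw * (2.0 * rng.random() - 1.0)
        else:                                        # random consideration
            new[j] = lb[j] + rng.random() * (ub[j] - lb[j])
    return np.clip(new, lb, ub)

rng = np.random.default_rng(3)
lb, ub = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
HM = lb + rng.random((5, 2)) * (ub - lb)
harmony = improvise(HM, HMCR=0.9, PAR=0.3, bw=0.05, lb=lb, ub=ub, rng=rng)
```

In a full HS loop, the new harmony replaces the worst member of the harmony memory whenever it achieves a better objective value.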
Recently, the HS algorithm has garnered a lot of attention from the research community and has been successfully applied to many optimization problems in engineering and computer science. Consequently, the interest in HS has led to improvements of its performance in line with the requirements of the problems being solved. The improvements proposed by different researchers can be categorized as follows [
As highlighted in the previous section, depending on the nature of the problem (unimodal or multimodal) and the available computational resources, different optimization problems require different mutation and crossover strategies combined with different parameter values to obtain optimal performance. In addition, to solve a specific problem, different mutation and crossover strategies with different parameter settings may be better during different stages of the evolution than a single set of strategies with unique parameter settings, as in conventional DE. Motivated by these observations, the authors in [
In EPSDE, each member of the DE population is randomly assigned a mutation strategy, a crossover strategy, and the associated parameter values taken from the respective pools. The population members (target vectors) produce offspring (trial vectors) using the assigned strategies and parameter values. If the generated trial vector is able to enter the next generation of the evolution, the combination of strategies and parameter values that produced the trial vector is stored. If the trial vector fails to enter the next generation, the strategies and parameter values associated with that target vector are, with equal probability, randomly reinitialized from the respective pools or drawn from the stored successful combinations.
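The success-based reassignment rule described above can be sketched as follows (a simplified illustration; the pool contents and names are assumptions, not the actual EPSDE pools):

```python
import random

def reassign(success_memory, strategy_pool, F_pool, CR_pool):
    """On failure, draw a new (strategy, F, CR) combination either from the
    previously successful combinations or at random from the pools,
    with equal probability (a sketch of the EPSDE reassignment rule)."""
    if success_memory and random.random() < 0.5:
        return random.choice(success_memory)
    return (random.choice(strategy_pool),
            random.choice(F_pool),
            random.choice(CR_pool))

random.seed(4)
combo = reassign(success_memory=[("rand/1", 0.5, 0.9)],
                 strategy_pool=["rand/1", "best/2"],
                 F_pool=[0.4, 0.5, 0.9],
                 CR_pool=[0.1, 0.5, 0.9])
```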
To have an optimal performance based on the ensemble approach, the candidate pool of strategies and parameters should be restrictive to avoid the unfavorable influences of less effective strategies and parameters [
In EPSDE, since the strategy and parameter pools are restrictive, most of the individuals in the pools may become obsolete during the course of the evolution of DE population. Therefore, it would be apt if the strategy and the parameter pools can evolve with the DE population. Based on this motivation, we propose an HS based parameter ensemble adaptation for DE (HSPEADE). The overall view of the proposed HSPEADE is presented in Algorithm
As shown in Algorithm
To obtain optimal performance based on the ensemble approach, it is obvious that the parameter combinations in the HM should be diverse during the initial generations of the evolution of the DE population and should converge to the optimal combination towards the end of the evolution. During the course of experimentation, we observed that HS is well suited to evolve the parameter ensemble because of the following characteristics: (1) HS generates a single vector every generation and replaces the worst performing vector; (2) it can randomly generate new solution vectors, thus enabling diversity when needed; and (3) it considers all the solution vectors in the memory when generating a new solution.
In HSPEADE, to facilitate diversity in the parameter ensemble in the initial stages and to allow the HS to converge to the optimal combination, we made some modifications to the HS algorithm. In the HS algorithm shown in Algorithm
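The interplay between the DE population and the HS-maintained parameter ensemble can be sketched as follows. This is an illustrative reading of HSPEADE, not the authors' exact algorithm: it assumes DE/rand/1 with binomial crossover, scores a (F, CR) combination by the number of successful trials it produces, and uses a deliberately minimal HS-style update that replaces the worst-scoring combination each generation.

```python
import numpy as np

def hspeade_sketch(f, lb, ub, NP=20, HMS=5, gens=50, seed=5):
    """Minimal sketch: the harmony memory HM holds (F, CR) combinations
    that evolve alongside the DE population (simplifying assumptions above)."""
    rng = np.random.default_rng(seed)
    D = lb.size
    pop = lb + rng.random((NP, D)) * (ub - lb)
    fit = np.array([f(x) for x in pop])
    HM = np.column_stack([rng.uniform(0.1, 1.0, HMS),     # F values
                          rng.uniform(0.0, 1.0, HMS)])    # CR values
    hm_score = np.zeros(HMS)
    for _ in range(gens):
        for i in range(NP):
            k = rng.integers(HMS)            # assign a combination from HM
            F, CR = HM[k]
            r = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
            v = pop[r[0]] + F * (pop[r[1]] - pop[r[2]])   # DE/rand/1
            jrand = rng.integers(D)
            mask = rng.random(D) < CR
            mask[jrand] = True
            u = np.clip(np.where(mask, v, pop[i]), lb, ub)
            fu = f(u)
            if fu <= fit[i]:                 # greedy selection
                pop[i], fit[i] = u, fu
                hm_score[k] += 1.0           # reward the combination
        # HS-style step: perturb a remembered combination and replace
        # the worst-scoring one, so the ensemble tracks the evolution.
        new = HM[rng.integers(HMS)] + 0.05 * rng.standard_normal(2)
        HM[int(np.argmin(hm_score))] = np.clip(new, [0.1, 0.0], [1.0, 1.0])
        hm_score[int(np.argmin(hm_score))] = 0.0
    return pop[np.argmin(fit)], float(fit.min())

best, best_f = hspeade_sketch(lambda x: float(np.sum(x ** 2)),
                              np.array([-5.0, -5.0]), np.array([5.0, 5.0]))
```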
In this section, we evaluate the performance of the proposed parameter adaptation technique for DE. The details regarding the test problems, experimental environment, and algorithms used for comparison are given below.
The performance of the proposed method is evaluated using a selected set of standard test functions from the special session on real-parameter optimization of the IEEE Congress on Evolutionary Computation (CEC 2005) [
The proposed HSPEADE is a general idea and can be applied within any DE framework. In this work, the experiments are designed as follows. We consider a single crossover strategy, namely binomial crossover, because it is employed by the two recent adaptive DE algorithms used for comparison (JADE and MDE_pBX). We consider two mutation strategies, “DE/current-to-pbest” and “DE/current-to-gr_best”.
The algorithmic comparison is divided into two sets. SET 1 uses the “DE/current-to-pbest” strategy, while SET 2 uses the “DE/current-to-gr_best” strategy. The algorithms compared in SET 1 are as follows.
JADE: “DE/current-to-pbest” strategy with binomial crossover.
EPDE1: “DE/current-to-pbest” strategy with binomial crossover and a fixed ensemble of parameters.
HSPEADE1: “DE/current-to-pbest” strategy with binomial crossover and the proposed HS-evolved parameter ensemble.
The algorithms compared in SET 2 are as follows.
MDE: “DE/current-to-gr_best” strategy with binomial crossover.
EPDE2: “DE/current-to-gr_best” strategy with binomial crossover and a fixed ensemble of parameters.
HSPEADE2: “DE/current-to-gr_best” strategy with binomial crossover and the proposed HS-evolved parameter ensemble.
To compare the performance of the different algorithms, we employ two types of statistical tests: a pairwise significance test on each function and the Wilcoxon rank sum test over all functions.
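As an illustration of the second type of comparison, the Wilcoxon rank sum test can be applied to the final results of two algorithms over repeated runs. This is a SciPy-based sketch; the error values below are hypothetical numbers for illustration only, not results from the paper.

```python
from scipy.stats import ranksums

# Hypothetical final error values of two algorithms over eight runs each.
errors_a = [0.12, 0.10, 0.15, 0.11, 0.13, 0.14, 0.12, 0.10]
errors_b = [0.30, 0.28, 0.35, 0.31, 0.29, 0.33, 0.30, 0.32]

# Null hypothesis: the two samples come from the same distribution.
stat, p = ranksums(errors_a, errors_b)
significant = p < 0.05  # with these samples, algorithm A is significantly better
```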
The experimental results (means and standard deviations) corresponding to the SET 1 algorithms (JADE, EPDE1, and HSPEADE1) on 30-, 50-, and 100-dimensional problems are presented in the tables below.
Performance of JADE, EPDE1, and HSPEADE1 on 30D problems.

[Table: mean and standard deviation values for each algorithm on test functions 1–14. Apart from function 1, on which all three algorithms attained a mean and standard deviation of 0, the numerical entries could not be recovered from the source.]
Performance of JADE, EPDE1, and HSPEADE1 on 50D problems.

[Table: mean and standard deviation values for each algorithm on test functions 1–14; the numerical entries could not be recovered from the source.]
Performance of JADE, EPDE1, and HSPEADE1 on 100D problems.

[Table: mean and standard deviation values for each algorithm on test functions 1–14; the numerical entries could not be recovered from the source.]
Performance of MDE, EPDE2, and HSPEADE2 on 30D problems.

[Table: mean and standard deviation values for each algorithm on test functions 1–14; the numerical entries could not be recovered from the source.]
Performance of MDE, EPDE2, and HSPEADE2 on 50D problems.

[Table: mean and standard deviation values for each algorithm on test functions 1–14; the numerical entries could not be recovered from the source.]
Performance of MDE, EPDE2, and HSPEADE2 on 100D problems.

[Table: mean and standard deviation values for each algorithm on test functions 1–14; the numerical entries could not be recovered from the source.]
The statistical test results are summarized in the following tables.
Statistical test results comparing the performance of JADE, EPDE1, and HSPEADE1 (+1: the second algorithm of the pair performs significantly better; −1: significantly worse; 0: no significant difference).

Function | JADE vs. EPDE1, 30D | 50D | 100D | JADE vs. HSPEADE1, 30D | 50D | 100D | EPDE1 vs. HSPEADE1, 30D | 50D | 100D
---|---|---|---|---|---|---|---|---|---
1 | 0 | +1 | +1 | 0 | +1 | +1 | 0 | 0 | 0 |
2 | −1 | +1 | +1 | 0 | +1 | +1 | +1 | +1 | +1 |
3 | −1 | −1 | +1 | −1 | −1 | +1 | +1 | +1 | +1 |
4 | +1 | 0 | 0 | +1 | +1 | +1 | 0 | +1 | +1 |
5 | +1 | 0 | +1 | +1 | +1 | +1 | 0 | +1 | +1 |
6 | +1 | −1 | +1 | +1 | +1 | +1 | +1 | +1 | +1 |
7 | 0 | 0 | 0 | +1 | +1 | +1 | +1 | +1 | 0 |
8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
9 | −1 | −1 | −1 | 0 | 0 | 0 | +1 | +1 | +1 |
10 | −1 | −1 | −1 | 0 | +1 | +1 | +1 | +1 | +1 |
11 | −1 | −1 | −1 | 0 | +1 | 0 | +1 | +1 | +1 |
12 | −1 | +1 | +1 | 0 | +1 | +1 | +1 | 0 | +1 |
13 | −1 | +1 | −1 | +1 | +1 | +1 | +1 | +1 | +1 |
14 | −1 | 0 | −1 | +1 | +1 | 0 | +1 | +1 | 0 |
Wilcoxon test | −1 | −1 | +1 | +1 | +1 | +1 | +1 | +1 | +1 |
Statistical test results comparing the performance of MDE, EPDE2, and HSPEADE2 (+1: the second algorithm of the pair performs significantly better; −1: significantly worse; 0: no significant difference).

Function | MDE vs. EPDE2, 30D | 50D | 100D | MDE vs. HSPEADE2, 30D | 50D | 100D | EPDE2 vs. HSPEADE2, 30D | 50D | 100D
---|---|---|---|---|---|---|---|---|---
1 | 0 | 0 | −1 | 0 | 0 | 0 | 0 | 0 | +1 |
2 | 0 | −1 | −1 | +1 | +1 | +1 | +1 | +1 | +1 |
3 | −1 | +1 | +1 | 0 | 0 | +1 | +1 | −1 | 0 |
4 | +1 | 0 | 0 | 0 | +1 | +1 | 0 | +1 | +1 |
5 | +1 | +1 | +1 | +1 | +1 | +1 | +1 | +1 | +1 |
6 | +1 | +1 | +1 | +1 | +1 | +1 | +1 | +1 | +1 |
7 | +1 | 0 | −1 | +1 | 0 | 0 | +1 | 0 | 0 |
8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
9 | −1 | −1 | −1 | 0 | 0 | +1 | +1 | +1 | +1 |
10 | −1 | −1 | −1 | 0 | 0 | +1 | +1 | +1 | +1 |
11 | −1 | −1 | −1 | 0 | 0 | +1 | +1 | +1 | +1 |
12 | −1 | 0 | 0 | 0 | 0 | 0 | +1 | 0 | 0 |
13 | −1 | −1 | −1 | 0 | 0 | +1 | +1 | +1 | +1 |
14 | −1 | −1 | −1 | 0 | 0 | −1 | +1 | +1 | +1 |
Wilcoxon test | 0 | 0 | 0 | +1 | +1 | +1 | +1 | +1 | +1 |
From the experimental results, it can be observed that JADE performs better than EPDE1 on the 30-dimensional versions of the problems. However, as the dimensionality of the test problems increases, the performance of EPDE1 becomes better than that of JADE. The improved performance of EPDE1 can be attributed to the ensemble approach, as different combinations of strategies and parameters can be effective during different stages of the evolution process [
From Tables
From the results, it is clear that the performance of HSPEADE1 is always better than or equal to that of EPDE1. This confirms our assumption that an evolving parameter ensemble is better than a fixed parameter ensemble.
From the experimental results in Tables
From the Wilcoxon rank sum test results (bottom row of Tables
Performance comparison of JADE and HSPEADE1.
Performance comparison of EPDE1 and HSPEADE1.
Performance comparison of MDE and HSPEADE2.
Performance comparison of EPDE2 and HSPEADE2.
To improve the performance of DE, different adaptation techniques have been proposed. In this paper, we propose a new parameter adaptation technique for DE based on an ensemble approach and the HS algorithm, referred to as HSPEADE. In HSPEADE, an ensemble of parameters is randomly sampled and forms the initial harmony memory. The parameter ensemble evolves during the course of the optimization process through the HS algorithm. Each parameter combination in the harmony memory is evaluated by testing it on the DE population. During the initial stages of the evolution, the DE parameter combinations in the harmony memory of HS are diverse and facilitate exploration for better parameter combinations. Towards the end of the evolution process, fine-tuning of the parameter combinations occurs, which facilitates exploitation.
The performance of HSPEADE is evaluated using two recently proposed DE strategies (DE/current-to-pbest and DE/current-to-gr_best).
In the present work, we only consider the evolution of the parameter ensemble using the HS framework. As a future work, we would like to incorporate the ensemble of mutation and crossover strategies into the HS framework.
This research was supported by Kyungpook National University Research Fund, 2012.