In evolutionary algorithms, genetic operators iteratively generate new offspring which constitute a potentially valuable set of search history. To boost the performance of offspring generation in the real-coded genetic algorithm (RCGA), in this paper, we propose to exploit the search history cached so far in an online style during the iteration. Specifically, survivor individuals over the past few generations are collected and stored in the archive to form the search history. We introduce a simple yet effective crossover model driven by the search history (abbreviated as SHX). In particular, the search history is clustered, and each cluster is assigned a score for SHX. In essence, the proposed SHX is a data-driven method which exploits the search history to perform offspring selection after the offspring generation. Since no additional fitness evaluations are needed, SHX is favorable for the tasks with limited budget or expensive fitness evaluations. We experimentally verify the effectiveness of SHX over 15 benchmark functions. Quantitative results show that our SHX can significantly enhance the performance of RCGA, in terms of both accuracy and convergence speed. Also, the induced additional runtime is negligible compared to the total processing time.
Japan Society for the Promotion of ScienceJP18K178231. Introduction
Evolutionary algorithms (EAs) have been shown to be generic and effective to search for global optima in the complex search space theoretically [1–3] and practically [4–6]. The exploration process of EAs imitates the natural selection process, which is realized by conducting the offspring generation and survivor individual selection alternately and iteratively. The population quality is gradually improved throughout the exploration process, which can be viewed as a stochastic population-based generation-and-test process. Because of the offspring generation, a large number of candidate solutions (i.e., individuals) are sampled, accompanied by corresponding fitness values, genetic information, and genealogy information. Such accumulated search data constitute search history which can be very informative and valuable for boosting the overall performance. For instance, exploiting search history can be useful for improving the search procedure under a limited budget of fitness evaluations (FEs). That is, no additional FEs are allowed for improving the search performance. Also, the computational cost of a single FE can be high when the fitness functions are complicated. To enable a better solution for the population without increasing the number of FEs, the way of exploiting the search history truly matters. Nevertheless, search history has been sparsely exploited and studied in existing methods.
Real-coded genetic algorithm (RCGA) has been widely studied in the past decades [7–11], and the main efforts for improving the performance of RCGA have been focused on the development of the crossover techniques [12]. Because the crossover operator is to generate new offspring from the current population, the quality of the new solutions straightforwardly affects the evolution direction and convergence speed. Given different mechanisms, crossover methods can differ from (1) parent selection, (2) offspring generation, and (3) offspring selection. Both parent and offspring can be more than two, depending on the design. The abovementioned three aspects associate the exploration ability with exploitation ability, and the degree and balance between both abilities affect the performance largely [13]. Although the self-adaptive feature of RCGA [14] can adjust the relationship to a certain extent, the “best” degrees and balance between exploration and exploitation for achieving a satisfactory solution can differ greatly with respect to different problem settings and can be hardly achieved with the adaptive feature.
With a large amount of search history data up to the current generation in hand, we attempt to introduce a crossover method that effectively exploits the history data in this paper. At first, an archive is defined to collect the survivor individuals over generations as the search history. Then, the stored individuals are clustered by k-means [15], and each cluster is assigned a score depending on the number of belonging individuals. At last, offspring is generated and selected according to the scores. We introduce two different schemes to update the archive. The proposed crossover operator, named search history-driven crossover (SHX), generates offspring by considering the cluster scores. Since SHX enables an offspring selection mechanism, any existing parent selection and offspring generation mechanisms can be easily integrated with it. To our knowledge, this is the first work to design the crossover model by effectively exploiting search history. We present a set of experiments to systemically evaluate the effectiveness of the proposed method using 15 benchmark functions. Three conventional crossover operators are employed, and the results with/without SHX are compared. Apart from the above, two archive update methods are also analyzed.
The main technical contributions of this paper are threefold. First, we propose a novel crossover model by effectively exploiting the search history. Second, we introduce the offspring selection based on the clusters calculated from the search history. Third, we introduce two schemes to update the survivor archive. A preliminary version of this paper appears in GECCO2020 [16].
2. Related Work
Crossover is one of the principal operators for generating offspring and deeply relates to the performance of the real-coded genetic algorithm (RCGA). Blend-α crossover (BLX-α) [17] proposed by Eshelman and Schaffer is one of the most popular operators. Offspring genes are independently and uniformly sampled within an interval between a gene pair of parents. The parameter α corresponds to the extension of the sampling interval, which plays a key role in maintaining the diversity of offspring. Eshelman et al. proposed Blend-α-β crossover (BLX-α-β) [18] which involves two extension parameters. Deb and Agrawal introduced simulated binary crossover (SBX) [19] which simulates the single-point crossover in binary-coded GA for continuous search space. The interval used in SBX is determined by a polynomial probability distribution β depending on the distribution index η. η indirectly adjusts the tendency of offspring generation. The above crossover operators have a common feature that the offspring genes are extracted according to a certain probability distribution from the predefined interval on the parent genes. This feature enables better results than using crossover operators for binary coding in the continuous search space. On the other hand, some crossover operators set more than two individuals as parents, which aim to generate offspring with well-preserved population statistics. In the case of unimodal normal distribution crossover (UNDX) [20], the generation of offspring follows a unimodal normal distribution defined on the line connecting two of the three parents. For simplex crossover (SPX) [21], D+1 individuals are taken as parents in the D-dimensional search space. SPX uniformly generates offspring within D-dimensional simplex constructed by parent individuals and expanded by a parameter ε.
Search history has also been exploited in some research, but to the best of our knowledge, none of them is for the purpose of improving the crossover model. Since online real systems often provide uncertain evaluation values which lead to unreliable convergence of GA, Sano and Kita proposed memory-based fitness estimation GA (MFEGA) [22]. MFEGA estimates the fitness from neighboring individuals stored in the search history. Leveraging search history allows estimation without requiring additional evaluation. Amor and Rettinger proposed GA using self-organizing maps (GASOM) [23]. SOM (self-organizing maps) can provide a visualized search history, which makes the regions explored intuitive for users. Moreover, individual novelty is introduced by the activation frequency in the search history table and utilized by the reseeding operator to preserve the exploration power. Yuen and Chow presented the continuous nonrevisiting GA (cNrGA) [24]. A binary partitioning tree called a density tree stores all evaluated individuals and divides the search space into nonoverlapped partitions by means of distributions. These subregions are used to check whether a new individual needs to be evaluated or not.
3. Overview
Principles of designing good crossover operators for RCGA are discussed in [25]. Two among them are especially important: (1) the crossover operator should preserve the statistics of the population; (2) the crossover operator should generate offspring with as much diversity as possible under the constraint of (1). By following these suggestions, the key idea of SHX is to cluster the search history and select population members from excessively generated candidate solutions by preserving the statistics represented by the clusters. Figure 1 illustrates the overview of our SHX. The proposed method is performed under the framework of RCGA which mainly involves survivor selection and crossover. Mutation is optional, but we exclude it to clearly investigate the effectiveness of SHX in this work.
Overview of the proposed method. The proposed method is performed with an archive A under the framework of RCGA. A preserves survivors Psur over the past few generations and extracts statistics from them by clustering. Offspring Poff are selected from excessively generated candidate solutions Pcan based on the statistics.
The proposed method is described in Algorithm 1. Population is denoted by P which comprises nP individuals, and the population at the t-th generation is denoted as Pt. Similarly, parents for SHX, excessively generated candidate solutions during SHX, offspring after SHX, and survivors for the next generation are represented by Ppar, Pcan, Poff, and Psur, respectively. The size of each set is denoted using n with a subscript of the set name (e.g., the size of parents is denoted by nPpar). In addition to P, our method manages an archive A which preserves nA survivors throughout the generation alternation. A and P are initialized by randomly placing individuals in the search space. The archive update process is conducted after the survivor selection. Survivor individuals Psur of the current generation are aggregated into both P and A of the next generation. SHX can be further divided into parent selection, offspring generation, and offspring selection. Different from conventional RCGA, individuals generated from Ppar are regarded as offspring candidates Pcan. The main purpose of SHX is to narrow down Pcan to nPoff individuals denoted by Poff according to the statistics provided by S. S is calculated from the clustering result of the archive and immediately impacts the offspring selection.
Algorithm 1: Search history-driven crossover for RCGA.
SHX can adopt any existing crossover operators (e.g., BLX-α [17] and SPX [21]) for the offspringGeneration function (Algorithm 1, line 8) to generate Pcan from Ppar. For the parentSelection function (Algorithm 1, line 6) and the survivorSelection function (Algorithm 1, line 11), the just generation gap (JGG) [26, 27] is employed in this work. That is, the parentSelection function randomly extracts nPpar individuals from P as Ppar, and the survivorSelection function selects top-nPsur individuals in Poff as Psur according to the fitness value. To show the performance increase brought by SHX, we choose the widely applied BLX-α, SPX, and UNDX for the offspring generation and compare the results in Section 6. We explain archiveUpdate (Algorithm 1, lines 3 and 13) and offspringSelection (Algorithm 1, line 9) in detail in Section 4 and Section 5, respectively.
4. Survivor Archive
Since the genetic operations are run alternately and iteratively, collecting and analyzing the history data may be beneficial for boosting performance. Given that SHX is to maintain the historical statistics S while producing offspring for the next generation, the archive A is designed to store Psur over few past generations and extracts statistics S. The calculation of S is based on the k-means, which is an off-the-shelf nonsupervised clustering method. The pseudocode of k-means is shown in Algorithm 2. In particular, k-means is employed to cluster the individuals in A based on their position in the search space, and S is a normalized frequency histogram to show the proportion of each cluster size to nA. A higher score indicates that the corresponding cluster is more likely to be a promising search region. The statistics can then be maintained by probabilistically assigning newly generated candidates to each cluster according to S.
Algorithm 2:k-means.
Input: number of clusters k,
Data points p1,…,pn
Output: cluster centroids c1,…,ck
k cluster centroids are randomly initialized
While termination criterion is not satisfied do
For i=1,…,n do
assign the nearest cluster centroid ID to pi
end
For i=1,…,k do
update ci by calculating the mean of data points in the i-th cluster
end
end
Return c1,…,ck
To keep the computational cost brought by k-means within an acceptable and constant range, the archive size is fixed to nA. That is, a part of individuals in A must be replaced with new survivors Psur during the archive update to incorporate new information. Two types of update methods are considered in this work: (1) randomly selecting individuals in A and replacing them with Psur (denoted by random); (2) replacing a part of A with Psur in the order in which the individuals of A arrived (denoted by sequential). The performance comparison between these two approaches is discussed in Section 6.
The update of A and calculation of S are executed in the function archiveUpdate (Algorithm 1, lines 3 and 13) which is summarized in Algorithm 3. At the replacement step (Algorithm 3, line 4), nPsur individuals are discarded from A based on random or sequential approaches, and new Psur are stored to A. Initialization is executed when t equals 0. The k-meansFit function (Algorithm 3, line 7) updates the centroids of the clusters according to the updated A and assigns updated cluster labels to each individual in A. After that, the normalized frequency histogram S for each cluster is calculated by the hist function (Algorithm 3, line 9) for further usage in offspring selection (Algorithm 4). Note that the initial centroids of the clusters in the current generation are inherited from the previous generation, as most individuals in At are the same as At−1.
Algorithm 3: Archive update.
Function: At,St= archive Update At−1,Psurt,nA.
Input: archive At−1.
Survivors Psurt.
Size of the archive nA.
Output: updated archive At,
Score St.
If t==0 then.
//initialization.
nA individuals in At are randomly initialized.
Else.
//archive update.
randomly or sequentially (first in first out) select nPsur individuals.
from At−1 to form Pdis.
At=At−1/Pdis∪Psurt.
end.
//score update.
labels=k-meansFit At.
St=histlabels; Calculating frequency histogram.
Return At,St.
Algorithm 4: Offspring selection.
Function: Pofft= offspringSelection Pcant,St−1.
Input: candidates Pcant,
Score St−1
Output: offspring Pofft
//labeling based on clustering results estimated in Algorithm 3, line 7
clusters=k-meansPredict Pcant;
//roulette construction
For i=1,…,nS do
If clustersi==∅ then
roulettei=0
Else
roulettei=Sti
end
end
//offspring selection
Repeat
select one cluster ID i by the roulette selection based on roulette
randomly select one candidate p, p∈clustersi
Pofft=Pofft∪p
//exclude selected candidate from clusters
clustersi=clustersi/p
If clustersi==∅ then.
roulettei=0:
end.
nPoff times are run.
Return Pofft.
5. Search History-Driven Crossover (SHX)
SHX randomly selects parents by following the strategy of existing crossover operators (e.g., two parents in the case of BLX and D+1 parents in the case of SPX) and excessively generates candidate offspring Pcan for further offspring selection. nPcan≫nPoff because Pcan must ensure a sufficient number of individuals that can be assigned to each cluster in A. Here, generating individuals excessively can also be considered as a mechanism of diversity preservation. It is worth pointing out that the offspring selection is a different procedure from the survivor selection. Offspring selection belongs to the crossover model and is conducted before fitness evaluation. Survivor selection is conducted after fitness evaluation. Offspring selection narrows down Pcan to Poff based on roulette wheel selection [28]. Each proportion of the wheel relates to each possible selection (i.e., clusters), and S is used to associate a probability of selection with each cluster in A. This can also be viewed as a procedure that SHX preferentially selects individuals in more “promising” regions. This bias selection can encourage the evolution of the population and accelerate the whole convergence. Besides, the statistics of the population (e.g., cluster size) can be maintained between two consecutive generations because the new generation is sampled based on the statistics of the history. Also, the diversity of Poff can be preserved because each newly generated individual from Pcan has a probability to be assigned toA.
The algorithm of offspring selection is shown in Algorithm 4. Input Pcan is excessively generated by existing crossover operators (Algorithm 1, line 8). Each candidate is labeled by the k-means Predict function (Algorithm 4, line 1) based on the current clusters estimated from A. Then, the roulette is constructed based on S. The roulette selection is called nPoff times, yielding nPoff selected offspring. Each time of roulette selection produces a cluster ID, and one candidate in Pcan that belongs to the corresponding cluster is randomly selected and assigned to Poff. To avoid duplicate selection, a selected candidate will be excluded from Pcan. If no more candidates correspond to a certain cluster (this is rarely the case by assuming nPcan≫nPoff), the roulette is reconstructed by eliminating the proportion of the corresponding cluster. Finally, Poff is passed to the survivor selection process which determines Psur using JGG.
6. Experimental Results
The performance of SHX is investigated over 15 benchmark functions, with each function in two different dimension settings. We comprehensively compare the performance of RCGA with/without SHX, and SHX is run with different settings of archive update methods (random/sequential) and offspring generation methods (BLX [17]/SPX [21]/UNDX [20]).
6.1. Experimental Setup
Benchmark functions are a useful tool to verify the effectiveness of a method, and it is general to use several functions with different properties, such as in [29, 30]. We selected 15 benchmark functions with different characteristics from the literature [31–33] for evaluation. Detailed information of each function is summarized in Table 1. Initialization of the population and the archive is conducted within the range provided by the 4th column in Table 1. It is worth mentioning that the searching space (i.e., range of parameters) during the generation alternation is not constrained. Each function is labeled according to different combination of characteristics (U + S, U + NS, M + S, and M + NS). By involving various characteristics of functions, we can analyze the proposed method more comprehensively and objectively. Furthermore, as all selected functions are adjustable in the setting of dimension, we adopt two different numbers of dimensions (D=5 and D=10) to control the difficulty degree of the search problem.
Benchmark functions f1∼f15 used in the experiments.
The last column (Label) represents the characteristics that the functions hold: unimodal (U), multimodal (M), separable (S), and nonseparable (NS).
The setting of hyperparameters of the proposed method is listed in Table 2. The proposed method includes hyperparameters of not only RCGA (number of generations, nP, and nPoff) but also SHX (nPcan, nA, and k). Basically, the search problem defined by each function becomes more hard as the number of dimensions increases, which requires a lot of evaluations. For adaptive adjustment, the number of generations, nP, and nPoff are set proportional to the number of dimensions. The constant values of each parameter are empirically determined because the purpose of the experiments is to validate the effectiveness of having SHX, rather than achieving the best solution for each function.
Hyperparameters of RCGA and SHX (nPoff, nPcan, nA,andk).
Parameter
Value
Number of generations
10D
Population size, nP
10D
Number of offspring, nPoff
6D
Number of candidates, nPcan
3nPoff
Archive size, nA
30nPsur
Number of clusters, k
⌈nA/2⌉
D is the number of dimensions of test functions. All the parameters are fixed throughout the experiments.
All experiments are executed 100 times with different random seeds. In each experiment, the generation alternation completely executed the number of generation times defined in Table 2. For a fair comparison, iterations under the same random seed start using the same population. The runtime and fitness are recorded with Python implementation (without either parallelization or optimization) on a i7-7700 CPU at 3.60 GHz, 12.0 GB RAM desktop computer.
6.2. Comparison in the Final-Generation-Elite
The results of the absolute error between the optimal value and the final-generation-elite fitness with respect to all combinations of functions, dimension, and methods are displayed in Table 3. Table 3 shows the minimum, maximum, median, mean, standard deviation (SD), and p value of the Mann–Whitney U test by each combination. The Mann–Whitney U test evaluates the significance of SHX results against results without SHX under the significance level p=0.05. Before showing the superiority by involving SHX, we first exclude a few results that all the methods are trapped by local optima or cannot reach the global optima. (1) Easom Functionf8. This function has several local minima. It is unimodal, and the global minimum only has a small area corresponding to the search space, which can be hardly arrived at. (2) Schwefel 2.26f10. Since the setup of this experiment does not restrict the range of parameters during search, an extremely small fitness value (even smaller than the global optimum) can be achieved with this function, which is not suitable for comparisons.
The results of the absolute error between the optimal value and the final-generation-elite fitness over 100 runs.
f
D
Minimum
MaximumMean
Minimum
Maximum
Minimum
Maximum
Median
Median
Mean
Median
Mean
SD
SD
p value
SD
p value
BLX
SH-BLX_random
SH-BLX_sequential
f1
5
3.37E + 00
1.58E + 01
2.07E + 00
1.51E + 01
2.45E + 00
1.36E + 01
8.82E + 00
8.92E + 00
7.27E + 00
7.26E + 00
7.13E + 00
7.35E + 00
±2.87E + 00
±2.40E + 00
2.76E − 05
±2.29E + 00
6.54E − 05
10
1.03E + 01
2.58E + 01
8.60E + 00
2.28E + 01
9.20E + 00
2.68E + 01
1.80E + 01
1.84E + 01
1.72E + 01
1.68E + 01
1.66E + 01
1.69E + 01
±3.20E + 00
±2.90E + 00
1.11E − 03
±3.36E + 00
8.49E − 04
f2
5
1.07E − 01
3.40E + 00
1.19E − 02
1.69E + 00
7.60E − 02
2.14E + 00
7.75E − 01
9.28E − 01
5.67E − 01
5.57E − 01
4.47E − 01
5.63E − 01
±6.39E − 01
±3.78E − 01
3.07E − 06
±4.11E − 01
1.67E − 06
10
1.02E + 00
1.59E + 01
9.51E − 01
9.47E + 00
1.29E + 00
8.85E + 00
5.02E + 00
5.33E + 00
4.14E + 00
4.24E + 00
3.79E + 00
4.09E + 00
±2.18E + 00
±1.68E + 00
8.78E − 05
±1.53E + 00
3.00E − 06
f3
5
1.61E − 01
9.02E + 00
1.51E − 01
9.28E + 00
1.58E − 01
5.74E + 00
2.08E + 00
2.62E + 00
1.24E + 00
1.60E + 00
1.35E + 00
1.69E + 00
±1.66E + 00
±1.33E + 00
6.50E − 08
±1.26E + 00
1.17E − 06
10
4.57E + 00
5.66E + 01
2.41E + 00
4.86E + 01
4.04E + 00
4.56E + 01
2.47E + 01
2.60E + 01
1.84E + 01
2.11E + 01
2.02E + 01
2.06E + 01
±1.02E + 01
±9.57E + 00
1.33E − 04
±8.79E + 00
1.90E − 04
f4
5
6.96E + 01
1.94E + 04
1.21E + 02
8.83E + 03
7.78E + 01
8.08E + 03
1.94E + 03
2.80E + 03
8.98E + 02
1.74E + 03
8.68E + 02
1.40E + 03
±2.80E + 03
±2.00E + 03
7.07E − 05
±1.45E + 03
9.29E − 07
10
6.89E + 03
3.50E + 05
4.44E + 03
2.12E + 05
4.89E + 03
1.56E + 05
6.11E + 04
7.19E + 04
4.69E + 04
5.35E + 04
3.38E + 04
4.21E + 04
±5.19E + 04
±3.91E + 04
1.73E − 03
±3.01E + 04
2.16E − 07
f5
5
2.45E + 01
5.58E + 02
1.81E + 01
6.65E + 02
7.61E + 00
4.54E + 02
1.78E + 02
1.93E + 02
1.30E + 02
1.63E + 02
1.29E + 02
1.45E + 02
±1.15E + 02
±1.18E + 02
9.22E − 03
±8.49E + 01
1.15E − 03
10
4.00E + 02
2.80E + 03
3.94E + 02
2.29E + 03
2.98E + 02
2.32E + 03
1.40E + 03
1.37E + 03
1.12E + 03
1.18E + 03
1.16E + 03
1.22E + 03
±4.28E + 02
±3.99E + 02
6.21E − 04
±4.39E + 02
6.82E − 03
f6
5
4.00E + 00
2.12E + 02
2.06E + 00
8.97E + 01
2.70E + 00
8.71E + 01
2.52E + 01
3.36E + 01
1.49E + 01
2.05E + 01
1.56E + 01
1.87E + 01
±3.25E + 01
±1.62E + 01
4.48E − 06
±1.32E + 01
2.58E − 07
10
4.03E + 01
1.17E + 06
1.63E + 01
1.57E + 05
3.60E + 01
8.26E + 05
3.85E + 03
3.75E + 04
9.76E + 02
9.57E + 03
1.09E + 03
1.31E + 04
±1.33E + 05
±2.77E + 04
5.04E − 05
±8.24E + 04
3.11E − 04
f7
5
2.84E − 01
1.22E + 01
4.40E − 01
1.09E + 01
3.23E − 01
1.02E + 01
3.33E + 00
3.87E + 00
2.48E + 00
2.93E + 00
2.16E + 00
2.76E + 00
±2.52E + 00
±1.97E + 00
3.21E − 03
±1.99E + 00
5.01E − 04
10
7.23E + 00
4.17E + 01
7.08E + 00
4.04E + 01
5.92E + 00
4.00E + 01
2.39E + 01
2.32E + 01
1.98E + 01
2.04E + 01
1.95E + 01
1.94E + 01
±7.22E + 00
±6.47E + 00
1.49E − 03
±6.75E + 00
7.14E − 05
f8
5
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
10
8.70E − 01
1.00E + 00
4.90E − 01
1.00E + 00
9.16E − 01
1.00E + 00
1.00E + 00
9.95E − 01
1.00E + 00
9.86E − 01
1.00E + 00
9.93E − 01
±1.87E − 02
±6.23E − 02
3.65E − 06
±1.41E − 02
7.74E − 07
f9
5
4.50E + 00
2.22E + 01
3.68E + 00
2.01E + 01
4.74E + 00
2.13E + 01
1.28E + 01
1.34E + 01
1.15E + 01
1.15E + 01
1.13E + 01
1.14E + 01
±3.52E + 00
±3.40E + 00
2.35E − 04
±3.46E + 00
5.41E − 05
10
2.44E + 01
5.91E + 01
2.61E + 01
5.87E + 01
2.61E + 01
5.60E + 01
4.44E + 01
4.40E + 01
4.31E + 01
4.22E + 01
4.22E + 01
4.24E + 01
±6.68E + 00
±6.01E + 00
1.93E − 02
±7.00E + 00
6.44E − 02
f10
5
—
—
—
—
6.18E + 01
9.74E + 02
—
—
—
—
5.93E + 02
5.75E + 02
—
—
—
±1.78E + 02
2.82E − 39
10
5.69E + 02
2.15E + 03
5.93E + 02
2.23E + 03
7.63E + 02
2.24E + 03
1.77E + 03
1.71E + 03
1.78E + 03
1.76E + 03
1.85E + 03
1.79E + 03
±2.62E + 02
±2.45E + 02
8.94E − 01
±2.35E + 02
9.91E − 01
f11
5
4.10E + 01
4.31E + 01
4.06E + 01
4.22E + 01
4.08E + 01
4.25E + 01
4.18E + 01
4.18E + 01
4.15E + 01
4.15E + 01
4.14E + 01
4.14E + 01
±4.15E − 01
±3.47E − 01
9.67E − 08
±3.11E − 01
2.23E − 10
10
1.83E + 02
1.87E + 02
1.83E + 02
1.86E + 02
1.83E + 02
1.86E + 02
1.85E + 02
1.85E + 02
1.85E + 02
1.85E + 02
1.85E + 02
1.85E + 02
±6.42E − 01
±6.27E − 01
2.70E − 06
±6.55E − 01
2.38E − 08
f12
5
3.70E + 00
9.50E + 00
2.80E + 00
8.91E + 00
2.88E + 00
9.80E + 00
6.69E + 00
6.82E + 00
5.68E + 00
5.66E + 00
5.84E + 00
5.87E + 00
±1.46E + 00
±1.30E + 00
1.19E − 07
±1.23E + 00
2.67E − 06
10
5.99E + 00
1.30E + 01
6.06E + 00
1.13E + 01
5.01E + 00
1.24E + 01
1.00E + 01
9.84E + 00
8.95E + 00
9.13E + 00
9.08E + 00
9.12E + 00
±1.23E + 00
±1.12E + 00
4.53E − 06
±1.39E + 00
7.36E − 05
f13
5
2.97E − 01
9.57E − 01
1.51E − 01
8.28E − 01
1.60E − 01
7.97E − 01
5.41E − 01
5.59E − 01
5.08E − 01
4.92E − 01
4.77E − 01
4.85E − 01
±1.27E − 01
±1.44E − 01
1.16E − 03
±1.36E − 01
9.77E − 05
10
8.82E − 01
1.28E + 00
7.98E − 01
1.24E + 00
6.04E − 01
1.23E + 00
1.11E + 00
1.11E + 00
1.08E + 00
1.08E + 00
1.07E + 00
1.06E + 00
±7.04E − 02
±7.46E − 02
1.14E − 03
±9.55E − 02
1.91E − 05
f14
5
3.09E − 01
2.52E + 00
4.48E − 01
2.03E + 00
3.56E − 01
2.63E + 00
1.34E + 00
1.37E + 00
1.20E + 00
1.18E + 00
1.20E + 00
1.19E + 00
±3.93E − 01
±3.52E − 01
3.88E − 04
±3.59E − 01
2.14E − 04
10
1.73E + 00
4.20E + 00
1.80E + 00
4.22E + 00
1.58E + 00
3.68E + 00
2.96E + 00
2.97E + 00
2.72E + 00
2.71E + 00
2.71E + 00
2.66E + 00
±4.74E − 01
±4.77E − 01
6.94E − 05
±4.66E − 01
5.25E − 06
f15
5
5.04E − 02
1.55E − 01
4.53E − 02
1.21E − 01
4.44E − 02
1.08E − 01
7.84E − 02
8.18E − 02
7.22E − 02
7.21E − 02
6.74E − 02
6.88E − 02
±1.99E − 02
±1.59E − 02
2.67E − 04
±1.44E − 02
5.03E − 07
10
1.86E − 03
1.27E − 02
1.17E − 03
1.07E − 02
1.26E − 03
1.12E − 02
5.75E − 03
5.66E − 03
5.28E − 03
5.43E − 03
4.93E − 03
5.12E − 03
±2.04E − 03
±2.03E − 03
2.58E − 01
±1.95E − 03
2.84E − 02
SPX
SH-SPX_random
SH-SPX_sequential
f1
5
2.73E − 01
2.23E + 00
1.71E − 01
1.48E + 00
1.83E − 01
2.99E + 00
9.30E − 01
9.76E − 01
6.46E − 01
7.10E − 01
5.77E − 01
6.90E − 01
±4.08E − 01
±2.74E − 01
3.92E − 07
±4.21E − 01
5.93E − 09
10
2.11E − 01
8.83E − 01
1.76E − 01
9.73E − 01
1.64E − 01
8.63E − 01
4.73E − 01
5.03E − 01
3.55E − 01
3.78E − 01
2.81E − 01
3.07E − 01
±1.29E − 01
±1.26E − 01
4.24E − 13
±1.22E − 01
3.26E − 21
f2
5
3.00E − 03
6.10E − 02
1.09E − 03
6.74E − 02
3.67E − 04
3.57E − 02
1.54E − 02
1.97E − 02
7.82E − 03
1.11E − 02
4.51E − 03
7.39E − 03
±1.33E − 02
±9.97E − 03
2.63E − 09
±6.87E − 03
5.70E − 17
10
9.45E − 04
1.11E − 02
5.33E − 04
7.35E − 03
1.68E − 04
5.32E − 03
3.70E − 03
4.22E − 03
1.64E − 03
1.89E − 03
1.08E − 03
1.30E − 03
±2.06E − 03
±1.07E − 03
1.29E − 20
±8.56E − 04
2.16E − 27
f3
5
4.02E − 03
4.15E − 01
2.48E − 03
3.84E − 01
2.41E − 03
4.92E − 01
4.10E − 02
5.50E − 02
2.21E − 02
3.68E − 02
1.86E − 02
4.22E − 02
±5.67E − 02
±5.17E − 02
6.57E − 06
±7.39E − 02
7.37E − 07
10
5.20E − 03
8.20E − 02
2.42E − 03
1.88E − 01
1.66E − 03
1.36E − 01
2.07E − 02
2.31E − 02
1.31E − 02
1.93E − 02
6.25E − 03
1.38E − 02
±1.23E − 02
±2.19E − 02
1.04E − 05
±2.30E − 02
4.27E − 17
f4
5
5.72E + 00
1.10E + 03
2.49E + 00
6.26E + 02
2.50E + 00
2.62E + 03
2.12E + 01
5.79E + 01
1.80E + 01
6.39E + 01
1.27E + 01
7.01E + 01
±1.29E + 02
±1.18E + 02
1.74E − 01
±2.77E + 02
1.89E − 05
10
1.04E + 01
8.49E + 01
8.69E + 00
2.67E + 02
7.58E + 00
3.46E + 02
1.64E + 01
1.88E + 01
1.19E + 01
2.13E + 01
1.06E + 01
1.95E + 01
±9.64E + 00
±3.56E + 01
6.19E − 11
±3.77E + 01
3.70E − 14
f5
5
3.63E − 01
3.58E + 01
1.59E − 01
1.27E + 02
7.85E − 02
3.06E + 01
2.12E + 00
3.44E + 00
1.92E + 00
5.01E + 00
1.58E + 00
4.20E + 00
±4.57E + 00
±1.41E + 01
9.05E − 02
±6.01E + 00
2.58E − 02
10
2.80E − 01
6.40E + 00
7.15E − 02
6.69E + 00
4.67E − 02
1.13E + 01
9.01E − 01
1.20E + 00
5.28E − 01
8.65E − 01
5.03E − 01
1.19E + 00
±1.04E + 00
±9.14E − 01
6.06E − 07
±1.79E + 00
9.06E − 06
f6
5
1.46E + 00
2.49E + 01
5.51E − 01
1.84E + 01
1.93E − 01
1.90E + 01
5.66E + 00
6.28E + 00
3.13E + 00
3.72E + 00
2.71E + 00
4.45E + 00
±3.41E + 00
±2.47E + 00
1.05E − 15
±4.19E + 00
2.78E − 10
10
2.16E + 00
1.11E + 01
1.42E + 00
3.24E + 01
9.44E − 01
8.43E + 00
4.13E + 00
4.37E + 00
2.65E + 00
3.26E + 00
2.03E + 00
2.42E + 00
±1.35E + 00
±3.25E + 00
1.25E − 17
±1.32E + 00
1.58E − 21
f7
5
5.87E − 03
1.12E + 00
4.99E − 03
1.34E + 00
3.09E − 03
3.30E + 00
8.86E − 02
1.65E − 01
5.66E − 02
1.38E − 01
8.61E − 02
2.95E − 01
±1.96E − 01
±2.15E − 01
3.69E − 03
±5.54E − 01
1.90E − 01
10
1.36E − 02
1.19E + 00
4.93E − 03
1.22E + 00
4.72E − 03
1.86E + 00
6.32E − 02
1.24E − 01
2.42E − 02
7.25E − 02
3.80E − 02
1.35E − 01
±1.66E − 01
±1.55E − 01
7.46E − 09
±2.49E − 01
2.72E − 03
f8
5
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
10
3.58E − 03
1.92E − 01
1.20E − 03
9.98E − 01
9.28E − 04
9.98E − 01
1.73E − 02
2.54E − 02
6.69E − 03
5.27E − 02
8.29E − 03
6.27E − 02
±2.82E − 02
±1.88E − 01
1.08E − 10
±1.96E − 01
9.22E − 05
f9
5
2.42E + 00
2.32E + 01
9.43E − 01
2.03E + 01
6.68E − 01
2.03E + 01
1.12E + 01
1.18E + 01
1.02E + 01
1.02E + 01
8.62E + 00
8.86E + 00
±3.95E + 00
±3.89E + 00
4.58E − 03
±3.45E + 00
8.84E − 09
10
1.44E + 01
4.54E + 01
3.53E + 00
3.65E + 01
8.75E − 01
2.79E + 01
3.47E + 01
3.37E + 01
1.52E + 01
1.71E + 01
8.14E + 00
9.53E + 00
±6.66E + 00
±7.99E + 00
6.39E − 27
±5.97E + 00
1.87E − 33
f10
5
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
10
—
—
—
—
9.16E + 00
2.86E + 03
—
—
—
—
1.88E + 03
1.83E + 03
—
—
—
±4.94E + 02
2.82E − 39
f11
5
4.04E + 01
4.13E + 01
4.03E + 01
4.15E + 01
4.02E + 01
4.11E + 01
4.07E + 01
4.07E + 01
4.06E + 01
4.06E + 01
4.05E + 01
4.05E + 01
±1.90E − 01
±1.60E − 01
4.35E − 10
±1.65E − 01
8.70E − 15
10
1.81E + 02
1.81E + 02
1.80E + 02
1.81E + 02
1.80E + 02
1.81E + 02
1.81E + 02
1.81E + 02
1.81E + 02
1.81E + 02
1.81E + 02
1.81E + 02
±1.29E − 01
±1.08E − 01
4.36E − 18
±1.28E − 01
1.14E − 25
f12
5
4.05E − 01
3.28E + 00
2.13E − 01
3.14E + 00
1.97E − 01
2.65E + 00
1.87E + 00
1.84E + 00
1.45E + 00
1.49E + 00
1.12E + 00
1.23E + 00
±5.88E − 01
±6.29E − 01
4.46E − 05
±5.57E − 01
1.65E − 11
10
2.67E − 01
1.39E + 00
1.57E − 01
2.07E + 00
8.07E − 02
5.88E − 01
5.83E − 01
6.16E − 01
3.50E − 01
3.96E − 01
2.36E − 01
2.45E − 01
±2.25E − 01
±2.25E − 01
2.84E − 16
±9.53E − 02
4.47E − 30
f13
5
1.04E − 01
6.17E − 01
4.94E − 02
5.92E − 01
9.47E − 02
5.98E − 01
3.49E − 01
3.52E − 01
3.12E − 01
3.20E − 01
3.31E − 01
3.27E − 01
±1.11E − 01
±1.12E − 01
2.86E − 02
±1.09E − 01
7.84E − 02
10
6.96E − 02
4.84E − 01
1.43E − 02
3.90E − 01
1.22E − 02
2.88E − 01
1.93E − 01
2.08E − 01
7.37E − 02
9.07E − 02
5.03E − 02
6.06E − 02
±1.00E − 01
±6.20E − 02
6.30E − 20
±4.44E − 02
6.33E − 28
f14
5
2.00E − 01
7.73E − 01
1.21E − 01
7.01E − 01
2.00E − 01
7.10E − 01
4.13E − 01
4.24E − 01
4.00E − 01
3.98E − 01
4.00E − 01
3.86E − 01
±1.39E − 01
±1.30E − 01
1.16E − 01
±1.22E − 01
3.43E − 02
10
2.00E − 01
7.03E − 01
2.00E − 01
5.14E − 01
2.00E − 01
6.16E − 01
3.65E − 01
3.66E − 01
3.10E − 01
3.25E − 01
3.08E − 01
3.35E − 01
±8.08E − 02
±6.94E − 02
2.20E − 04
±9.77E − 02
1.57E − 03
f15
5
4.37E − 02
1.27E − 01
4.24E − 02
1.32E − 01
4.28E − 02
8.59E − 02
7.25E − 02
7.30E − 02
5.54E − 02
5.82E − 02
5.12E − 02
5.45E − 02
±1.63E − 02
±1.36E − 02
2.32E − 12
±9.05E − 03
5.28E − 18
10
1.08E − 03
9.64E − 03
6.29E − 04
5.55E − 03
6.01E − 04
3.29E − 03
4.23E − 03
4.38E − 03
1.22E − 03
1.52E − 03
8.99E − 04
1.01E − 03
±1.89E − 03
±8.58E − 04
2.60E − 27
±4.11E − 04
2.82E − 33
UNDX
SH-UNDX_random
SH-UNDX_sequential
f1
5
1.20E + 00
6.82E + 00
8.26E − 01
5.16E + 00
7.35E − 01
5.54E + 00
2.77E + 00
2.83E + 00
2.18E + 00
2.30E + 00
1.96E + 00
2.13E + 00
±1.09E + 00
±8.41E − 01
1.60E − 04
±8.74E − 01
3.29E − 07
10
2.04E + 00
6.43E + 00
1.36E + 00
5.53E + 00
1.71E + 00
5.33E + 00
4.25E + 00
4.25E + 00
3.65E + 00
3.66E + 00
3.54E + 00
3.50E + 00
±1.01E + 00
±7.54E − 01
9.57E − 06
±7.34E − 01
2.52E − 08
f2
5
1.30E − 02
6.07E − 01
1.24E − 02
4.28E − 01
4.48E − 03
6.26E − 01
1.30E − 01
1.60E − 01
6.85E − 02
8.27E − 02
7.20E − 02
8.83E − 02
±1.16E − 01
±6.00E − 02
7.89E − 09
±8.50E − 02
1.94E − 08
10
9.42E − 02
1.26E + 00
7.07E − 02
6.85E − 01
4.32E − 02
6.42E − 01
4.10E − 01
4.74E − 01
2.98E − 01
3.03E − 01
2.49E − 01
2.68E − 01
±2.39E − 01
±1.43E − 01
3.09E − 08
±1.22E − 01
2.28E − 12
f3
5
4.93E − 02
1.77E + 00
1.51E − 02
1.81E + 00
2.90E − 02
8.85E − 01
3.45E − 01
4.71E − 01
2.17E − 01
2.96E − 01
2.03E − 01
2.71E − 01
±3.48E − 01
±2.77E − 01
2.17E − 06
±2.04E − 01
6.85E − 07
10
3.27E − 01
5.08E + 00
2.47E − 01
6.39E + 00
4.47E − 01
4.88E + 00
2.30E + 00
2.30E + 00
1.47E + 00
1.72E + 00
1.49E + 00
1.69E + 00
±9.70E − 01
±9.66E − 01
6.77E − 07
±8.96E − 01
7.28E − 07
f4
5
4.89E + 00
1.54E + 03
6.89E + 00
1.21E + 03
6.88E + 00
1.17E + 03
1.51E + 02
2.41E + 02
8.81E + 01
1.93E + 02
1.00E + 02
1.60E + 02
±2.70E + 02
±2.38E + 02
2.12E − 02
±1.80E + 02
8.40E − 03
10
1.04E + 02
3.64E + 03
8.64E + 01
1.54E + 03
9.04E + 01
3.07E + 03
6.38E + 02
7.67E + 02
4.29E + 02
4.78E + 02
3.65E + 02
4.71E + 02
±5.61E + 02
±2.72E + 02
5.55E − 06
±4.01E + 02
6.08E − 08
f5
5
6.70E − 01
8.64E + 01
2.11E + 00
2.42E + 02
2.50E + 00
6.09E + 01
2.07E + 01
2.40E + 01
1.58E + 01
2.33E + 01
1.49E + 01
1.80E + 01
±1.64E + 01
±2.85E + 01
4.19E − 02
±1.30E + 01
3.00E − 03
10
2.25E + 01
2.33E + 02
2.88E + 01
2.05E + 02
1.91E + 01
2.01E + 02
1.02E + 02
1.04E + 02
7.97E + 01
8.55E + 01
8.47E + 01
8.99E + 01
±4.18E + 01
±3.30E + 01
2.52E − 04
±3.96E + 01
4.88E − 03
f6
5
2.16E + 00
4.98E + 01
3.22E + 00
4.47E + 01
1.94E + 00
5.04E + 01
1.44E + 01
1.62E + 01
9.27E + 00
1.16E + 01
1.22E + 01
1.28E + 01
±9.25E + 00
±6.70E + 00
3.06E − 05
±7.34E + 00
2.07E − 03
10
1.54E + 01
3.08E + 03
1.46E + 01
5.00E + 02
1.39E + 01
1.19E + 03
5.27E + 01
1.50E + 02
3.49E + 01
5.30E + 01
3.08E + 01
5.44E + 01
±3.87E + 02
±7.00E + 01
1.12E − 05
±1.22E + 02
4.08E − 09
f7
5
1.00E − 01
5.44E + 00
2.71E − 02
3.31E + 00
2.87E − 02
5.63E + 00
8.04E − 01
1.16E + 00
5.35E − 01
8.39E − 01
5.39E − 01
9.49E − 01
±1.02E + 00
±7.72E − 01
3.33E − 03
±9.58E − 01
1.35E − 02
10
1.84E + 00
1.60E + 01
2.16E + 00
1.40E + 01
8.81E − 01
1.48E + 01
6.18E + 00
6.60E + 00
5.08E + 00
5.83E + 00
5.21E + 00
5.65E + 00
±2.97E + 00
±2.90E + 00
1.62E − 02
±3.05E + 00
5.28E − 03
f8
5
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
—
10
8.00E − 01
1.00E + 00
7.95E − 01
1.00E + 00
4.05E − 01
1.00E + 00
1.00E + 00
9.92E − 01
9.96E − 01
9.78E − 01
9.99E − 01
9.72E − 01
±2.60E − 02
±4.13E − 02
6.41E − 08
±8.29E − 02
1.42E − 04
f9
5
3.55E + 00
1.95E + 01
3.51E + 00
1.96E + 01
1.81E + 00
1.89E + 01
1.06E + 01
1.10E + 01
1.10E + 01
1.09E + 01
1.00E + 01
1.01E + 01
±3.58E + 00
±3.30E + 00
4.28E − 01
±3.50E + 00
4.51E − 02
10
1.74E + 01
4.84E + 01
1.95E + 01
4.38E + 01
1.70E + 01
4.69E + 01
3.64E + 01
3.64E + 01
3.45E + 01
3.40E + 01
3.45E + 01
3.43E + 01
±6.45E + 00
±5.52E + 00
1.98E − 03
±6.07E + 00
7.45E − 03
f10
5
—
—
—
—
6.48E + 01
1.16E + 03
—
—
—
—
7.66E + 02
7.24E + 02
—
—
—
±2.55E + 02
2.82E − 39
10
—
—
—
—
5.25E + 02
2.48E + 03
—
—
—
—
1.70E + 03
1.60E + 03
—
—
—
±4.68E + 02
2.82E − 39
f11
5
4.06E + 01
4.22E + 01
4.06E + 01
4.19E + 01
4.05E + 01
4.19E + 01
4.12E + 01
4.13E + 01
4.11E + 01
4.11E + 01
4.11E + 01
4.11E + 01
±3.29E − 01
±2.95E − 01
2.24E − 04
±2.75E − 01
4.55E − 04
10
1.82E + 02
1.84E + 02
1.82E + 02
1.84E + 02
1.82E + 02
1.84E + 02
1.83E + 02
1.83E + 02
1.83E + 02
1.83E + 02
1.83E + 02
1.83E + 02
±4.72E − 01
±4.03E − 01
5.09E − 07
±4.09E − 01
2.62E − 08
f12
5
2.04E + 00
5.83E + 00
1.69E + 00
4.95E + 00
1.27E + 00
5.92E + 00
3.89E + 00
3.81E + 00
3.23E + 00
3.24E + 00
3.30E + 00
3.26E + 00
±7.79E − 01
±6.47E − 01
1.07E − 07
±8.19E − 01
1.42E − 06
10
3.04E + 00
6.14E + 00
2.47E + 00
5.22E + 00
1.85E + 00
5.06E + 00
4.32E + 00
4.38E + 00
3.87E + 00
3.85E + 00
3.81E + 00
3.81E + 00
±6.26E − 01
±5.59E − 01
2.45E − 08
±5.87E − 01
4.85E − 09
f13
5
1.54E − 01
6.60E − 01
1.07E − 01
5.25E − 01
1.55E − 01
6.38E − 01
3.74E − 01
3.81E − 01
3.73E − 01
3.64E − 01
3.39E − 01
3.55E − 01
±1.06E − 01
±9.63E − 02
2.13E − 01
±1.01E − 01
3.81E − 02
10
4.11E − 01
9.76E − 01
4.01E − 01
9.11E − 01
4.77E − 01
8.96E − 01
8.37E − 01
8.09E − 01
7.95E − 01
7.73E − 01
7.59E − 01
7.47E − 01
±1.06E − 01
±9.53E − 02
9.46E − 04
±1.06E − 01
2.73E − 06
f14
5
2.01E − 01
1.13E + 00
2.15E − 01
9.63E − 01
1.46E − 01
1.03E + 00
7.05E − 01
7.00E − 01
5.80E − 01
5.79E − 01
5.41E − 01
5.68E − 01
±1.94E − 01
±1.72E − 01
6.01E − 06
±1.83E − 01
1.40E − 06
10
6.09E − 01
1.69E + 00
5.27E − 01
1.40E + 00
5.00E − 01
1.31E + 00
1.00E + 00
1.04E + 00
9.04E − 01
9.09E − 01
8.95E − 01
8.80E − 01
±2.16E − 01
±1.72E − 01
1.79E − 05
±1.58E − 01
9.29E − 08
f15
5
4.42E − 02
1.08E − 01
4.62E − 02
1.09E − 01
4.67E − 02
1.13E − 01
7.04E − 02
7.25E − 02
6.97E − 02
7.06E − 02
6.76E − 02
6.93E − 02
±1.45E − 02
±1.34E − 02
1.99E − 01
±1.39E − 02
6.66E − 02
10
1.44E − 03
7.63E − 03
1.89E − 03
8.04E − 03
1.35E − 03
8.53E − 03
4.31E − 03
4.33E − 03
4.02E − 03
4.24E − 03
4.12E − 03
4.23E − 03
±1.30E − 03
±1.31E − 03
2.75E − 01
±1.38E − 03
2.68E − 01
The best results in each row are emphasized in bold. The emphasized p values in bold indicate that the Mann–Whitney U test with the significance level p=0.05 shows significance against the result without SHX. “—” represents invalid solutions (trapped by local optimum or out of parameter range
From Table 3, we can observe the clear improvement of performance brought by SHX. The results of the p value show that the methods with SHX have recognized the significance at least in 23 settings among all 30 settings. In the other five results (minimum, maximum, median, mean, and SD), the methods without SHX cannot achieve outperformed results for most settings. For instance, focusing on the minimum results, the methods without SHX outperform the methods with SHX only 5, 0, and 4 times by BLX, SPX, and UNDX, respectively. On the other hand, SHX with sequential archive update achieves the best performance. SH-BLX_sequential, SH-SPX_sequential, and SH-UNDX_sequential show the significance in 27, 26, and 27 settings, respectively. In addition, they achieve the best results in most settings with respect to the maximum, median, and mean results. One possible reason for sequential outperforming random in most cases is that sequential removes the oldest individual which arrived first, and therefore SHX can select offspring according to the up-to-date search history to reflect the trend of evolution more sensitively. In contrast, random uniformly removes individuals in the archive, which may impede the discovery of new solutions since old individuals may be retained for more generations in the archive.
6.2.1. Analysis on BLX vs. SH-BLX
It has been already known that the standard BLX [17] faces difficulties especially when the target function is nonseparable [34] due to the parameter-wise sampling. By observing the results of f4 to f7 and f12 to f15 from Table 3, we can find that involving SHX significantly improves the performance, which indicates that SHX can help BLX to greatly mitigate this drawback. It is easy to understand because offspring selection with clusters embeds distance measure which builds the relationship among parameters.
6.2.2. Analysis on SPX vs. SH-SPX
SPX [21] is a better alternative of BLX, and we can observe from Table 3 that SPX noticeably outperforms BLX. From Table 3, it is also very clear that SHX further boosts the performance of SPX to a large extent. In particular, the results of minimum and median are improved by involving SHX for all settings. As pointed out in [21], SPX has the ability to maintain the mean and covariance of the parent individuals, which is consistent with the design guideline of good crossover operators mentioned in Section 3. Since SHX manages an archive that stores search history over few generations, it can preserve some useful statistics (e.g., centroids of clusters) much longer. That is why SHX is able to enhance SPX.
6.2.3. Analysis on UNDX vs. SH-UNDX
Similar to BLX and SPX, Table 3 shows that the results involving SHX are improved in most settings. UNDX is also designed to generate offspring inheriting the distribution of the parent individuals [35]. Therefore, statistics of the search history provided by SHX are useful for UNDX to enhance search ability.
6.3. Comparison in Convergence Curve
With the aid of search history, SHX not only achieves better results but also improves the convergence speed. In this section, we compare the generation alternation for over all the test functions in the case of D=10. Evaluation values of elite individuals from the 1st generation to the 100th generation are plotted in Figure 2. The mean value of 100 trials is represented by the line, and the range between the minimum and the maximum is represented by the shaded area. Smaller area means more stable search. It should be noted that as the ranges of parameters are not constrained during the search procedure, methods can achieve infinitely small values of fitness, and a lower value does not mean a better result in the case of f10, as explained in Section 6.2.
Convergence curves of all test functions. Each mean-min-max curve has a corresponding shaded area to represent the range of changes over 100 trials with different random seeds.
For BLX, SPX, and UNDX, exploiting SHX shows faster convergence speed comparing against them without SHX in most cases. The superiority becomes more obvious when the problem setting is more difficult (e.g., multimodal functions f8−15 vs. unimodal functions f1−7).
6.4. Comparison in Processing Time
In this section, we show the runtime overhead of the processing brought by SHX. Figure 3 shows the comparisons in processing time of an optimization task (D=10 and a single fitness evaluation takes 0.01 second) for BLX and SPX. The parameter setting follows Table 2, and all the results are averaged over 10 trials. It took 93.9 seconds and 94.1 seconds for BLX and SPX to complete the entire process, respectively. SH-BLX_random took additional 1.7 seconds to BLX. SH-BLX_sequential took 1.6 seconds more than BLX. Similarly, the additional runtime for SH-SPX_random and SH-SPX_sequential to SPX were 3.9 seconds and 3.9 seconds, respectively. These numbers demonstrate the additional runtime only occupies a small part of the total processing time. These additional computational costs mainly occur in the clustering with archive data and the label assignment with candidate offspring. The cost can be further reduced by fusing efficient distance measure or parallel computing. For a fixed size of an archive, the runtime grows linearly with the increase in the number of generations. Considering the complexity of the fitness function and the budget, SHX is a practical alternative to other crossover models.
Processing time of different methods with increasing number of generations. The computational time is 0.01 second for a single evaluation.
7. Conclusions
In this paper, we have proposed a novel crossover model (SHX) which is simple yet effective and efficient. It can be easily integrated with any existing crossover operators. The key idea is to exploit search history over generations to gain useful information for generating offspring. Experimental results demonstrate that our SHX can significantly boost the performance of existing crossovers, in terms of the final solution and the convergence speed. Also, according to experiments, the induced extra runtime is negligible compared to the total processing time.
SHX still has a few limitations. (1) Additional hyperparameters need to be determined. (2) The induced additional runtime may be unable to sufficiently support applications which require high processing speed. As the future work, we would like to address the above limitations. For instance, hyperparameters can be adaptively set by considering specific contexts, and parallelization can be introduced to speed up SHX.
Data Availability
The test data used to support the findings of this study are included within the article.
Disclosure
A preliminary version of this work appears in GECCO2020 and has also been mentioned in the manuscript which can be viewed at the following link: https://arxiv.org/abs/2003.13508.
Conflicts of Interest
The authors declare that they have no conflicts of interest.
Acknowledgments
This work was partly supported by JSPS KAKENHI, Grant number (JP18K17823).
WrightA. H.Genetic algorithms for real parameter optimization19911Amsterdam, NetherlandsElsevier205218HerreraF.LozanoM.VerdegayJ. L.Tackling real-coded genetic algorithms: operators and tools for behavioural analysis199812426531910.1023/a:10065049011642-s2.0-0032136585WhitleyD.An overview of evolutionary algorithms: practical issues and common pitfalls2001431481783110.1016/s0950-5849(01)00188-42-s2.0-0035892558SholomonD.DavidO.NetanyahuN. S.A genetic algorithm-based solver for very large jigsaw puzzlesProceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)June 2013Portland, OR, USA1767177410.1109/cvpr.2013.2312-s2.0-84887355756XieL.YuilleA.Genetic CNNProceedings of the 2017 IEEE International Conference on Computer Vision (ICCV)October 2017Venice, Italy1379138810.1109/ICCV.2017.1542-s2.0-85041918559RealE.MooreS.SelleA.Lare-scale evolution of image classifiersProceedings of the International Conference on Machine Learning (ICML)August 2017Sydney, Australia29022911DeepK.ThakurM.A new crossover operator for real coded genetic algorithms2007188189591110.1016/j.amc.2006.10.0472-s2.0-34247573442García-MartínezC.LozanoM.HerreraF.MolinaD.SánchezA. M.Global and local real-coded genetic algorithms based on parent-centric crossover operators200818531088111310.1016/j.ejor.2006.06.0432-s2.0-34848832935TangP.-H.TsengM.-H.Adaptive directed mutation for real-coded genetic algorithms201313160061410.1016/j.asoc.2012.08.0352-s2.0-84869478345PicekS.JakobovicD.GolubM.On the recombination operator in the real-coded genetic algorithmsProceedings of the IEEE Congress on Evolutionary Computation (CEC)June 2013Cancun, MexicoIEEE3103311010.1109/cec.2013.65579482-s2.0-84881606002ChuangY.-C.ChenC.-T.HwangC.A simple and efficient real-coded genetic algorithm for constrained optimization2016388710510.1016/j.asoc.2015.09.0362-s2.0-84944789843HerreraF.LozanoM.SánchezA. M.A taxonomy for the crossover operator for real-coded genetic algorithms: an experimental study200318330933810.1002/int.100912-s2.0-0037338399ČrepinšekM.LiuS.-H.MernikM.Exploration and exploitation in evolutionary algorithms: a survey201345133BeyerH.-G.DebK.On self-adaptive features in real-parameter evolutionary algorithms20015325027010.1109/4235.9303142-s2.0-0035364522LloydS.Least squares quantization in pcm198228212913710.1109/tit.1982.10564892-s2.0-0020102027NakaneT.LuX.ZhangC.SHX: search history driven crossover for real-coded genetic algorithm2020https://arxiv.org/abs/2003.13508EshelmanL. J.SchafferJ. D.Real-coded genetic algorithms and interval-schemata19932Amsterdam, NetherlandsElsevier187202EshelmanL. J.MathiasK. E.SchafferJ. D.Crossover operator biases: exploiting the population distributionProceedings of the 7th International Conference on Genetic Algorithms (ICGA)July 1997East Lansing, MI, USADebK.AgrawalR. B.Simulated binary crossover for continuous search space19959115148OnoI.KobayashiS.A real-coded genetic algorithm for function optimization using unimodal normal distribution crossoverProceedings of the Seventh International Conference on Genetic Algorithms (ICGA)1997East Lansing, MI, USA246253TsutsuiS.YamamuraM.HiguchiT.Multi-parent recombination with simplex crossover in real coded genetic algorithmsProceedings of the Genetic and Evolutionary Computation Conference (GECCO)July 1999Orlando, FL, USA657664SanoY.KitaH.Optimization of noisy fitness functions by means of genetic algorithms using history of searchProceedings of the International Conference on Parallel Problem Solving from Nature (PPSN)September 2000Paris, FranceSpringer571580AmorH. B.RettingerA.Intelligent exploration for genetic algorithms: using self-organizing maps in evolutionary computationProceedings of the 2005 Conference on Genetic and Evolutionary Computation—GECCO’05June 2005Washington, DC, USA15311538YuenS. Y.ChowC. K.Continuous non-revisiting genetic algorithmProceedings of the 2009 IEEE Congress on Evolutionary ComputationMay 2009Trondheim, NorwayIEEE1896190310.1109/CEC.2009.49831722-s2.0-70450060067KitaH.A comparison study of self-adaptation in evolution strategies and real-coded genetic algorithms20019222324110.1162/1063656017501904152-s2.0-0035375260AkimotoY.HasadaR.SakumaJ.OnoI.KobayashiS.Generation alternation model for real-coded ga using multi-parent: proposal and evaluation of just generation gap (JGG)Proceedings of the 19th SICE Symposium on Decentralized Autonomous SystemsJanuary 2007Tokyo, Japan341346KobayashiS.The frontiers of real-coded genetic algorithms200924114716210.1527/tjsai.24.1472-s2.0-59349110951GoldbergD. E.Genetic algorithms in search, optimization, and machine learning1989WangJ.ZhangM.ErsoyO. K.SunK.BiY.An improved real-coded genetic algorithm using the heuristical normal distribution and direction-based crossover2019201917424385310.1155/2019/4243853HaqE.-u.AhmadI.HussainA.AlmanjahieI. M.A novel selection approach for genetic algorithms for global optimization of multimodal continuous functions2019201914864021810.1155/2019/8640218JamilM.YangX. S.A literature survey of benchmark functions for global optimisation problems20134215019410.1504/ijmmno.2013.0552042-s2.0-84890464528FisterI.FongS.BrestJ.A novel hybrid self-adaptive bat algorithm20142014270973810.1155/2014/7097382-s2.0-84900032761Ab WahabM. N.Nefti-MezianiS.AtyabiA.A comprehensive review of swarm optimization algorithms2015105e012282710.1371/journal.pone.01228272-s2.0-84930681977OnoI.KitaH.KobayashiS.A real-coded genetic algorithm using the unimodal normal distribution crossover2003Berlin, GermanySpringer213237KitaH.OnoI.KobayashiS.Theoretical analysis of the unimodal normal distribution crossover for real-coded genetic algorithmsProceedings of the 1998 IEEE International Conference on Evolutionary Computation ProceedingsMay 1998Anchorage, AK, USA52953410.1109/icec.1998.700084