A novel robust hybrid metaheuristic optimization approach, which can be considered an improvement of the recently developed bat algorithm, is proposed to solve global numerical optimization problems. The improvement adds the pitch adjustment operation of HS, serving as a mutation operator, to the bat updating process with the aim of speeding up convergence, thus making the approach more feasible for a wider range of real-world applications. The detailed implementation procedure for this improved metaheuristic method is also described. Fourteen standard benchmark functions are applied to verify the effects of these improvements, and it is demonstrated that, in most situations, the performance of this hybrid metaheuristic method (HS/BA) is superior to, or at least highly competitive with, the standard BA and other population-based optimization methods such as ACO, BBO, DE, ES, GA, HS, PSO, and SGA. The effect of the HS/BA parameters is also analyzed.
Optimization is the process of searching for the vector that produces an optimal value of a function. All feasible values are candidate solutions, and the extreme value is the optimal solution. In general, optimization algorithms are applied to solve such optimization problems. A simple way to classify optimization algorithms is by the nature of the algorithm, dividing them into two main categories: deterministic algorithms and stochastic algorithms. Deterministic algorithms using gradients, such as hill climbing, follow a rigorous procedure and will generate the same set of solutions if the iterations commence from the same initial starting point. On the other hand, stochastic algorithms, which do not use gradients, often generate different solutions even with the same initial value. Generally speaking, however, the final values, though slightly different, will converge to the same optimal solutions within a given accuracy [
Inspired by nature, these strong metaheuristic algorithms are applied to solve NP-hard problems, such as UCAV path planning [
Firstly presented by Yang in 2010, the bat-inspired algorithm or bat algorithm (BA) [
Firstly proposed by Geem et al. in 2001, harmony search (HS) [
BA is a powerful algorithm in exploitation (i.e., local search), but at times it may become trapped in local optima, so that it cannot perform global search well. Because the search in the bat algorithm depends completely on random walks, fast convergence cannot be guaranteed. Presented here for the first time, a main improvement, adding the pitch adjustment operation of HS as a mutation operator, is made to the BA in order to increase the diversity of the population and avoid trapping in local optima, with the aim of speeding up convergence, thus making the approach more feasible for a wider range of practical applications while preserving the attractive characteristics of the basic BA. That is to say, we combine the two approaches according to the principles of HS and BA to propose a new hybrid metaheuristic algorithm, and this improved BA is then used to search for the optimal objective function value. The proposed approach is evaluated on fourteen standard benchmark functions that have previously been applied to verify optimization algorithms on continuous optimization problems. Experimental results show that HS/BA performs more efficiently and accurately than the basic BA, ACO, BBO, DE, ES, GA, HS, PSO, and SGA.
The remainder of this paper is organized as follows: Section
To begin with, in this section we will provide a brief background on the optimization problem, harmony search (HS), and bat algorithm (BA).
In computer science, mathematics, management science, and control theory, optimization (also called mathematical optimization or mathematical programming) means the selection of an optimal solution from some set of feasible alternatives. In general, an optimization problem includes minimizing or maximizing a function by systematically selecting input values from a given feasible set and calculating the value of the function. More generally, optimization consists of finding the optimal values of some objective function within a given domain, including a number of different types of domains and different types of objective functions [
A global optimization problem can be described as follows. Given a function f : D → R defined on some set D, we seek a parameter x* in D such that, for minimization, f(x*) ≤ f(x) for all x in D.
Such a formulation is named a numerical optimization problem. Many theoretical and practical problems may be modeled in this general framework. In general,
Conventionally, the standard formulation of an optimization problem is stated in terms of minimization. In general, unless both the feasible region and the objective function are convex in a minimization problem, there may be more than one local minimum. A local minimum
The branch of numerical analysis and applied mathematics that investigates deterministic algorithms able to guarantee convergence in finite time to the true optimal solution of a nonconvex problem is called global numerical optimization. A variety of algorithms have been proposed to solve nonconvex problems. Among them, heuristic algorithms can evaluate approximate solutions to some optimization problems, as described in the introduction [
Firstly developed by Geem et al. in 2001, harmony search (HS) [
In more detail, the HS algorithm can be explained with the help of the improvisation process of a music player. When a player is improvising, he or she has three possible choices: (1) play any well-known piece of music (a series of pitches in harmony) exactly from his or her memory, governed by the harmony memory consideration rate (HMCR); (2) play something similar to a known piece in the player's memory (thus adjusting the pitch slightly); or (3) play a totally new or random pitch from the feasible range. If these three options are formalized for optimization, we have three corresponding components: employment of harmony memory, pitch adjusting, and randomization. Similarly, when each decision variable selects one value in the HS algorithm, it can make use of one of the above three rules anywhere in the HS procedure. If a new harmony vector is better than the worst harmony vector in the HM, the new harmony vector replaces the worst harmony vector in the HM. This procedure is repeated until a stopping criterion is satisfied.
The employment of harmony memory is significant, as it is analogous to selecting the fittest individuals in a GA (genetic algorithm). This ensures that the best harmonies are carried over to the new harmony memory. To use this memory more effectively, we should properly set the value of the parameter
To adjust the pitch slightly in the second component, an appropriate method must be applied to adjust the frequency efficiently. In theory, the pitch can be adjusted linearly or nonlinearly, but in practice linear adjustment is used. If
Pitch adjustment is similar to the mutation operator in GA. Also, we must appropriately set the parameter
To increase the diversity of the solutions, randomization is needed in the third component. Although pitch adjustment has a similar role, it is confined to a certain local range and thus corresponds to a local search. Randomization enables the system to move further, exploring diverse regions with high solution diversity, in order to find the global optimum. In real-world engineering applications, HS has been applied to many optimization problems, including function optimization, water distribution networks, groundwater modeling, energy-saving dispatch, structural design, and vehicle routing.
The framework of the basic HS algorithm can be described as shown in Algorithm
if rand < PAR then // pitch adjustment
Update the HM and the best harmony vector found so far
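The three improvisation rules (memory consideration with probability HMCR, pitch adjustment with probability PAR within bandwidth bw, and pure randomization) can be sketched in Python as a single improvisation step. The default parameter values below are illustrative placeholders, not the paper's tuned settings:

```python
import random

def improvise(harmony_memory, lb, ub, hmcr=0.9, par=0.3, bw=0.2):
    """Sketch of one HS improvisation step over a bounded domain [lb, ub]."""
    dim = len(harmony_memory[0])
    new_harmony = []
    for d in range(dim):
        if random.random() < hmcr:
            # Rule 1: memory consideration -- reuse a pitch stored in the HM.
            value = random.choice(harmony_memory)[d]
            if random.random() < par:
                # Rule 2: pitch adjustment -- perturb slightly within bandwidth bw.
                value += bw * (2.0 * random.random() - 1.0)
        else:
            # Rule 3: randomization -- a brand-new pitch from the feasible range.
            value = lb + random.random() * (ub - lb)
        new_harmony.append(min(max(value, lb), ub))
    return new_harmony
```

In a full HS run, the improvised vector would replace the worst harmony in the HM whenever it is better, as described above.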
The bat algorithm is a novel metaheuristic swarm intelligence optimization method developed for global numerical optimization, in which the search is inspired by the social behavior of bats and the phenomenon of echolocation, which bats use to sense distance.
In [ ], the following idealized rules are used: all bats apply echolocation to sense distance, and they always "know" the surroundings in some magical way; bats fly randomly with some velocity at each position; and, although the loudness can change in different ways, it is supposed that the loudness varies from a large (positive) value down to a minimum constant
Based on these approximations and idealization, the basic steps of the bat algorithm (BA) can be described as shown in Algorithm
Initialize the bat population randomly, each bat corresponding to a potential solution of the given problem; define the loudness and the initial velocities. Generate new solutions by adjusting frequency and updating velocities and locations/solutions; select a solution among the best solutions; generate a local solution around the selected best solution; generate a new solution by flying randomly; accept the new solutions; increase
In BA, each bat is defined by its position
For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally using random walk
The update of the velocities and positions of bats is similar to the procedure in the standard PSO [
Furthermore, the loudness
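The velocity and position updates, together with the local random walk around the current best solution, can be illustrated with a minimal sketch of one bat's update, assuming the standard BA formulation; f_min, f_max, and the walk scale here are illustrative values, not the paper's settings:

```python
import random

def bat_step(x, v, x_best, loudness, pulse_rate,
             f_min=0.0, f_max=2.0, walk_scale=0.01):
    """One bat update in the standard BA:
        f    = f_min + (f_max - f_min) * beta, with beta ~ U(0, 1)
        v_i <- v_i + (x_i - x*) * f
        x_i <- x_i + v_i
    and, with probability (1 - pulse_rate), a local random walk
    around the best solution scaled by the current loudness."""
    beta = random.random()
    f = f_min + (f_max - f_min) * beta
    v = [vi + (xi - xb) * f for vi, xi, xb in zip(v, x, x_best)]
    x = [xi + vi for xi, vi in zip(x, v)]
    if random.random() > pulse_rate:
        # local search: random walk around the current best solution
        x = [xb + walk_scale * loudness * (2.0 * random.random() - 1.0)
             for xb in x_best]
    return x, v
```

In the full algorithm, the loudness is decreased and the pulse emission rate increased as each bat closes in on its prey, which this single-step sketch omits.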
Based on the introduction of HS and BA in the previous section, this section explains how the two approaches are combined to form the proposed bat algorithm with harmony search (HS/BA), which modifies solutions with poor fitness in order to add diversity to the population and improve search efficiency.
In general, the standard BA is adept at exploiting the search space, but at times it may become trapped in local optima, so that it cannot perform global search well. Because the search in BA depends completely on random walks, fast convergence cannot be guaranteed. Presented here for the first time, a main improvement, adding the pitch adjustment operation of HS as a mutation operator, is made to the BA in order to increase the diversity of the population and avoid trapping in local optima, with the aim of speeding up convergence, thus making the approach more feasible for a wider range of practical applications while preserving the attractive characteristics of the basic BA. In this paper, the resulting hybrid metaheuristic algorithm, obtained by introducing the pitch adjustment operation of HS into the bat algorithm as a mutation operator and called the harmony search/bat algorithm (HS/BA), is used to optimize the benchmark functions. The difference between HS/BA and BA is that the mutation operator is used to improve how the original BA generates a new solution for each bat. In this way, the method can explore new regions of the search space through the mutation of the HS algorithm and exploit the population information with BA, and can therefore avoid the trapping in local optima seen in BA. In the following, we present the HS/BA algorithm, which is an improvement on both HS and BA.
The critical operator of HS/BA is the hybrid harmony search mutation operator, which combines the improvisation of harmony in HS with the BA. The core idea of the proposed hybrid mutation operator is based on two considerations. First, poor solutions can absorb many useful features from good solutions. Second, the mutation operator can improve the exploration of new regions of the search space. In this way, the strong exploration abilities of the original HS and the exploitation abilities of the BA can both be fully developed.
For the bat algorithm, as the search relies entirely on random walks, fast convergence cannot be guaranteed. Described here for the first time, a main improvement of adding a mutation operator is made to the BA, together with three minor modifications, all with the aim of speeding up convergence, thus making the method more practical for a wider range of applications without losing the attractive features of the original method.
The first improvement is that we use fixed frequency
The second improvement is to add mutation operator in an attempt to increase diversity of the population to improve the search efficiency and speed up the convergence to optima. For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally using random walk by (
The last improvement is the addition of an elitism scheme to the HS/BA. As with other population-based optimization algorithms, we typically incorporate some form of elitism in order to retain the best solutions in the population. This prevents the best solutions from being corrupted by the pitch adjustment operator. Note that we use an elitism approach to save the properties of the bats that have the best solutions in the HS/BA process, so that even if the pitch adjustment operation ruins a corresponding bat, we have saved it and can revert to it if needed.
Based on the above analyses, the framework of the harmony search/bat algorithm (HS/BA) is presented in Algorithm
Initialize the bat population, each bat corresponding to a potential solution of the given problem; define the loudness, the initial velocities, the harmony memory consideration rate HMCR, the pitch adjustment rate PAR, and the bandwidth bw; set the maximum number of elite individuals retained, KEEP. Sort the population of bats; store the KEEP best bats as KEEPBAT. Evaluate the fitness of the offspring; select the offspring; replace the KEEP worst bats with the KEEP best bats stored in KEEPBAT.
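Putting the three improvements together, one generation of the hybrid might be sketched as follows. This is a Python illustration of the description above; all parameter values and helper names are ours, not the paper's, and the acceptance rule is a simplified stand-in for the loudness/pulse-rate machinery of the full BA:

```python
import random

def hsba_generation(bats, vels, objective, lb, ub, freq=0.5,
                    hmcr=0.9, par=0.1, bw=0.2, loudness=0.95,
                    pulse_rate=0.5, keep=2):
    """One HS/BA generation (minimization).

    freq          : fixed frequency (first improvement)
    hmcr, par, bw : HS pitch-adjustment mutation (second improvement)
    keep          : number of elites saved and restored (third improvement)
    """
    n, dim = len(bats), len(bats[0])
    order = sorted(range(n), key=lambda i: objective(bats[i]))
    elites = [bats[i][:] for i in order[:keep]]   # elitism: save the best bats
    best = elites[0]
    for i in range(n):
        # BA update using a fixed frequency instead of a random one
        vels[i] = [v + (x - b) * freq
                   for v, x, b in zip(vels[i], bats[i], best)]
        cand = [x + v for x, v in zip(bats[i], vels[i])]
        if random.random() > pulse_rate:
            # HS pitch adjustment serving as the mutation operator
            cand = []
            for d in range(dim):
                if random.random() < hmcr:
                    val = random.choice(bats)[d]
                    if random.random() < par:
                        val += bw * (2.0 * random.random() - 1.0)
                else:
                    val = lb + random.random() * (ub - lb)
                cand.append(val)
        cand = [min(max(c, lb), ub) for c in cand]
        if objective(cand) <= objective(bats[i]) and random.random() < loudness:
            bats[i] = cand
    # restore the saved elites over the current worst bats
    worst = sorted(range(n), key=lambda i: objective(bats[i]), reverse=True)
    for slot, elite in zip(worst[:keep], elites):
        bats[slot] = elite[:]
    return bats, vels
```

Because the elites are written back at the end of the generation, the best fitness in the population can never get worse from one generation to the next, which is the point of the third improvement.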
In this section, we test the performance of the proposed metaheuristic HS/BA to global numerical optimization through a series of experiments conducted on benchmark functions.
To allow a fair comparison of running times, all experiments were conducted on a PC with a Pentium IV processor running at 2.0 GHz, 512 MB of RAM, and a 160 GB hard drive. Our implementation was compiled using MATLAB R2012a (7.14) running under Windows XP SP3. No commercial BA or HS tools were used in the following experiments.
In order to explore the benefits of HS/BA, in this subsection we compare its performance on global numerical optimization problems with that of nine other population-based optimization methods: ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. ACO (ant colony optimization) [
In all experiments, we use the same parameters for HS, BA, and HS/BA: loudness
Well-defined problem sets are favorable for evaluating the performance of the optimization methods proposed in this paper. Based on mathematical functions, benchmark functions can be applied as objective functions to perform such tests. The properties of these benchmark functions follow directly from their definitions. Fourteen different benchmark functions are applied to verify our proposed metaheuristic algorithm HS/BA. Each of the functions in this study has 20 independent variables (i.e.,
The benchmark functions described in Table
Benchmark functions.
No. | Name | Definition | Source |
---|---|---|---|
F01 | Ackley | — | [ |
F02 | Fletcher-Powell | — | [ |
F03 | Griewank | — | [ |
F04 | Penalty #1 | — | [ |
F05 | Penalty #2 | — | [ |
F06 | Quartic | — | [ |
F07 | Rastrigin | — | [ |
F08 | Rosenbrock | — | [ |
F09 | Schwefel 2.26 | — | [ |
F10 | Schwefel 1.2 | — | [ |
F11 | Schwefel 2.22 | — | [ |
F12 | Schwefel 2.21 | — | [ |
F13 | Sphere | — | [ |
F14 | Step | — | [ |
*In benchmark function F02, the matrix elements
*In benchmark functions F04 and F05, the definition of the function
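As a concrete example of the benchmark functions listed above, the standard Ackley function (F01), whose global minimum is 0 at the origin, can be written as follows (a reference implementation of the widely used definition, not the paper's own code):

```python
import math

def ackley(x):
    """Standard Ackley benchmark; global minimum f(0, ..., 0) = 0."""
    n = len(x)
    sq = sum(xi * xi for xi in x) / n
    cs = sum(math.cos(2.0 * math.pi * xi) for xi in x) / n
    return (-20.0 * math.exp(-0.2 * math.sqrt(sq))
            - math.exp(cs) + 20.0 + math.e)
```

With the 20 independent variables used in this study, the search domain is the hypercube [−32.768, 32.768]^20 given in the properties table below.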
Properties of benchmark functions; lb denotes lower bound, ub denotes upper bound, and opt denotes optimum point.
No. | Function | lb | ub | opt | Continuity | Modality |
---|---|---|---|---|---|---|
F01 | Ackley | −32.768 | 32.768 | 0 | Continuous | Multimodal |
F02 | Fletcher-Powell | −π | π | 0 | Continuous | Multimodal |
F03 | Griewank | −600 | 600 | 0 | Continuous | Multimodal |
F04 | Penalty #1 | −50 | 50 | 0 | Continuous | Multimodal |
F05 | Penalty #2 | −50 | 50 | 0 | Continuous | Multimodal |
F06 | Quartic | −1.28 | 1.28 | 1 | Continuous | Multimodal |
F07 | Rastrigin | −5.12 | 5.12 | 0 | Continuous | Multimodal |
F08 | Rosenbrock | −2.048 | 2.048 | 0 | Continuous | Unimodal |
F09 | Schwefel 2.26 | −512 | 512 | 0 | Continuous | Multimodal |
F10 | Schwefel 1.2 | −100 | 100 | 0 | Continuous | Unimodal |
F11 | Schwefel 2.22 | −10 | 10 | 0 | Continuous | Unimodal |
F12 | Schwefel 2.21 | −100 | 100 | 0 | Continuous | Unimodal |
F13 | Sphere | −5.12 | 5.12 | 0 | Continuous | Unimodal |
F14 | Step | −5.12 | 5.12 | 0 | Discontinuous | Unimodal |
We set population size
Mean normalized optimization results in fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm, averaged over 100 Monte Carlo simulations.
 | ACO | BA | BBO | DE | ES | GA | HS | HSBA | PSO | SGA |
---|---|---|---|---|---|---|---|---|---|---|
F01 | 2.31 | 3.33 | 1.15 | 2.02 | 3.38 | 2.72 | 3.47 | 1.09 | 2.66 | — |
F02 | 24.58 | 25.82 | 1.58 | 8.94 | 24.35 | 5.45 | 15.69 | — | 13.96 | 1.33 |
F03 | 3.16 | 60.72 | 1.93 | 5.44 | 23.85 | 3.22 | 77.22 | — | 25.77 | 1.42 |
F04 | — | 3.0 | 4.0 | 5.6 | 2.7 | 3.1 | 1.4 | 2.3 | 4.1 | 9.6 |
F05 | — | 1.1 | 299.42 | 1.5 | 4.6 | 5.4 | 4.1 | 215.51 | 5.5 | 111.10 |
F06 | 489.01 | 6.8 | 35.32 | 308.29 | 1.8 | 274.83 | 1.5 | — | 2.5 | 10.09 |
F07 | 8.09 | 11.55 | 1.28 | 6.56 | 11.87 | 6.17 | 10.22 | — | 8.44 | 1.80 |
F08 | 42.25 | 29.01 | 2.29 | 7.59 | 59.99 | 9.05 | 47.85 | — | 12.04 | 2.15 |
F09 | 3.17 | 20.26 | 1.99 | 13.58 | 13.33 | 1.81 | 19.92 | — | 17.61 | 1.15 |
F10 | 1.75 | 3.73 | 1.38 | 2.95 | 4.93 | 1.25 | 4.22 | — | 2.48 | 1.48 |
F11 | 1.05 | 19.70 | 1.83 | 7.14 | 23.12 | 11.13 | 19.45 | — | 13.22 | 2.46 |
F12 | 1.86 | 4.03 | — | 2.99 | 3.91 | 1.92 | 3.74 | 1.38 | 2.38 | 1.12 |
F13 | 98.30 | 150.84 | 3.80 | 19.03 | 226.52 | 47.74 | 182.32 | — | 72.91 | 4.02 |
F14 | 7.73 | 120.48 | 3.93 | 13.31 | 102.56 | 11.53 | 146.55 | — | 63.44 | 3.28 |
Time | 2.74 | — | 1.32 | 1.64 | 1.67 | 1.79 | 2.33 | 1.43 | 2.03 | 1.76 |
*The values are normalized so that the minimum in each row is 1.00. These are not the absolute minima found by each algorithm, but the average minima found by each algorithm.
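The row normalization described in this note can be reproduced with a small helper (ours, not from the paper's code): each algorithm's average minimum is divided by the smallest value in the row, so the best performer scores exactly 1.00.

```python
def normalize_row(averages):
    """Scale a row of per-algorithm average minima so that the best
    (smallest) entry becomes exactly 1.00, as in the results tables."""
    best = min(averages)
    if best == 0:
        raise ValueError("normalization undefined when the best value is 0")
    return [round(v / best, 2) for v in averages]
```

For example, a row of averages [2.0, 4.0, 8.0] normalizes to [1.0, 2.0, 4.0].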
Best normalized optimization results in fourteen benchmark functions. The values shown are the minimum objective function values found by each algorithm.
 | ACO | BA | BBO | DE | ES | GA | HS | HSBA | PSO | SGA |
---|---|---|---|---|---|---|---|---|---|---|
F01 | 1.85 | 2.31 | — | 1.43 | 2.25 | 2.03 | 2.30 | 1.05 | 1.91 | 1.04 |
F02 | 10.42 | 14.48 | 1.09 | 4.22 | 10.39 | 4.03 | 9.91 | — | 8.28 | 1.33 |
F03 | 2.73 | 46.22 | 1.87 | 4.47 | 21.38 | 8.02 | 43.24 | — | 16.75 | 1.76 |
F04 | 4.4 | 5.0 | 1.2 | 1.9 | 2.2 | 3.0 | 4.2 | — | 3.755 | 3.22 |
F05 | 3.0 | 3.2 | 51.01 | 386.33 | 1.6 | 884.46 | 2.6 | — | 4.113 | 4.57 |
F06 | 58.51 | 992.55 | 6.50 | 24.69 | 808.48 | 59.24 | 746.91 | — | 189.71 | 2.02 |
F07 | 5.71 | 8.53 | 1.25 | 5.13 | 7.94 | 5.23 | 7.47 | — | 6.00 | 1.68 |
F08 | 26.42 | 25.03 | 1.48 | 3.70 | 33.52 | 6.16 | 21.50 | — | 7.75 | 1.45 |
F09 | 2.43 | 8.66 | 1.28 | 4.90 | 5.93 | 2.09 | 7.28 | — | 7.40 | 1.42 |
F10 | 1.89 | 4.94 | 1.18 | 2.66 | 3.00 | 2.08 | 2.76 | — | 2.01 | 1.74 |
F11 | 7.74 | 12.61 | 1.21 | 3.36 | 12.14 | 6.01 | 9.60 | — | 7.81 | 1.62 |
F12 | 1.33 | 2.33 | 1.44 | 1.69 | 2.08 | 1.74 | 2.11 | — | 1.70 | 1.21 |
F13 | 28.04 | 56.57 | 2.17 | 5.54 | 60.57 | 19.71 | 52.86 | — | 20.98 | 2.28 |
F14 | 4.28 | 54.02 | 1.97 | 4.85 | 33.61 | 10.60 | 49.11 | — | 19.72 | 1.85 |
*The values are normalized so that the minimum in each row is 1.00. These are the absolute best minima found by each algorithm.
From Table
Furthermore, the computational requirements of the ten optimization methods were similar. We collected the average computational time of the optimization methods as applied to the 14 benchmarks discussed in this section. The results are shown in Table
Furthermore, convergence graphs of ACO, BA, BBO, DE, ES, GA, HS, HS/BA, PSO, and SGA are shown in Figures
Comparison of the performance of the different methods for the F01 Ackley function.
Comparison of the performance of the different methods for the F02 Fletcher-Powell function.
Comparison of the performance of the different methods for the F03 Griewank function.
Comparison of the performance of the different methods for the F04 Penalty #1 function.
Comparison of the performance of the different methods for the F05 Penalty #2 function.
Comparison of the performance of the different methods for the F06 Quartic (
Comparison of the performance of the different methods for the F07 Rastrigin function.
Comparison of the performance of the different methods for the F08 Rosenbrock function.
Comparison of the performance of the different methods for the F09 Schwefel 2.26 function.
Comparison of the performance of the different methods for the F10 Schwefel 1.2 function.
Comparison of the performance of the different methods for the F11 Schwefel 2.22 function.
Comparison of the performance of the different methods for the F12 Schwefel 2.21 function.
Comparison of the performance of the different methods for the F13 Sphere function.
Comparison of the performance of the different methods for the F14 Step function.
Figure
Figure
Figure
Figure
Figure
Figure
Figure
Figure
Figure
Figure
Figure
Figure
Figure
Figure
From the above analyses of Figures
In [
Best normalized optimization results in fourteen benchmark functions with different
 | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0 |
---|---|---|---|---|---|---|---|---|---|---|---|
F01 | 1.66 | 1.33 | 1.55 | 1.40 | — | 1.33 | 1.21 | 1.22 | 1.16 | 1.13 | 1.14 |
F02 | 1.25 | — | — | — | — | — | — | — | — | — | — |
F03 | 3.41 | 11.15 | — | — | — | — | — | — | — | — | — |
F04 | 2.73 | — | 1.25 | 1.04 | 1.04 | 1.04 | 1.04 | 1.04 | 1.04 | — | — |
F05 | 5.03 | 5.30 | 1.33 | 1.68 | — | — | — | — | — | — | — |
F06 | — | 2.56 | 5.47 | 4.73 | 17.81 | 9.90 | 2.89 | 6.74 | 9.90 | 2.60 | 5.57 |
F07 | 10.14 | 14.92 | 4.35 | 1.10 | 1.10 | 15.37 | 1.01 | — | — | — | — |
F08 | 38.49 | 1.08 | 14.06 | 1.08 | 1.08 | 1.08 | 99.01 | 1.01 | — | — | — |
F09 | 285.18 | 404.58 | 1.38 | 4.74 | 1.19 | 1.19 | 1.19 | 367.91 | 1.01 | — | — |
F10 | 665.22 | 4.68 | 1.69 | 15.27 | 1.18 | 1.18 | 1.18 | 1.18 | 1.21 | — | — |
F11 | 20.03 | — | 9.96 | 11.54 | 39.22 | 9.96 | 9.96 | 9.96 | 9.96 | 38.97 | 8.49 |
F12 | 10.32 | — | 13.22 | 3.23 | 14.69 | 2.79 | 2.79 | 2.79 | 2.79 | 2.79 | 19.31 |
F13 | 1.46 | 4.15 | 2.84 | 4.47 | 1.15 | 1.56 | — | — | — | — | — |
F14 | 527.52 | 13.57 | 3.95 | 1.01 | 1.15 | 12.91 | — | — | — | — | — |
 | 1 | 4 | 2 | 2 | 3 | 3 | 5 | 6 | 7 | — | — |
Mean normalized optimization results in fourteen benchmark functions with different
 | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0 |
---|---|---|---|---|---|---|---|---|---|---|---|
F01 | 1.01 | 1.01 | — | — | — | — | — | — | — | — | — |
F02 | 1.52 | — | 1.46 | 1.17 | 1.08 | 1.37 | 1.88 | 1.65 | 1.67 | 1.79 | 1.48 |
F03 | 6.69 | 10.41 | 1.06 | 1.01 | 1.03 | 1.03 | 1.01 | 1.02 | 1.05 | — | — |
F04 | 201.31 | 1.26 | 4.10 | 4.21 | 3.33 | 2.72 | — | 5.19 | 2.91 | — | 3.11 |
F05 | 591.22 | 16.82 | 4.25 | 5.87 | 1.01 | 4.14 | 24.04 | — | 3.68 | 7.33 | 9.00 |
F06 | — | 4.90 | 637.99 | 258.71 | 4.95 | 2.18 | 2.17 | 2.17 | 2.18 | 2.17 | 2.17 |
F07 | 8.57 | 2.21 | 6.43 | 272.34 | 110.48 | 2.11 | 1.02 | 1.01 | — | — | — |
F08 | 49.80 | 1.14 | 1.72 | 161.60 | 224.59 | 91.19 | 1.74 | 1.03 | 1.11 | — | — |
F09 | 82.37 | 1.73 | 1.06 | 3.14 | 74.45 | 103.10 | 42.26 | 7.94 | — | 1.37 | 1.29 |
F10 | 90.31 | 1.45 | 2.45 | 2.32 | 23.34 | 22.34 | 31.38 | 12.89 | 2.34 | 1.40 | — |
F11 | 4.98 | — | 3.15 | 2.33 | 14.41 | 520.23 | 443.65 | 616.26 | 249.91 | 4.78 | 2.14 |
F12 | 3.69 | 2.10 | 4.57 | — | 2.12 | 266.35 | 233.13 | 198.80 | 276.14 | 112.01 | 2.14 |
F13 | 1.90 | 8.25 | 4.48 | 2.14 | — | 6.15 | 254.51 | 222.75 | 189.96 | 263.86 | 107.01 |
F14 | 66.91 | 1.95 | 1.17 | 1.24 | 21.04 | 1.86 | 29.40 | 23.92 | — | 18.35 | 24.75 |
 | 1 | 2 | 1 | 2 | 2 | 1 | 2 | 2 | 4 | 5 | 5 |
Best normalized optimization results in fourteen benchmark functions with different
 | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0 |
---|---|---|---|---|---|---|---|---|---|---|---|
F01 | 1.84 | 1.84 | 1.84 | 4.84 | 1.59 | 1.71 | — | 1.34 | 1.09 | 1.16 | 1.01 |
F02 | 9.44 | — | — | — | — | — | — | — | — | — | — |
F03 | 3.73 | 5.46 | 1.86 | 1.86 | 1.86 | 1.86 | — | 1.72 | 1.21 | 1.61 | 1.01 |
F04 | 11.42 | 1.11 | 24.29 | 1.23 | 1.26 | — | 1.29 | 1.29 | 1.29 | 1.29 | 1.29 |
F05 | 1.77 | 2.30 | 1.15 | 1.09 | — | — | — | — | — | — | — |
F06 | 62.44 | 29.96 | 48.58 | 15.60 | 26.34 | 7.80 | 1.82 | — | 35.20 | 2.39 | 1.55 |
F07 | 8.31 | 4.48 | 2.58 | 1.29 | 1.29 | 3.49 | — | — | — | — | — |
F08 | 9.61 | 1.18 | 4.20 | 1.37 | 1.37 | 1.37 | 6.95 | 1.01 | 1.01 | — | — |
F09 | 373.94 | 444.22 | 1.68 | 3.35 | 1.68 | 1.68 | — | 291.98 | 1.01 | 1.01 | 1.01 |
F10 | 386.93 | 3.16 | 13.97 | 4.63 | 1.58 | 1.58 | — | 1.58 | 309.31 | 1.58 | 1.01 |
F11 | 42.55 | — | 18.29 | 21.35 | 20.18 | 11.11 | 19.04 | 21.35 | 15.81 | 18.76 | 11.55 |
F12 | 114.40 | — | 141.84 | 44.48 | 125.47 | 44.48 | 44.48 | 44.48 | 44.48 | 44.48 | 69.43 |
F13 | 6.39 | 6.37 | 6.01 | 2.96 | 3.02 | — | 1.52 | 1.39 | — | 2.57 | 2.49 |
F14 | 120.52 | 4.04 | 2.33 | — | 1.16 | 3.40 | — | — | 1.16 | 1.16 | 1.16 |
 | 0 | 3 | 1 | 2 | 2 | 4 | — | 5 | 4 | 4 | 4 |
Mean normalized optimization results in fourteen benchmark functions with different
 | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0 |
---|---|---|---|---|---|---|---|---|---|---|---|
F01 | 1.84 | 1.99 | 1.86 | 1.94 | 1.59 | — | 1.42 | 1.34 | 1.09 | 1.16 | 1.71 |
F02 | 3.31 | 5.38 | 4.57 | 3.08 | 3.82 | 1.14 | — | 2.89 | 2.79 | 4.02 | 3.11 |
F03 | 3.73 | 5.46 | 5.33 | 2.29 | 3.36 | 2.41 | — | 1.72 | 1.21 | 1.61 | 1.36 |
F04 | 11.42 | 1.11 | 24.29 | 1.49 | 1.13 | 3.94 | 2.78 | 19.66 | 175.22 | 1.14 | — |
F05 | 16.09 | 128.62 | 32.69 | 24.46 | 64.40 | 12.77 | 29.59 | 76.17 | 4.56 | — | 4.46 |
F06 | 62.44 | 26.96 | 48.58 | 15.60 | 26.34 | 7.80 | 1.82 | — | 35.20 | 2.39 | 1.55 |
F07 | 3.30 | 1.78 | 1.34 | 1.03 | 1.23 | 1.39 | 1.55 | — | 1.19 | 1.36 | 3.49 |
F08 | 4.88 | 3.93 | 3.94 | 1.94 | 2.19 | 3.83 | 3.53 | 4.33 | — | 2.27 | 3.14 |
F09 | 1.39 | 1.66 | 1.14 | — | 1.21 | 1.15 | 1.15 | 1.09 | 1.08 | 1.04 | 1.09 |
F10 | 7.21 | 9.94 | 8.75 | 3.60 | 8.46 | 6.11 | 4.83 | 4.14 | 5.76 | — | 2.71 |
F11 | 3.82 | 4.23 | 3.73 | 2.05 | 1.81 | — | 1.71 | 2.20 | 1.42 | 1.68 | 1.89 |
F12 | 1.64 | 2.17 | 2.04 | 1.47 | 1.80 | 1.86 | — | 1.21 | 1.13 | 1.34 | 1.56 |
F13 | 6.39 | 6.37 | 6.01 | 2.96 | 3.02 | 1.90 | 1.52 | 1.39 | — | 2.57 | 2.49 |
F14 | 5.74 | 15.43 | 7.42 | 6.82 | 6.44 | 3.17 | — | 4.85 | 2.21 | 1.74 | 2.09 |
 | 0 | 0 | 0 | 1 | 0 | 2 | — | 2 | 2 | 2 | 1 |
Best normalized optimization results in fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of HS/BA algorithm.
HMCR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0 |
---|---|---|---|---|---|---|---|---|---|---|---|
F01 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.64 | 1.51 | 1.28 | — | 1.63 |
F02 | 1.87 | — | — | — | — | — | — | — | — | — | — |
F03 | 19.82 | 26.49 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 3.71 | 2.06 | — |
F04 | 1.26 | — | 1.87 | — | — | — | — | — | — | — | — |
F05 | 6.07 | 5.34 | — | 1.87 | — | — | — | — | — | — | — |
F06 | 108.92 | 246.31 | 365.04 | 345.40 | 338.57 | 234.65 | 143.49 | 45.28 | 23.30 | — | 25.75 |
F07 | 9.22 | 14.28 | 5.34 | — | — | 9.30 | — | — | — | — | — |
F08 | 18.10 | 1.08 | 7.72 | 1.08 | 1.08 | 1.08 | 35.94 | — | — | — | — |
F09 | 280.04 | 391.61 | 1.27 | 6.81 | 1.27 | 1.27 | 1.27 | 227.95 | — | — | — |
F10 | 551.65 | 8.76 | 1.76 | 11.70 | 1.64 | 1.64 | 1.64 | 1.64 | 274.48 | — | — |
F11 | 14.67 | — | 6.18 | 6.18 | 13.99 | 6.18 | 6.18 | 6.18 | 4.40 | 2.69 | 6.16 |
F12 | 7.68 | — | 11.10 | 2.73 | 10.21 | 2.73 | 2.73 | 2.73 | 2.73 | 2.73 | 4.73 |
F13 | 7.72 | 26.75 | 15.90 | 17.76 | 6.12 | 10.40 | 6.12 | 5.95 | 2.55 | — | 2.25 |
F14 | 537.16 | 14.28 | 5.34 | — | — | 7.13 | — | — | — | — | — |
 | 0 | 4 | 2 | 4 | 5 | 3 | 5 | 6 | 7 | — | — |
Mean normalized optimization results in fourteen benchmark functions with different HMCR. The numbers shown are the best results found after 100 Monte Carlo simulations of HS/BA algorithm.
HMCR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0 |
---|---|---|---|---|---|---|---|---|---|---|---|
F01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | 1.01 | — | 1.01 |
F02 | 747.56 | 7.87 | 5.25 | 3.43 | 5.09 | 7.10 | 3.07 | 1.87 | 1.60 | — | 1.01 |
F03 | 9.13 | 3.12 | 1.09 | 1.07 | 1.05 | 1.06 | 1.05 | 1.02 | 1.01 | 1.01 | — |
F04 | — | — | — | — | — | — | — | — | 46.94 | — | 2.37 |
F05 | — | — | — | — | — | — | — | — | 629.48 | — | 203.73 |
F06 | — | — | 422.04 | 632.15 | — | 1.92 | 1.91 | 1.91 | 1.91 | 1.91 | 1.91 |
F07 | 10.63 | — | 9.00 | 216.72 | 324.55 | 3.07 | 1.04 | 1.03 | 1.02 | 1.01 | — |
F08 | 59.50 | — | 3.04 | 141.80 | 217.01 | 324.68 | 3.07 | 1.07 | 1.07 | 1.03 | — |
F09 | 226.87 | — | 3.08 | 8.91 | 114.22 | 173.60 | 259.41 | 2.45 | 1.45 | — | 1.71 |
F10 | 257.37 | — | 9.35 | 1.37 | 97.09 | 66.56 | 99.25 | 149.40 | 1.39 | — | 2.73 |
F11 | 6.07 | — | 1.83 | 2.02 | 16.61 | 390.40 | 262.99 | 403.00 | 603.61 | 5.72 | 1.84 |
F12 | 3.00 | 2.79 | 3.60 | — | 2.82 | 270.95 | 194.14 | 130.77 | 200.40 | 300.16 | 2.84 |
F13 | 2.32 | 9.66 | 5.61 | 1.89 | — | 8.15 | 267.78 | 191.86 | 129.23 | 198.05 | 296.65 |
F14 | 184.66 | 3.84 | 1.17 | 1.85 | — | 5.71 | 25.11 | 55.39 | 39.60 | 26.57 | 41.49 |
 | 1 | 1 | 0 | 1 | 2 | 0 | 0 | 0 | 0 | — | — |
Best normalized optimization results in fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of HS/BA algorithm.
PAR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0 |
---|---|---|---|---|---|---|---|---|---|---|---|
F01 | — | — | — | — | — | — | — | — | — | — | — |
F02 | 3.91 | — | — | — | — | — | — | — | — | — | — |
F03 | — | 4.25 | 1.51 | 2.20 | 2.20 | 2.12 | 2.20 | 1.99 | 2.20 | 1.28 | 1.80 |
F04 | — | 2.55 | 65.54 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 |
F05 | 2.52 | — | 1.71 | 1.69 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71 | 1.71 |
F06 | — | 59.87 | 34.18 | 22.85 | 46.52 | 23.44 | 43.54 | 44.84 | 36.88 | 18.99 | 8.12 |
F07 | 8.07 | — | 1.48 | 2.55 | 2.55 | 13.06 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 |
F08 | 5.43 | — | 1.92 | — | — | — | 11.45 | — | — | — | — |
F09 | 76.72 | 2.52 | 1.17 | — | 1.71 | 1.71 | 1.71 | 155.56 | 1.71 | 1.71 | 1.71 |
F10 | 929.10 | 1.48 | — | 4.91 | 2.55 | 2.55 | 2.55 | 2.22 | 2.35 | 2.55 | 2.55 |
F11 | 3.18 | — | 5.82 | 3.99 | 3.39 | 5.82 | 4.91 | 5.82 | 5.82 | 9.58 | 5.82 |
F12 | 305.06 | — | 537.28 | 97.34 | 187.58 | 97.34 | 97.34 | 97.34 | 97.34 | 97.34 | 398.27 |
F13 | 1.92 | 4.22 | 5.87 | 10.07 | 12.98 | 4.66 | 1.48 | 4.01 | 3.17 | — | 4.09 |
F14 | 88.12 | — | 1.48 | 2.55 | 2.55 | 4.91 | 2.55 | 2.55 | 2.55 | 2.55 | 2.55 |
 | 4 | — | 3 | 4 | 3 | 3 | 2 | 3 | 3 | 4 | 3 |
Mean normalized optimization results in fourteen benchmark functions with different PAR. The numbers shown are the best results found after 100 Monte Carlo simulations of HS/BA algorithm.
PAR | 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1.0 |
---|---|---|---|---|---|---|---|---|---|---|---|
F01 | — | — | — | — | — | — | — | — | — | — | — |
F02 | 4.07 | — | — | 4.52 | 2.77 | 2.57 | 3.41 | 5.36 | 3.69 | 4.89 | 1.03 |
F03 | — | 1.27 | 1.34 | 1.35 | 1.35 | 1.35 | 1.34 | 1.35 | 1.34 | 1.34 | 1.34 |
F04 | 1.15 | 81.27 | 9.39 | — | 1.01 | 1.01 | 129 | — | — | 1.01 | 1.01 |
F05 | 345.73 | 50.61 | 86.19 | 9.07 | 25.01 | 1.08 | 1.01 | 1.91 | 3.33 | — | 1.18 |
F06 | — | 5.42 | 4.27 | 4.74 | 5.48 | 577.88 | 577.88 | 577.88 | 577.88 | 577.87 | 577.87 |
F07 | 4.86 | 1.53 | — | 95.43 | 105.96 | 1.22 | 1.34 | 1.36 | 1.32 | 1.33 | 1.35 |
F08 | 12.69 | 76.00 | 8.76 | 17.03 | 69.13 | 76.72 | 8.85 | 1.10 | 1.05 | 1.04 | — |
F09 | 63.45 | 230.47 | 1.82 | 1.67 | 12.78 | 48.60 | 53.49 | 6.09 | 1.11 | 1.30 | — |
F10 | 133.55 | 12.51 | 1.26 | 1.97 | 11.48 | 4.64 | 16.45 | 18.93 | 1.99 | — | 1.88 |
F11 | 63.02 | — | 6.39 | 79.44 | 59.24 | 3.96 | 1.42 | 5.81 | 6.45 | 7.45 | 79.81 |
F12 | 3.90 | 8.81 | 8.90 | — | 8.90 | 44.40 | 47.78 | 17.25 | 70.11 | 77.84 | 8.99 |
F13 | — | 21.66 | 2.07 | 6.69 | 5.80 | 4.30 | 271.67 | 282.46 | 105.39 | 429.26 | 476.62 |
F14 | 46.14 | — | 36.10 | 55.93 | 1.22 | 6.40 | 42.15 | 32.89 | 34.81 | 12.72 | 51.29 |
 | — | — | 3 | 3 | 1 | 1 | 1 | 2 | 2 | 3 | 3 |
Tables
Tables
Tables
Tables
In the HS/BA, the bats fly in the sky using echolocation to find food/prey (i.e., best solutions). Four other parameters are the loudness (
For all of the standard benchmark functions considered, the HS/BA has been demonstrated to perform better than, or on par with, the standard BA and other acclaimed state-of-the-art population-based algorithms, and significantly better on some functions. The HS/BA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time: it runs the local search of the harmony search algorithm and the global search of the bat algorithm concurrently. A similar behavior may be obtained in the PSO by using multiswarms from a particle population initially [
Benchmark evaluation is a good way to verify the performance of metaheuristic algorithms, but it also has limitations. First, we made no special effort to tune the optimization algorithms in this section; different tuning parameter values might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might lead to different conclusions if the grading criteria or problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if, for example, we changed the generation limit, looked at how many generations it takes to reach a certain function value, or changed the population size. In spite of these caveats, the benchmark results shown here are promising for HS/BA and indicate that this novel method might find a niche among the plethora of population-based optimization algorithms.
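The grading scheme described here, mean and best results over repeated independent runs followed by normalization across algorithms, can be sketched as follows. The `run_once` interface and the min-based normalization are assumptions about how such tables are typically produced, not the paper's exact harness.

```python
import numpy as np

def monte_carlo_stats(run_once, runs=100):
    """Mean and best final objective values over repeated independent runs.
    `run_once(seed)` is assumed to return one final fitness value."""
    vals = np.array([run_once(seed) for seed in range(runs)])
    return float(vals.mean()), float(vals.min())

def normalize_scores(per_algorithm_means):
    """Divide each algorithm's mean by the best (smallest) mean, so the
    winner scores 1.00; one plausible reading of the normalized tables."""
    m = np.asarray(per_algorithm_means, dtype=float)
    return m / m.min()
```

Under this reading, a table entry of 1.00 marks the best algorithm on that function, and larger entries measure how many times worse the others were on average.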
We note that CPU time is a bottleneck to the implementation of many population-based optimization algorithms. If an algorithm cannot converge fast, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HS/BA does not seem to require an unreasonable amount of computational effort; of the ten optimization algorithms compared in this paper, HS/BA was the third fastest. Nevertheless, finding mechanisms to accelerate HS/BA could be an important area for further research.
This paper proposed a hybrid metaheuristic HS/BA method for optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm and evaluated the HS/BA on multimodal numerical optimization problems. A novel type of BA model has been presented, in which the harmony search algorithm mutates between bats during the bat-updating process. Using the original configuration of the bat algorithm, we generate a new harmony based on the newly generated bat in each iteration, after the bat’s position has been updated. The new harmony vector substitutes the newly generated bat only if it has better fitness. This selection scheme is rather greedy and often outperforms the original HS and BA. The HS/BA attempts to combine the merits of the BA and the HS in order to avoid all bats becoming trapped in inferior local optima. The HS/BA gives the bats more diverse exemplars to learn from, as the bats are updated every iteration and also form new harmonies that search a larger space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HS/BA uses the information in past solutions more effectively to generate better-quality solutions more frequently than the other population-based optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. Based on the results of the ten approaches on the test problems, we conclude that the HS/BA significantly improves the performance of the HS and the BA on most multimodal and unimodal problems.
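The hybrid update described above, in which a harmony is built from the bat memory after each bat's position update, pitch-adjusted with probability PAR, and kept only if it improves the bat's fitness, can be sketched as follows. The function names and the concrete HMCR, PAR, and bandwidth values are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def harmony_mutation(pos, i, HMCR=0.9, PAR=0.3, bw=0.2,
                     lower=-5.0, upper=5.0, rng=None):
    """Build a new harmony for bat i, treating the bat positions as the
    harmony memory: each dimension is taken from a random bat with
    probability HMCR and pitch-adjusted with probability PAR; otherwise
    it is redrawn uniformly from the (assumed) bounds."""
    if rng is None:
        rng = np.random.default_rng()
    n, d = pos.shape
    harmony = pos[i].copy()
    for j in range(d):
        if rng.random() < HMCR:
            harmony[j] = pos[rng.integers(n), j]        # consider the memory
            if rng.random() < PAR:                      # pitch adjustment = mutation
                harmony[j] += bw * (2.0 * rng.random() - 1.0)
        else:
            harmony[j] = rng.uniform(lower, upper)      # random note
    return harmony

def greedy_update(pos, fit, objective, rng=None):
    """Greedy selection described above: the new harmony replaces the
    newly updated bat only if it has better fitness."""
    if rng is None:
        rng = np.random.default_rng()
    for i in range(len(pos)):
        h = harmony_mutation(pos, i, rng=rng)
        fh = objective(h)
        if fh < fit[i]:
            pos[i], fit[i] = h, fh
```

Because replacement only ever improves a bat, the population's total fitness is monotonically non-increasing under this step, which is the greediness noted above.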
In this work, 14 benchmark functions were used to evaluate the performance of our approach; we will test it on more problems, such as the high-dimensional (
In the field of optimization there are many issues worthy of further study, and efficient optimization methods should be developed based on analysis of the specific engineering problem. Our future work will focus on two issues: on the one hand, we will apply the proposed HS/BA to practical engineering optimization problems, where it can become a compelling method; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.
This work was supported by the State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and by Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.