This paper describes a new variant of the harmony search algorithm inspired by the well-known concept of "elite decision making." In the new algorithm, the good information captured in the current global best and second best solutions is exploited to generate new solutions according to a probability rule. A newly generated solution vector replaces the worst solution in the solution set only if its fitness is better than that of the worst solution. The generating and updating steps are repeated until a near-optimal solution vector is obtained. Extensive computational comparisons are carried out on various standard benchmark optimization problems from the literature, including minimization problems with continuous design variables and with integer variables. The computational results show that the proposed algorithm is competitive with state-of-the-art harmony search variants.
In 2001, Geem et al. [
The HS algorithm is good at identifying high-performance regions of the solution space in a reasonable time but struggles to perform local search in numerical applications. In order to improve the fine-tuning characteristic of the HS algorithm, Mahdavi et al. [
In political science and sociology, "elite decision making" refers to the observation that a small minority (the elite) holds most of the power in making decisions. One could imagine that the good information captured in the current elite harmonies can likewise be exploited to generate new harmonies. Thus, in our elite decision making HS (EDMHS) algorithm, a new harmony is randomly generated between the best and the second best harmonies in the historic HM, following a probability rule. The generated harmony vector replaces the worst harmony in the HM only if its fitness (measured in terms of the objective function) is better than that of the worst harmony. These generating and updating procedures are repeated until a near-optimal solution vector is obtained. To demonstrate the effectiveness and robustness of the proposed algorithm, various benchmark optimization problems, including continuous-variable and integer-variable minimization problems, are used. Numerical results reveal that the proposed algorithm is very effective.
This paper is organized as follows. In Section
In the whole paper, the optimization problem is specified as follows:
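The display equation for the problem statement is not reproduced in this copy. A generic statement of such a bound-constrained minimization problem, in the notation commonly used in the HS literature (the symbol names here are our choice, not taken verbatim from the source), is:

```latex
\min_{\mathbf{x}}\; f(\mathbf{x}), \qquad
\mathbf{x} = (x_1, x_2, \ldots, x_N), \qquad
x_i \in [\,LB_i,\; UB_i\,], \quad i = 1, 2, \ldots, N,
```

where \(f\) is the objective function to be minimized and \(LB_i\), \(UB_i\) are the lower and upper bounds of the \(i\)th decision variable.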
The general HS algorithm requires several parameters as follows:
HMCR, PAR, and bw are very important factors for the high efficiency of HS methods and are potentially useful in adjusting the convergence rate of the algorithm toward the optimal solution. These parameters are introduced to allow the solution to escape from local optima and to improve the global optimum prediction of the HS algorithm.
The procedure for a harmony search, which consists of Steps
Create and randomly initialize an HM of size HMS. The HM matrix is initially filled with as many solution vectors as the HMS. Each component of a solution vector is generated using a uniformly distributed random number between the lower and upper bounds of the corresponding decision variable.
The HM with the size of HMS can be represented by a matrix as
Improvise a new harmony from the HM or from the entire possible range. After the HM is defined, improvisation is performed by generating a new harmony vector
Update the HM. If the new harmony is better than the worst harmony in the HM, include the new harmony into the HM and exclude the worst harmony from the HM.
Repeat Steps
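The steps above can be sketched in Python. This is an illustrative minimal implementation of basic HS, not the authors' code; the default parameter values (HMS = 5, HMCR = 0.9, PAR = 0.3, bw = 0.01) are typical settings from the HS literature rather than values taken from this paper:

```python
import random

def harmony_search(f, bounds, HMS=5, HMCR=0.9, PAR=0.3, bw=0.01, NI=20000):
    """Basic harmony search: minimize f over the box constraints `bounds`."""
    n = len(bounds)
    # Step 1: fill the harmony memory (HM) with HMS random solution vectors.
    hm = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(HMS)]
    fitness = [f(x) for x in hm]
    for _ in range(NI):
        # Step 2: improvise a new harmony.
        new = []
        for i, (lo, hi) in enumerate(bounds):
            if random.random() < HMCR:              # memory consideration
                xi = hm[random.randrange(HMS)][i]
                if random.random() < PAR:           # pitch adjustment
                    xi = min(max(xi + random.uniform(-bw, bw), lo), hi)
            else:                                   # random selection
                xi = random.uniform(lo, hi)
            new.append(xi)
        # Step 3: replace the worst harmony if the new one is better.
        worst = max(range(HMS), key=lambda k: fitness[k])
        fx = f(new)
        if fx < fitness[worst]:
            hm[worst], fitness[worst] = new, fx
    # Step 4 is the loop above; return the best harmony found.
    best = min(range(HMS), key=lambda k: fitness[k])
    return hm[best], fitness[best]
```

For example, `harmony_search(lambda v: sum(t * t for t in v), [(-5.0, 5.0)] * 2)` minimizes a two-dimensional sphere function.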
To improve the performance of the HS algorithm and eliminate the drawbacks associated with fixed values of PAR and bw, Mahdavi et al. [
Numerical results reveal that the HS algorithm with variable parameters can find better solutions when compared to HS and other heuristic or deterministic methods and is a powerful search algorithm for various engineering optimization problems, see [
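The dynamic parameter scheme of IHS replaces the fixed PAR and bw with generation-dependent values: PAR grows linearly and bw decays exponentially with the generation number gn over NI total improvisations. A sketch of these updates (the default minimum/maximum values below are illustrative, not taken from this paper):

```python
import math

def ihs_par(gn, NI, par_min=0.01, par_max=0.99):
    """IHS: pitch adjusting rate grows linearly with the generation number."""
    return par_min + (par_max - par_min) * gn / NI

def ihs_bw(gn, NI, bw_min=1e-4, bw_max=1.0):
    """IHS: bandwidth decays exponentially from bw_max down to bw_min."""
    return bw_max * math.exp(math.log(bw_min / bw_max) * gn / NI)
```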
In 2008, Omran and Mahdavi [
In 2010, Pan et al. [
The key differences between the proposed EDMHS algorithm and IHS, GHS, and SGHS lie in the way the new harmony is improvised.
The EDMHS has exactly the same steps as the IHS with the exception that Step
In this step, a new harmony vector
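The EDMHS improvisation described above can be sketched as follows. This is our reading of the scheme (with probability HMCR a component is drawn uniformly between the corresponding components of the best and second best harmonies, optionally pitch-adjusted; otherwise it is drawn from the whole range), not the authors' exact code:

```python
import random

def edmhs_improvise(best, second, bounds, HMCR=0.9, PAR=0.3, bw=0.01):
    """Improvise a new harmony from the two elite harmonies in the HM."""
    new = []
    for i, (lo, hi) in enumerate(bounds):
        if random.random() < HMCR:
            # Elite memory consideration: draw between the best and
            # second best components.
            a, b = sorted((best[i], second[i]))
            xi = random.uniform(a, b)
            if random.random() < PAR:   # pitch adjustment
                xi = min(max(xi + random.uniform(-bw, bw), lo), hi)
        else:
            xi = random.uniform(lo, hi)  # random selection from the range
        new.append(xi)
    return new
```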
Many realworld applications require the variables to be integers. Methods developed for continuous variables can be used to solve such problems by rounding off the real optimum values to the nearest integers [
In the EDMHS algorithm for integer programming, we generate integer solution vectors in both the initialization and the improvisation steps; that is, each component of the new harmony vector is generated according to
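A minimal sketch of the integer-variable handling, following the rounding approach mentioned above (our illustration, not the paper's exact rule):

```python
import random

def random_integer_harmony(bounds):
    """Generate an integer solution vector component by component."""
    return [random.randint(lo, hi) for lo, hi in bounds]

def round_to_integer(x, bounds):
    """Round a continuous harmony to the nearest feasible integer vector."""
    return [min(max(int(round(v)), lo), hi) for v, (lo, hi) in zip(x, bounds)]
```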
This section examines the performance of the EDMHS algorithm on examples with continuous and integer variables. Several examples taken from the optimization literature are used to show the validity and effectiveness of the proposed algorithm. The parameters for all the algorithms are given as follows:
For the following five examples, we adopt the same variable ranges as presented in [
Consider the following:
Four HS algorithms for Rosenbrock function.

| Variable | IHS | GHS | SGHS | EDMHS |
| x1 | 1.0000028617324386 | 0.9913653798835682 | 1.0000082201314386 | 0.9999992918896516 |
| x2 | 1.0000062226347253 | 0.9837656861940776 | 1.0000169034081148 | 0.9999985841159521 |
| f(x) | 0.0000000000331057 | 0.0001667876726056 | 0.0000000000890147 | |
Consider the following:
Four HS algorithms for Goldstein and Price function I.

| Variable | IHS | GHS | SGHS | EDMHS |
| x1 | 0.0000043109765698 | −0.0108343859912985 | −0.0000010647548017 | −0.0000022210968748 |
| x2 | −0.9999978894568922 | −1.0091267108154769 | −1.0000037827893109 | −1.0000008657021768 |
| f(x) | 3.0000000046422932 | 3.0447058568657721 | 3.0000000055974083 | |
Consider the following:
Four HS algorithms for Eason and Fenton's gear train inertia function.

| Variable | IHS | GHS | SGHS | EDMHS |
| x1 | 1.7434541913586368 | 1.7131403370902785 | 1.7434648607226395 | 1.7434544417399731 |
| x2 | 2.0296978640691021 | 2.0700437540873073 | 2.0296831598594332 | 2.0296925490097708 |
| f(x) | 1.7441520055927637 | 1.7447448145676987 | | 1.7441520055905921 |
Consider the following:
Four HS algorithms for Wood function.

| Variable | IHS | GHS | SGHS | EDMHS |
| x1 | 0.9367413185752959 | 0.9993702652662329 | 0.9917327966129160 | 1.0001567183702584 |
| x2 | 0.8772781982936317 | 0.9987850979456709 | 0.9835814785067265 | 1.0003039053776117 |
| x3 | 1.0596918740170123 | 0.9993702652662329 | 1.0081526992384837 | 0.9998357549633209 |
| x4 | 1.1230215213184420 | 0.9987850979456709 | 1.0164353912102084 | 0.9996725376532794 |
| f(x) | 0.0136094062872233 | 0.0000602033138483 | 0.0002433431550602 | |
Consider the following:
Four HS algorithms for Powell quartic function.

| Variable | IHS | GHS | SGHS | EDMHS |
| x1 | −0.0383028653671760 | −0.0256621703960072 | 0.0334641210434073 | −0.0232662093056917 |
| x2 | 0.0038093414837046 | 0.0023707007810820 | −0.0033373644857512 | 0.0023226342970439 |
| x3 | −0.0195750968208506 | −0.0199247989791340 | 0.0159748222727847 | −0.0107227792768697 |
| x4 | −0.0195676609811871 | −0.0199247989791340 | 0.0160018633328343 | −0.0107574107951817 |
| f(x) | 0.0000046821615160 | 0.0000070109937353 | 0.0000024921236096 | |
It can be seen from Tables
Convergence of Rosenbrock function.
Convergence of Goldstein and Price function I.
Convergence of Eason and Fenton function.
Convergence of Wood function.
Convergence of Powell quartic function.
To test the performance of the proposed EDMHS algorithm more extensively, we proceed to evaluate and compare the IHS, GHS, SGHS, and EDMHS algorithms on the following six benchmark optimization problems listed in CEC2005 [
Sphere function:
Schwefel problem:
Griewank function:
Rastrigin function:
Ackley’s function:
Rosenbrock’s function:
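The six benchmarks can be written in their classical forms as follows. Note that the CEC2005 suite uses shifted (and in some cases rotated) variants of these functions, so the unshifted definitions below are illustrative; "Schwefel problem" is taken here to mean Schwefel Problem 1.2, which is an assumption on our part:

```python
import math

def sphere(x):
    """Sphere function: sum of squares."""
    return sum(v * v for v in x)

def schwefel_1_2(x):
    """Schwefel Problem 1.2: sum of squared partial sums."""
    total, partial = 0.0, 0.0
    for v in x:
        partial += v
        total += partial * partial
    return total

def griewank(x):
    """Griewank function: quadratic bowl with oscillatory product term."""
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return s - p + 1.0

def rastrigin(x):
    """Rastrigin function: highly multimodal cosine-modulated bowl."""
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def ackley(x):
    """Ackley's function: nearly flat outer region, central peak at 0."""
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2.0 * math.pi * v) for v in x) / n
    return -20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20.0 + math.e

def rosenbrock(x):
    """Rosenbrock's function: curved narrow valley, minimum at (1, ..., 1)."""
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1.0) ** 2
               for i in range(len(x) - 1))
```

All six have a global minimum value of 0 (at the origin, except Rosenbrock's, whose minimizer is the all-ones vector).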
The parameters for the IHS algorithm,
Table
AE and SD generated by the compared algorithms. The table reports the AE and SD values for each of IHS, GHS, SGHS, and EDMHS on the Sphere, Schwefel, Griewank, Rastrigin, Ackley, and Rosenbrock problems.
Six commonly used integer programming benchmark problems are chosen to investigate the performance of the EDMHS integer algorithm. For all the examples, the design variables,
Consider the following:
Consider the following:
Consider the following:
Consider the following:
Consider the following:
Consider the following:
This paper presented an EDMHS algorithm for solving continuous and integer optimization problems. The proposed EDMHS algorithm applies a newly designed scheme to generate candidate solutions so as to benefit from the good information inherent in the best and the second best solutions in the historic HM.
Further work is needed to investigate the effectiveness of EDMHS and to adapt this strategy to real-world optimization problems.
This research is supported by a grant from the National Natural Science Foundation of China (no. 11171373) and a grant from the Natural Science Foundation of Zhejiang Province (no. LQ12A01024).