Structured population in evolutionary algorithms (EAs) is an important research track in which an individual interacts only with its neighboring individuals in the breeding step. The main rationale is to maintain a high level of diversity and thereby overcome genetic drift. Cellular automata concepts have been embedded into the EA process to provide a decentralized method that preserves the population structure. Harmony search (HS) is a recent EA that considers all individuals in the breeding step. In this paper, cellular automata concepts are embedded into the HS algorithm to produce a new version called cellular harmony search (cHS). In cHS, the population is arranged as a two-dimensional toroidal grid, where each individual in the grid is a cell that interacts only with its neighbors. The memory consideration and population update operators are modified according to cellular EA theory. Experimental results on benchmark functions show that embedding cellular automata concepts into the HS process directly improves performance. Finally, a parameter sensitivity analysis of the cHS variant is presented, and a comparative evaluation shows the success of cHS.
Optimization techniques navigate the search space using effective operators driven by control parameters. The key to the success of any optimization method is its ability to strike a suitable balance between exploration (diversification) and exploitation (intensification) of the problem search space [
Harmony search (HS) algorithm is a recent evolutionary algorithm (EA) proposed by Geem et al. [
In a procedural context, HS, which is an iterative improvement algorithm, initiates with a population of random individuals stored in harmony memory (HM). At each iteration, a new individual is generated based on three operators: (i) memory consideration, which selects the values of decision variables from the individuals stored in the HM; (ii) pitch adjustment, which locally adjusts some of the selected values; and (iii) random consideration, which selects values at random from the feasible range.
Like other EAs, the HS algorithm interacts with all individuals in the HM during each breeding step. The update process selects the worst individual from a single HM and replaces it with the new one, if better. Decentralized methods used in other structured EAs have been shown to improve EA performance. Examples include the cellular genetic algorithm (cGA) [
The cellular genetic algorithm (cGA), in particular, is a decentralized method in which the population is represented as a two-dimensional toroidal grid, as shown in Figure
The main objective of this paper is to embed the cellular automata concepts in the HS algorithm optimization framework, where the HM is arranged as a two-dimensional toroidal grid. Population diffusion is thereby expected to be preserved, maintaining a high level of diversity during the search and thus avoiding genetic drift. The improvisation process of the HS algorithm is adjusted to interact with the neighborhood of a specific individual, and the HM updating process is performed within that individual's neighborhood. The results show that the new decentralized version of HS (i.e., cHS) improves the performance of HS on standard benchmark functions.
The rest of the paper is organized as follows. The basics of HS algorithm are described in Section
The harmony search (HS) algorithm is a recent evolutionary approach proposed by Geem et al. [
The flowchart of the HS algorithm.
The optimization problem is initially represented as
The harmony memory consideration rate (HMCR), used in the breeding step to determine whether the value of a decision variable is to be selected from the individuals stored in the harmony memory (HM).
The harmony memory size (HMS), which determines the number of individuals in the HM.
The pitch adjustment rate (PAR), which is used to decide the adjustments of some decision variables selected from memory.
The distance bandwidth (BW), which determines the distance of the adjustment applied to the individual in the pitch adjustment operator.
The number of improvisations (NI), which is analogous to the number of generations.
The harmony memory (HM) is a matrix of size
The HS algorithm generates a new individual,
In pitch adjustment, if the decision for
If the new individual,
The HS algorithm will repeat Steps
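The improvisation loop described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the parameter defaults, the minimization objective, and the function name are assumptions made here for demonstration.

```python
import random

def harmony_search(f, lb, ub, dim, hms=10, hmcr=0.9, par=0.3, bw=0.01, ni=2000):
    """Minimal sketch of the basic HS loop: memory consideration,
    pitch adjustment, random consideration, and HM update."""
    # Initialize the HM with random individuals and evaluate them.
    hm = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(hms)]
    fit = [f(x) for x in hm]
    for _ in range(ni):
        new = []
        for i in range(dim):
            if random.random() < hmcr:
                # Memory consideration: take the i-th value of a random individual.
                v = hm[random.randrange(hms)][i]
                if random.random() < par:
                    # Pitch adjustment: move the value within distance BW.
                    v = min(max(v + random.uniform(-1, 1) * bw, lb), ub)
            else:
                # Random consideration: draw from the full feasible range.
                v = random.uniform(lb, ub)
            new.append(v)
        # HM update: replace the worst individual if the new one is better.
        worst = max(range(hms), key=fit.__getitem__)
        fv = f(new)
        if fv < fit[worst]:
            hm[worst], fit[worst] = new, fv
    return min(fit)
```

For example, minimizing the 2D sphere function over [−5, 5] with these defaults drives the best objective value close to zero.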
There has been considerable interest among researchers and scientists from different fields in exploiting cellular automata (CA) in physics, biology, social science, computer science, and so on. The initial concepts of CA were developed by von Neumann [
The concepts of cellular automata (CA) are normally concerned with the individual's perspective. The main idea of CA is to provide a population with a particular structure, formulated as a toroidal grid. A cell in the toroidal grid refers to an individual that communicates with its closest neighboring individuals, so that all individuals have exactly the same number of neighbors. This leads to a kind of locality known as
There exist two different kinds of cellular models, depending on how the breeding cycle is applied to the individuals. To put it differently, if the cycle is applied to all individuals at the same time, the cellular model is said to be synchronous: the individuals of the next generation are built simultaneously. If, instead, the individuals of the population are updated sequentially according to a particular order policy, the cellular model is asynchronous. For more discussion of the theory of cellular automata, relevant papers can be found in [
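The distinction between update policies can be made concrete with a small sketch. The policy names here ("line" sweep and "uniform" random choice) are common conventions in the cellular-EA literature, assumed for illustration rather than taken from this paper:

```python
import random

def sweep_order(rows, cols, policy="line"):
    """Order in which cells are visited in one asynchronous generation.
    'line' sweeps the grid row by row; 'uniform' picks cells at random.
    A synchronous model would instead compute every offspring from the
    same frozen grid before writing any of them back."""
    cells = [(r, c) for r in range(rows) for c in range(cols)]
    if policy == "line":
        return cells
    if policy == "uniform":
        return [random.choice(cells) for _ in cells]
    raise ValueError(policy)
```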
The cellular harmony search (cHS) algorithm can be considered a new decentralized variation of HS that hinges on a structured HM. The individuals in the HM are arranged in the form of a two-dimensional toroidal grid. This is meant to keep a high level of diversity during the breeding step and thus increase the chance of converging to the global minimum, achieved by avoiding genetic drift and providing more suitable population diffusion during the search.
Some steps of the cHS algorithm are modified with respect to the original version of the HS algorithm presented in Section
Cell: the selected random individual in the population (the number of individuals is HMS).
Cell space: the set of the whole individuals in HM.
Neighborhood: the set of potential mates of any individual.
Neighborhood shapes: the way of selecting the neighborhoods of the cell as seen in Figure
Discrete time limit: the number of generations in HS algorithm which is normally determined by NI parameter.
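Under these definitions, the neighborhood of a cell on a toroidal grid can be computed as in the sketch below. The concrete shapes (the 4-neighbor NEWS/von Neumann shape, the 8-neighbor Moore shape, and a 12-neighbor extension) are common cellular-EA conventions assumed here; they may differ in detail from the shapes in the paper's figure:

```python
def neighbors(row, col, rows, cols, nh=4):
    """Cells in the neighborhood of (row, col) on a rows x cols toroidal grid.
    nh=4 gives the von Neumann (NEWS) shape, nh=8 the Moore shape, and
    nh=12 additionally includes the cells two steps away along each axis."""
    offs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    if nh >= 8:
        offs += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    if nh >= 12:
        offs += [(-2, 0), (2, 0), (0, -2), (0, 2)]
    # The modulo wrap makes the grid toroidal: edge cells also have nh neighbors.
    return [((row + dr) % rows, (col + dc) % cols) for dr, dc in offs]
```

Note how the wrap-around guarantees the property stated above: every individual, including those on the border, has exactly the same number of neighbors.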
The flowchart of cHS.
The population arrangement based on cellular structure where each cell is an individual in HM.
The detailed steps of cHS are discussed in the steps below.
It is clear that the successful search of any metaheuristic method depends on skillful parameter setting. The parameters have different effects on the optimization solutions. The parameters of cHS are the harmony memory size (HMS), harmony memory consideration rate (HMCR), pitch adjustment rate (PAR), number of improvisations (NI), and the size of the neighborhood (NH) determined by the cellular structure (see Figure
This step is the same as Step
The individuals of HM mapped into the toroidal mesh
Neighboring shapes [
NM is a binary matrix of size
The
Figure
To map the element
To map the index of the individual
This mapping mechanism between the individual index of HM and the elements in
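Assuming a row-major layout (an assumption for illustration; the paper's exact mapping formula is not reproduced here), the mapping between an HM row index and its toroidal grid cell can be written as:

```python
def index_to_cell(j, cols):
    """Map the HM row index j (0-based) to its (row, col) cell, row-major."""
    return divmod(j, cols)

def cell_to_index(row, col, rows, cols):
    """Map a cell back to its HM row index, wrapping toroidally so that
    coordinates outside the grid (e.g., row = -1) land on the opposite edge."""
    return (row % rows) * cols + (col % cols)
```

For example, on a 4 x 5 grid, HM index 7 corresponds to cell (1, 2), and cell (−1, 0) wraps to the bottom row, index 15.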
As can be noted, the individuals in the set
The selection of a random individual
In this step, a new individual,
The process flow of the cHS algorithm.
The pitch adjustment and random consideration operators in the cHS algorithm are the same as those in the original version of the HS algorithm. However, memory consideration is modified to be in line with the concepts of cellular automata, as follows.
It is worth mentioning that cellular memory consideration is able to control the diffusion between the individuals in HM, and thus, it is able to preserve the cHS diversity as long as the search process is iterated. By this strategy, the population is structured and it is possible to improve the numerical behavior of the cHS algorithm.
This step is modified in cHS algorithm. The worst individual from the set of neighbors
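The two modified steps, cellular memory consideration and the neighborhood-restricted HM update, can be sketched as follows. The function names and the list-based HM representation are illustrative assumptions:

```python
import random

def cellular_memory_consideration(hm, neigh_idx, i):
    """Memory consideration restricted to the neighborhood: the i-th decision
    variable is drawn from a neighbor of the current cell, not from all of HM."""
    return hm[random.choice(neigh_idx)][i]

def update_neighborhood(hm, fit, neigh_idx, new, f):
    """Replace the worst individual within the neighborhood (rather than in
    the whole HM) when the new individual is better."""
    worst = max(neigh_idx, key=fit.__getitem__)
    fv = f(new)
    if fv < fit[worst]:
        hm[worst], fit[worst] = list(new), fv
```

Restricting both the source of values and the replacement target to the neighborhood is what slows diffusion across the grid and preserves diversity.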
In this step, the cHS will stop if the maximum number of the iteration (i.e., NI) is reached; otherwise the algorithm repeats Steps
Set HMCR, PAR, NI, HMS, BW.
Calculate the objective function value of each individual in the HM.
Generate a new individual x′:
select a random individual x j from the HM;
find the set of neighbors N of x j;
improvise x′ using cellular memory consideration over N, pitch adjustment, and random consideration.
If x′ is better than the worst individual in N:
include x′ in the HM;
exclude the worst individual in N from the HM.
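Putting the steps together, a compact sketch of the whole cHS loop is given below. It assumes a NEWS (4-neighbor) shape and illustrative parameter defaults, not the paper's tuned settings:

```python
import random

def chs(f, lb, ub, dim, rows=4, cols=5, hmcr=0.98, par=0.3, bw=0.01, ni=3000):
    """Sketch of cHS: HM rows are cells of a rows x cols toroidal grid;
    memory consideration and the update step are restricted to the
    selected cell's NEWS neighborhood."""
    hms = rows * cols
    hm = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(hms)]
    fit = [f(x) for x in hm]
    def wrap(r, c):                          # toroidal (row, col) -> HM index
        return (r % rows) * cols + (c % cols)
    for _ in range(ni):
        j = random.randrange(hms)            # select a random individual (cell)
        r, c = divmod(j, cols)
        neigh = [wrap(r - 1, c), wrap(r + 1, c), wrap(r, c - 1), wrap(r, c + 1)]
        new = []
        for i in range(dim):
            if random.random() < hmcr:
                v = hm[random.choice(neigh)][i]   # cellular memory consideration
                if random.random() < par:         # pitch adjustment within BW
                    v = min(max(v + random.uniform(-1, 1) * bw, lb), ub)
            else:
                v = random.uniform(lb, ub)        # random consideration
            new.append(v)
        worst = max(neigh, key=fit.__getitem__)   # worst within the neighborhood
        fv = f(new)
        if fv < fit[worst]:
            hm[worst], fit[worst] = new, fv
    return min(fit)
```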
In this section, the cHS algorithm is evaluated using benchmark functions commonly used in the literature to evaluate different variations of the HS algorithm. A comparative evaluation is presented, and a sensitivity analysis of the control parameters of the proposed method is carried out.
In this section, a set of test functions designed for the special session on realparameter optimization organized in the 2005 IEEE Congress on Evolutionary Computation (CEC 2005) [
CEC’2005 functions.
Function name  f(x*)

Shifted sphere function  −450
Shifted Schwefel's problem 1.2  −450
Shifted rotated high conditioned elliptic function  −450
Shifted Schwefel's problem 1.2 with noise in fitness  −450
Schwefel's problem 2.6 with global optimum on bounds  −310
Shifted Rosenbrock's function  390
Shifted rotated Griewank's function without bounds  −180
Shifted rotated Ackley's function with global optimum on bounds  −140
Shifted Rastrigin's function  −330
Shifted rotated Rastrigin's function  −330
Shifted rotated Weierstrass function  90
Schwefel's problem 2.13  −460
Expanded extended Griewank's plus Rosenbrock's function (F8F2)  −130
Shifted rotated expanded Scaffer's F6  −300
Hybrid composition function  120
Rotated hybrid composition function  120
Rotated hybrid composition function with noise in fitness  120
Rotated hybrid composition function  10
Rotated hybrid composition function with a narrow basin for the global optimum  10
Rotated hybrid composition function with the global optimum on the bounds  10
Rotated hybrid composition function  360
Rotated hybrid composition function with high condition number matrix  360
Noncontinuous rotated hybrid composition function  360
Rotated hybrid composition function  260
Rotated hybrid composition function without bounds  260
Key to CEC’2005 comparative methods.
Key  Method name  Reference 

BLX-GL50  Hybrid real-coded genetic algorithms with female and male differentiation  [
BLX-MA  Adaptive local search parameters for real-coded memetic algorithms  [
CoEVO  Real-parameter optimization using the mutation step coevolution  [
DE  Real-parameter optimization with differential evolution  [
DMS-L-PSO  Dynamic multiswarm particle swarm optimizer with local search  [
EDA  Experimental results for the special session on real-parameter optimization at CEC’2005: a simple, continuous EDA  [
K-PCX  A population-based, steady-state procedure for real-parameter optimization  [
G-CMA-ES  A restart CMA evolution strategy with increasing population size  [
L-CMA-ES  Performance evaluation of an advanced local search evolutionary algorithm  [
L-SaDE  Self-adaptive differential evolution algorithm  [
SPC-PNX  Real-parameter optimization performance study on the CEC-2005 benchmark with SPC-PNX  [
Average error rate obtained in CEC’2005 special session in dimension 10.
Algorithms compared: BLX-GL50, BLX-MA, CoEVO, DE, DMS-L-PSO, EDA, IPOP-CMA-ES, K-PCX, LR-CMA-ES, L-SaDE, SPC-PNX, HS, and cHS.
The experiments followed the conditions of CEC 2005 [
Notably, most of the winning comparative methods are hybrid versions of a particular EA, and their results are very efficient on the tested functions. It appears that the average errors (AE) of HS and cHS are very close to, and sometimes better than, those achieved by the comparative methods. In particular, the HS algorithm achieves very powerful results for most test functions and exceeds some of the best results reported by the comparative methods. For example, HS has achieved the smallest
It is noted that the
All the experiments are run on a computer with a 2.66 GHz Intel Core 2 Quad processor and 4 GB of RAM. The operating system used is Microsoft Windows Vista Enterprise Service Pack 1. The source code is implemented in MATLAB Version 7.6.0.324 (R2008a).
The common parameters among all algorithms used in the experiments are set based on empirical guidelines [
All functions are implemented in 30 dimensions (30D). For the scalability study in Section
All the results in this section are presented in Tables
The effect of varying the HMCR parameter on HS and cHS for ten functions.
Columns: HMCR = 0.7, 0.9, 0.94, 0.98. Rows (one each for cHS and HS): Sphere, Schwefel's 2.22, Step, Rosenbrock, Rotated hyper-ellipsoid, Schwefel's 2.26, Rastrigin, Ackley, Griewank, Six-Hump Camel-Back.
The global minimization benchmark functions are used to study the sensitivity of the parameters of the proposed method (cHS) against the original version of the HS algorithm. Five functions were defined by Whitley et al. [
Most of the benchmark functions use the standard solution space range of the objective function. Otherwise, unsymmetrical initialization ranges are used for functions whose global optima are at the center of the solution space. These benchmark functions are as follows.
The benchmark functions landscape where the value of
2D sphere function
2D Schwefel's problem 2.22
2D step function
2D Rosenbrock function
2D rotated hyper-ellipsoid function
2D Schwefel's problem 2.26 function
2D Rastrigin function
2D Ackley function
2D Griewank function
Six-Hump Camel-Back function
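For reference, the standard (unshifted) forms of several of these benchmarks are sketched below; the exact initialization ranges used in the paper are not reproduced. Each function has its global minimum at 0, except Six-Hump Camel-Back, whose minimum is about −1.0316:

```python
import math

def sphere(x):
    return sum(v * v for v in x)

def rosenbrock(x):
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

def ackley(x):
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e

def griewank(x):
    s = sum(v * v for v in x) / 4000
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x))
    return s - p + 1

def six_hump_camel(x):
    x1, x2 = x
    return ((4 - 2.1 * x1 ** 2 + x1 ** 4 / 3) * x1 ** 2
            + x1 * x2 + (-4 + 4 * x2 ** 2) * x2 ** 2)
```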
Figure
Table
Box plots showing the effect of varying HMCR values on the ten global optimization functions: 2D sphere, 2D Schwefel's 2.22, 2D step, 2D Rosenbrock, 2D rotated hyper-ellipsoid, 2D Schwefel's 2.26, 2D Rastrigin, 2D Ackley, 2D Griewank, and Six-Hump Camel-Back.
The results show that increasing the HMCR value improves the performance of cHS for all functions except Six-Hump Camel-Back, where the opposite is true. When a small HMCR value is used, diversity increases to the point of preventing cHS from converging (i.e., the search becomes nearly random). Thus, it is generally better to use a large value for HMCR (i.e., close to 0.98).
A high HMCR value means a high probability of using the harmony memory, which leads to less exploration of the search space; an HMCR probability close to 1 might lead the algorithm to fall into local minima. A lower HMCR allows more randomly generated solutions, so diversity increases in a way that prevents convergence. The results reveal that cHS and HS have similar sensitivity to the different values of HMCR for all functions, and with HMCR = 0.98 they obtain the best results for the majority of the optimization functions. Furthermore, the results produced by cHS are better than those produced by HS on almost all tested functions.
Table
The effect of varying the HMS parameter on HS and cHS for ten functions.
Columns: HMS = 16, 25, 36, 100. Rows (one each for cHS and HS): Sphere, Schwefel's 2.22, Step, Rosenbrock, Rotated hyper-ellipsoid, Schwefel's 2.26, Rastrigin, Ackley, Griewank, Six-Hump Camel-Back.
Box plots showing the effect of varying HMS values on the ten global optimization functions: 2D sphere, 2D Schwefel's 2.22, 2D step, 2D Rosenbrock, 2D rotated hyper-ellipsoid, 2D Schwefel's 2.26, 2D Rastrigin, 2D Ackley, 2D Griewank, and Six-Hump Camel-Back.
It is revealed that both cHS and the basic HS are not sensitive to the HMS. For (
HM is analogous to a musician's short-term memory, which is known to be small. A plausible interpretation is that a large HMS leads to a high number of similar harmonies within the HM, which reduces diversity and, hence, leads to falling into local minima. Therefore, with its cellular structure, cHS is likely more capable of maintaining diversity than HS.
Table
The effect of varying the PAR parameter on HS and cHS for ten functions.
Columns: PAR = 0.1, 0.3, 0.7, 0.9. Rows (one each for cHS and HS): Sphere, Schwefel's 2.22, Step, Rosenbrock, Rotated hyper-ellipsoid, Schwefel's 2.26, Rastrigin, Ackley, Griewank, Six-Hump Camel-Back.
Box plots showing the effect of varying PAR values on the ten global optimization functions: 2D sphere, 2D Schwefel's 2.22, 2D step, 2D Rosenbrock, 2D rotated hyper-ellipsoid, 2D Schwefel's 2.26, 2D Rastrigin, 2D Ackley, 2D Griewank, and Six-Hump Camel-Back.
It seems that using a relatively small value of PAR (i.e., ≤0.5) improves the performance of cHS and HS. A large PAR value can increase the convergence speed of the HS algorithm, while a small PAR value increases diversity in the HM. In other words, a small PAR allows more exploration of the search space, whereas a large PAR leads to a lower exploration rate: diversity is reduced and the algorithm may become trapped in local optima. It is observed that cHS obtains better results than the original HS algorithm for some benchmark functions (i.e.,
Table
The effect of varying the number of neighbors on cHS for ten functions.
Columns: NH = 4, 8, 12. Rows: Sphere, Schwefel's 2.22, Step, Rosenbrock, Rotated hyper-ellipsoid, Schwefel's 2.26, Rastrigin, Ackley, Griewank, Six-Hump Camel-Back.
Mean and standard deviation of ten functions.
Rows (one each for cHS and HS): Sphere, Schwefel's 2.22, Step, Rosenbrock, Rotated hyper-ellipsoid, Schwefel's 2.26, Rastrigin, Ackley, Griewank, Six-Hump Camel-Back.
Box plots showing the effect of varying NH values on the ten global optimization functions: 2D sphere, 2D Schwefel's 2.22, 2D step, 2D Rosenbrock, 2D rotated hyper-ellipsoid, 2D Schwefel's 2.26, 2D Rastrigin, 2D Ackley, 2D Griewank, and Six-Hump Camel-Back.
The results show that cHS obtained the best result when the number of neighbors is large, except (
In this section, the results produced by cHS and HS, when the dimension of the function is set to
In general, decreasing the dimensionality leads to better results for both cHS and HS, in line with the theory above. However, it is observed from the results (Table
In this paper, a new version of the HS algorithm called the cellular harmony search (cHS) algorithm is proposed. cHS is an HS algorithm embedded with cellular automata concepts. The main idea of cHS is to provide a structured population that preserves a high level of diversity during the search. In cHS, the HM individuals are arranged as a two-dimensional toroidal grid, where each individual is generated by, and interacts only with, its neighboring individuals. The operators of the original HS algorithm are adjusted to conform to cellular GA theory, where the concepts of cell and cell space are employed.
The cHS is evaluated using ten global optimization functions circulated in the literature. The results support the theory of cellular automata: in almost all cases, cHS outperforms the HS algorithm. The sensitivity analysis of the cHS parameters suggests that cHS is sensitive to the values of HMCR, PAR,
This is an initial investigation of using cellular automata concepts in the HS algorithm optimization framework. Several research directions remain open for future work, such as
analyzing the selection pressure and time complexity concepts of cHS algorithm,
studying the effect of the neighborhood shapes on the performance of cHS,
studying a new migration strategy to empower the interaction between the individuals and their neighbors.
The first author is grateful to have been awarded a Postdoctoral Research Fellowship from the School of Computer Sciences (USM). This work is supported by Grant no. 203/PTS6728001 (Grant Type: LRGS).