An Improved Cuckoo Search Optimization Algorithm for the Problem of Chaotic Systems Parameter Estimation

This paper proposes an improved cuckoo search (ICS) algorithm for estimating the parameters of chaotic systems. To improve the optimization capability of the basic cuckoo search (CS) algorithm, orthogonal design and a simulated annealing operation are incorporated into CS to enhance its exploitation ability. The proposed algorithm is then used to estimate the parameters of the Lorenz chaotic system and the Chen chaotic system under noiseless and noisy conditions, respectively. The numerical results demonstrate that the algorithm can estimate parameters with high accuracy and reliability. Finally, the results are compared with those of the CS algorithm, the genetic algorithm, and particle swarm optimization; the comparison demonstrates that the method is efficient and superior.


Introduction
Chaos is a universal and complex dynamical phenomenon lurking in many nonlinear systems, such as communication systems and meteorological systems. The control and synchronization of chaos have been widely studied [1][2][3][4], and parameter estimation is a prerequisite for accomplishing them. In recent years many parameter estimation methods have been proposed, such as particle swarm optimization (PSO) [5][6][7][8], genetic algorithms (GA) [9][10][11][12], and the mathematical method of multiple shooting [13]. However, GA and PSO are easily trapped in locally optimal solutions, which degrades solution quality, and the precision of PSO, GA, and multiple shooting is not high enough. Recently, a novel and robust metaheuristic method called the cuckoo search (CS) algorithm was proposed by Yang and Deb [14][15][16]. The algorithm proved very promising and can outperform existing algorithms such as GA and PSO [14]. However, its relatively poor local search ability is a drawback, and it is necessary to further improve the performance of the CS algorithm to obtain higher-quality solutions. The basic principle of the ICS algorithm is to integrate orthogonal design and a simulated annealing operation into CS to enhance its exploitation capacity.
The remaining sections of this paper are organized as follows. Section 2 gives a brief formulation of the chaotic system parameter estimation problem. Section 3 elaborates the ICS algorithm, and Section 4 presents the results obtained with the proposed algorithm and several compared algorithms. The paper ends with conclusions in Section 5.

Problem Formulation
A problem of parameter estimation can be converted into a problem of multidimensional optimization by constructing the proper fitness function.
Let the following equation describe a continuous nonlinear $n$-dimensional chaotic system:

    $\dot{X} = F(X, X_0, \theta)$,    (1)

where $X = (x_1, x_2, \ldots, x_n)^T \in \mathbb{R}^n$ denotes the state vector of the chaotic system, $\dot{X}$ is the derivative of $X$, $X_0 = (x_{10}, x_{20}, \ldots, x_{n0})$ is the initial state, and $\theta = (\theta_1, \theta_2, \ldots, \theta_d)$ is the set of original parameters. Suppose the structure of system (1) is known; then the estimated system can be written as

    $\dot{\tilde{X}} = F(\tilde{X}, X_0, \tilde{\theta})$,    (2)

where $\tilde{X} = (\tilde{x}_1, \tilde{x}_2, \ldots, \tilde{x}_n)^T \in \mathbb{R}^n$ denotes the state vector of the estimated system and $\tilde{\theta} = (\tilde{\theta}_1, \tilde{\theta}_2, \ldots, \tilde{\theta}_d)$ is the set of estimated parameters. To convert the parameter estimation problem into an optimization problem, the following objective (fitness) function is defined:

    $J(\tilde{\theta}) = \sum_{k=1}^{M} \| X(k) - \tilde{X}(k) \|^2$,    (3)

where $k = 1, 2, \ldots, M$ is the sampling time point and $M$ denotes the length of the data used for parameter estimation. The parameters of system (1) can be estimated by searching for the values of $\tilde{\theta}$ at which the objective function (3) is globally minimized.
It can be seen that (3) is a multidimensional nonlinear function with multiple local optima; a search is easily trapped in a locally optimal solution, and the computational cost is high, so it is difficult to find the globally optimal solution effectively and accurately using traditional methods. In this paper an improved CS algorithm is proposed to solve this complex optimization problem.
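As a concrete illustration of objective function (3), the following is a minimal sketch in Python; the array layout (one row per sampling point, one column per state variable) and the function name are choices of this sketch, not part of the paper:

```python
import numpy as np

def fitness(observed, simulated):
    """Objective function (3): the sum over the M sampling points of the
    squared Euclidean distance between the observed states X(k) and the
    states simulated with the candidate parameters."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return float(np.sum((observed - simulated) ** 2))
```

Minimizing this function over the candidate parameters is exactly the search problem the ICS algorithm addresses.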

Improved CS Algorithm
3.1. Basic CS Algorithm. The basic CS algorithm is inspired by the brood parasitism of some cuckoo species, which lay their eggs in the nests of other host birds. For simplicity, the basic CS is described by the following three idealized rules [14]: (1) each cuckoo lays one egg at a time and dumps it in a randomly chosen nest; (2) the best nests with high-quality eggs are carried over to the next generation; (3) the number of available host nests is fixed, and the egg laid by a cuckoo is discovered by the host bird with a probability $p_a \in [0, 1]$. In this case, the host bird can either throw the egg away or simply abandon the nest and build a completely new one. Based on the above rules, the basic CS algorithm is described in Algorithm 1 [14].
Furthermore, the algorithm uses a balanced combination of a local random walk and a global explorative random walk, controlled by a switching parameter $p_a$. The local random walk can be written as

    $X_i^{t+1} = X_i^t + \alpha s \otimes H(p_a - \varepsilon) \otimes (X_j^t - X_k^t)$,    (4)

where $X_j^t$ and $X_k^t$ are two different solutions selected randomly by random permutation, $H(u)$ is a Heaviside function, $\varepsilon$ is a random number drawn from a uniform distribution, and $s$ is the step size.
On the other hand, the global random walk is carried out using Lévy flights [14][15][16][17]:

    $X_i^{t+1} = X_i^t + \alpha \, \mathrm{Levy}(s, \lambda)$.    (5)

Here, $\alpha > 0$ is the step size scaling factor, and $\mathrm{Levy}(s, \lambda)$ is a step length distributed according to the following probability distribution, shown in (6), which has an infinite variance with an infinite mean:

    $\mathrm{Levy}(s, \lambda) \sim \dfrac{\lambda \, \Gamma(\lambda) \sin(\pi \lambda / 2)}{\pi} \dfrac{1}{s^{1+\lambda}}, \quad s \gg s_0 > 0$.    (6)
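The two random walks (4) and (5) can be sketched as follows. The Lévy-distributed step lengths are generated with Mantegna's algorithm, a common practical realization of Lévy flights that the paper does not prescribe; the function names and the fixed seed are illustrative assumptions:

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def levy_step(beta, dim):
    # Mantegna's algorithm: ratio of two Gaussians approximates a
    # Levy-stable step-length distribution with exponent beta.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def global_walk(x, alpha=0.01, beta=1.5):
    # Global explorative random walk, eq. (5): Levy flight around x.
    return x + alpha * levy_step(beta, x.size)

def local_walk(x, xj, xk, pa=0.25):
    # Local random walk, eq. (4): step along the difference of two
    # randomly chosen solutions, gated by the switching probability pa
    # (the comparison plays the role of the Heaviside term H(pa - eps)).
    step = rng.random(x.size)                       # uniform step size s
    gate = (rng.random(x.size) < pa).astype(float)  # H(pa - eps)
    return x + step * gate * (xj - xk)
```

The heavy-tailed Lévy steps occasionally produce long jumps, which is what gives the global walk its explorative character.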

3.2. ICS Algorithm.
In order to further improve the searching ability of the algorithm, the orthogonal design and the simulated annealing operation are integrated into the CS algorithm. The basic idea of the orthogonal design is to utilize the properties of the fractional experiment to efficiently determine the best combination of levels [17]. An orthogonal array of $N$ factors with $Q$ levels and $M$ combinations is denoted as $L_M(Q^N)$, where $Q$ is a prime number, $M = Q^J$, and $J$ is a positive integer satisfying $N = (Q^J - 1)/(Q - 1)$. The brief procedure of constructing the orthogonal array $L_M(Q^N) = [a_{i,j}]_{M \times N}$ is described in Procedure 1.

Procedure 1: Constructing the orthogonal array.
Step 1. Construct the basic columns:
  For $k = 1$ to $J$
    $j = (Q^{k-1} - 1)/(Q - 1) + 1$
    For $i = 1$ to $M$
      $a_{i,j} = \lfloor (i - 1)/Q^{J-k} \rfloor \bmod Q$
    End for
  End for
Step 2. Construct the non-basic columns:
  For $k = 2$ to $J$
    $j = (Q^{k-1} - 1)/(Q - 1) + 1$
    For $s = 1$ to $j - 1$
      For $t = 1$ to $Q - 1$
        $a_{j+(s-1)(Q-1)+t} = (a_s \cdot t + a_j) \bmod Q$
      End for
    End for
  End for
Step 3. Increment $a_{i,j}$ by one for $1 \le i \le M$, $1 \le j \le N$.

Begin
(1) Construct the orthogonal array following the above steps
(2) Randomly select two solutions from the population
(3) Quantize the domain formed by the two solutions
(4) Randomly generate $N - 1$ integers $k_1 < k_2 < \cdots < k_{N-1}$
(5) Use $L_M(Q^N)$ to generate $M$ potential offspring
(6) Randomly select a solution from the population
(7) Compare the solution with the best solution from the orthogonal offspring and keep the better one
End
Algorithm 2: Orthogonal design algorithm.

Begin
(1) Given a configuration of the elements of the system, randomly displace the elements, one at a time, by a small amount, and calculate the resulting change in the energy, $\Delta E$
(2) If $\Delta E < 0$, then accept the displacement and use the resulting configuration as the starting point for the next iteration; else accept the displacement with probability $P(\Delta E) = \exp(-\Delta E / k_B T)$, where $T$ is the current temperature and $k_B$ is Boltzmann's constant
(3) Repeat this step until equilibrium is achieved
End
Algorithm 3: Simulated annealing operation.
The procedure of the orthogonal design algorithm is elaborated in Algorithm 2; for more detailed information on the orthogonal design strategy, please refer to [17][18][19].
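The array construction in Procedure 1 can be sketched as follows; the 0-based indexing and the `orthogonal_array` helper name are choices of this sketch:

```python
import numpy as np

def orthogonal_array(Q, J):
    """Build L_M(Q^N) with M = Q**J combinations and
    N = (Q**J - 1)//(Q - 1) columns, Q prime. Indices are 0-based
    internally; the final step shifts entries from 0..Q-1 to 1..Q."""
    M = Q ** J
    N = (Q ** J - 1) // (Q - 1)
    A = np.zeros((M, N), dtype=int)
    # Step 1: basic columns.
    for k in range(1, J + 1):
        j = (Q ** (k - 1) - 1) // (Q - 1)
        for i in range(M):
            A[i, j] = (i // Q ** (J - k)) % Q
    # Step 2: non-basic columns, formed from pairs of earlier columns.
    for k in range(2, J + 1):
        j = (Q ** (k - 1) - 1) // (Q - 1)
        for s in range(j):
            for t in range(1, Q):
                A[:, j + s * (Q - 1) + t] = (A[:, s] * t + A[:, j]) % Q
    # Step 3: increment every entry by one.
    return A + 1

# The classic L9(3^4): 9 rows, 4 columns with 3 levels each.
L9 = orthogonal_array(3, 2)
```

Each column of the resulting array is balanced (every level occurs M/Q times), which is what makes the orthogonal offspring in Algorithm 2 sample the quantized domain evenly.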
The procedure of the simulated annealing algorithm is briefly stated in Algorithm 3 [20]; for more detailed information on simulated annealing, please refer to [20][21][22].
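The acceptance rule in Algorithm 3 is the Metropolis criterion, which can be sketched as follows (with Boltzmann's constant folded into the temperature, a common convention):

```python
import math
import random

def sa_accept(delta, T, rng=random.Random(0)):
    """Metropolis acceptance: always accept an improvement (delta < 0);
    accept a worsening move with probability exp(-delta / T)."""
    if delta < 0:
        return True
    return rng.random() < math.exp(-delta / T)
```

At high temperature almost any move is accepted, which lets the search escape local optima; as the temperature is lowered, the rule gradually becomes greedy.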

Computational Intelligence and Neuroscience
Based on the above description of the orthogonal design strategy and the simulated annealing operation, the detailed procedure for parameter estimation with the ICS algorithm is summarized in Algorithm 4.

Begin
(1) Initialize the parameter values of the algorithm, generate the random initial vectors, and set the iteration number $t = 1$.
(2) Evaluate the fitness value of each individual and determine the current best individual with the best objective value.
Check whether the stopping criterion is met. If it is, output the best solution; otherwise, update the iteration number $t = t + 1$ and continue the iteration until the stopping criterion is met.
End
Algorithm 4: Parameter estimation with the ICS algorithm.
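The overall loop can be sketched as the skeleton below, combining the CS random walks with the simulated annealing acceptance test. This is a hedged sketch, not the authors' implementation: the cooling schedule $T_0/t$ and the step sizes are assumptions (the paper's schedule (8) is not reproduced here), and the orthogonal design step of Algorithm 2 is omitted for brevity:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def levy(dim, beta=1.5):
    # Mantegna's algorithm for Levy-flight step lengths.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return rng.normal(0.0, sigma, dim) / np.abs(rng.normal(0.0, 1.0, dim)) ** (1 / beta)

def ics_minimize(f, lo, hi, n_nests=25, max_iter=200, pa=0.25, alpha=0.01, T0=100.0):
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    dim = lo.size
    nests = rng.uniform(lo, hi, (n_nests, dim))
    fit = np.array([f(x) for x in nests])
    i0 = int(fit.argmin())
    best_x, best_f = nests[i0].copy(), float(fit[i0])
    for t in range(1, max_iter + 1):
        T = T0 / t  # assumed cooling schedule
        # Global Levy-flight walk (eq. (5)) with SA acceptance (Algorithm 3).
        for i in range(n_nests):
            cand = np.clip(nests[i] + alpha * levy(dim) * (nests[i] - best_x), lo, hi)
            fc = f(cand)
            d = fc - fit[i]
            if d < 0 or rng.random() < math.exp(-d / T):
                nests[i], fit[i] = cand, fc
            if fc < best_f:
                best_x, best_f = cand.copy(), fc
        # Local random walk (eq. (4)): replace a fraction pa of nests.
        p1, p2 = rng.permutation(n_nests), rng.permutation(n_nests)
        for i in range(n_nests):
            step = rng.random(dim) * (rng.random(dim) < pa)
            cand = np.clip(nests[i] + step * (nests[p1[i]] - nests[p2[i]]), lo, hi)
            fc = f(cand)
            if fc < fit[i]:
                nests[i], fit[i] = cand, fc
            if fc < best_f:
                best_x, best_f = cand.copy(), fc
    return best_x, best_f
```

Applied to the estimation problem, `f` would be the fitness function built from the sampled state data, and the bounds would be the admissible parameter ranges.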

Simulation Results
To demonstrate the effectiveness of the improved algorithm, it is used to estimate the parameters of the Lorenz chaotic system [23] and the Chen chaotic system [24].

4.1. Lorenz Chaotic System. The Lorenz chaotic system [23] is expressed as follows:

    $\dot{x} = \theta_1 (y - x)$,
    $\dot{y} = \theta_2 x - x z - y$,    (7)
    $\dot{z} = x y - \theta_3 z$,

where $(x, y, z)$ are the state variables and $\theta_1, \theta_2, \theta_3$ are the unknown chaotic system parameters to be estimated. The true parameters of the system are $\theta_1 = 10$, $\theta_2 = 28$, and $\theta_3 = 8/3$, which ensure chaotic behavior. To obtain the values of the state variables, the fourth-order Runge-Kutta algorithm is used to solve (7) with integration step $h = 0.01$. A series of state variable values is thus obtained, and 100 states at different times, $\{(x(k), y(k), z(k)),\ k = 1, 2, \ldots, 100\}$, are chosen as the sample data. The parameters of the algorithm are set as follows: the maximum iteration number is 200, the data length is $M = 100$, the annealing schedule is given in (8), where $t$ is the iteration number, and the initial temperature is $T_0 = 100$. The objective (fitness) function is shown in (9), where $(x(k), y(k), z(k))$ is the $k$th state corresponding to the true system parameters and $(\tilde{x}(k), \tilde{y}(k), \tilde{z}(k))$ is the $k$th state corresponding to the estimated system parameters:

    $J = \sum_{k=1}^{100} \left[ (x(k) - \tilde{x}(k))^2 + (y(k) - \tilde{y}(k))^2 + (z(k) - \tilde{z}(k))^2 \right]$.    (9)

Figure 1 shows the convergence process of the fitness value and the three parameters $(\theta_1, \theta_2, \theta_3)$ during the iterations in a single experiment.
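The sample-data generation step can be sketched as follows; the initial state (1, 1, 1) is an assumption of this sketch, since the paper does not report the initial condition used:

```python
import numpy as np

def lorenz(state, theta):
    # Lorenz equations (7) with parameters theta = (theta1, theta2, theta3).
    x, y, z = state
    t1, t2, t3 = theta
    return np.array([t1 * (y - x), t2 * x - x * z - y, x * y - t3 * z])

def rk4_trajectory(f, x0, theta, h=0.01, steps=100):
    # Classical fourth-order Runge-Kutta integration with step h.
    x = np.asarray(x0, dtype=float)
    out = []
    for _ in range(steps):
        k1 = f(x, theta)
        k2 = f(x + h / 2 * k1, theta)
        k3 = f(x + h / 2 * k2, theta)
        k4 = f(x + h * k3, theta)
        x = x + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        out.append(x)
    return np.array(out)

# 100 sample states at h = 0.01 for the true parameters (10, 28, 8/3).
samples = rk4_trajectory(lorenz, [1.0, 1.0, 1.0], (10.0, 28.0, 8.0 / 3.0))
```

The candidate parameters are then evaluated by integrating (7) with the same initial state and comparing the resulting trajectory to `samples` via the fitness function (9).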
To eliminate the differences between individual experiments, the algorithm is executed 50 times, and the mean value over the 50 experiments is taken as the final estimated value; the mean and best values of the 50 experiments are listed in Table 1. The results based on CS (best parameter setting: $p_a = 0.25$, $\alpha = 0.01$), PSO (best parameter setting: $w = 0.8$, $c = 1.5$, where $w$ is the inertia weight and $c$ is the acceleration factor), and GA (best parameter setting: $p_c = 0.8$, $p_m = 0.1$, where $p_c$ is the crossover rate and $p_m$ is the mutation rate) are also listed in Table 1.
It can be seen from Table 1 that the best fitness values obtained by the ICS algorithm are considerably better than those of the other algorithms. The mean values of the estimated parameters also have higher precision, and the estimated values are very close to the true values. In general, the ICS algorithm delivers the best performance, CS performs next-best, PSO is better than GA, and GA performs worst.
As actual chaotic systems are always associated with noise, noise sequences are added to the original sample data to test the parameter estimation performance in the noise condition. White noise in the range $[-0.1, 0.1]$ is added to the state variables $\{(x(k), y(k), z(k)),\ k = 1, 2, \ldots, 100\}$. Figure 2 shows the convergence process of the fitness values and the three parameters $(\theta_1, \theta_2, \theta_3)$ during the iterations in a single experiment under the noise condition. To eliminate the differences between individual experiments, the algorithm is executed 50 times, and the mean value over the 50 experiments is taken as the final estimated value; the corresponding results are listed in Table 2. It can be seen from Table 2 that all four algorithms have a certain capability for parameter identification, but the performance of ICS is much better than that of the other algorithms, supplying more robust and precise results. Although the precision of the estimated parameters declines compared with the noiseless condition, it remains satisfactory. It can therefore be concluded that the ICS algorithm possesses a powerful capability for parameter identification in the noise condition.
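Adding the noise sequences can be sketched as below; "white noise" is realized here as independent uniform noise in $[-0.1, 0.1]$, matching the stated range, and the seed is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(42)

def add_uniform_noise(samples, amplitude=0.1):
    # Perturb each sampled state with noise drawn uniformly from
    # [-amplitude, amplitude], as in the noisy test condition.
    samples = np.asarray(samples, dtype=float)
    return samples + rng.uniform(-amplitude, amplitude, samples.shape)
```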

4.2. Chen Chaotic System. The Chen chaotic system [24] is expressed as follows:

    $\dot{x} = \theta_1 (y - x)$,
    $\dot{y} = (\theta_3 - \theta_1) x - x z + \theta_3 y$,    (10)
    $\dot{z} = x y - \theta_2 z$,

where $(x, y, z)$ are the state variables and $\theta_1, \theta_2, \theta_3$ are the unknown chaotic system parameters to be estimated. The true parameters of the system are $\theta_1 = 35$, $\theta_2 = 3$, and $\theta_3 = 28$, which ensure chaotic behavior. The fourth-order Runge-Kutta algorithm is used to solve (10) with integration step $h = 0.01$. A series of state variable values is thus obtained, and 100 states at different times, $\{(x(k), y(k), z(k)),\ k = 1, 2, \ldots, 100\}$, are chosen as the sample data. The parameters of the algorithm are set as before: the maximum iteration number is 200, the data length is $M = 100$, the annealing schedule is given in (8), where $t$ is the iteration number, and the initial temperature is $T_0 = 100$. The convergence process of the fitness value and the three parameters $(\theta_1, \theta_2, \theta_3)$ during the iterations in a single experiment is shown in Figure 3. To eliminate the differences between individual experiments, the algorithm is executed 50 times, and the mean value over the 50 experiments is taken as the final estimated value; the corresponding results are listed in Table 3.
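The right-hand side of the Chen system (10) can be sketched analogously to the Lorenz case, assuming the common parametrization of the Chen system, which is consistent with the parameter values quoted above:

```python
import numpy as np

def chen(state, theta):
    # Chen equations (10); with theta = (35, 3, 28) the system is chaotic.
    # Note theta_2 damps z, unlike the Lorenz ordering.
    x, y, z = state
    t1, t2, t3 = theta
    return np.array([t1 * (y - x),
                     (t3 - t1) * x - x * z + t3 * y,
                     x * y - t2 * z])
```

This right-hand side can be integrated with the same fourth-order Runge-Kutta routine used for the Lorenz system to produce the sample data.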
It can be seen from Table 3 that the best fitness values obtained by the ICS algorithm are considerably better than those of the other algorithms. The mean values of the estimated parameters also have higher precision, and the estimated values are very close to the true values. In general, the ICS algorithm delivers the best performance, CS performs next-best, PSO is better than GA, and GA performs worst.
As actual chaotic systems always come along with noise, noise sequences are added to the original sample data, as was done for the Lorenz system, to test the parameter estimation performance in the noise condition. Figure 4 shows the convergence process of the fitness values and the three parameters $(\theta_1, \theta_2, \theta_3)$ during the iterations in a single experiment under the noise condition.
It can be seen from Table 4 that all four algorithms have a certain capability for parameter identification, but the performance of ICS is much better than that of the other algorithms, supplying more robust and precise results. Although the precision of the estimated parameters declines compared with the noiseless condition, it remains satisfactory. It can therefore be concluded that the ICS algorithm possesses a powerful capability for parameter identification in the noise condition.

Conclusion
In this paper, an efficient and superior ICS algorithm is proposed to estimate chaotic system parameters. The estimation results demonstrate the strong capability and effectiveness of the proposed algorithm: compared with the CS, PSO, and GA algorithms, the ICS algorithm supplies more robust and precise results. Moreover, the algorithm has a more powerful capability of noise immunity. In general, the proposed ICS algorithm is a feasible, efficient, and promising method for parameter estimation of chaotic systems.