Adaptive Central Force Optimization Algorithm Based on the Stability Analysis

In order to enhance the convergence capability of the central force optimization (CFO) algorithm, an adaptive central force optimization (ACFO) algorithm is presented by introducing an adaptive weight and defining an adaptive gravitational constant. The adaptive weight and gravitational constant are selected based on the stability theory of discrete time-varying dynamic systems. The convergence capability of the ACFO algorithm is compared with that of other improved CFO algorithms and evolutionary-based algorithms using 23 unimodal and multimodal benchmark functions. Experimental results show that ACFO substantially enhances the performance of CFO in terms of global optimality and solution accuracy.


Introduction
Consider the following global optimization problem: max f(x), where f(x): Ω ⊂ R^{N_d} → R is a real-valued bounded function and x_min, x_max, and x are N_d-dimensional continuous variable vectors. Such problems arise in many applications, for example, in risk management, applied sciences, and engineering design. The function of interest may be nonlinear and nonsmooth, which makes classical optimization algorithms likely to fail on these problems.
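As a minimal sketch of problem (1), the fragment below encodes a box-constrained maximization. Following the paper's convention of negating minimization benchmarks so that their optima become maxima, it uses the negated sphere function; the function name and bounds are illustrative assumptions, not taken from the paper.

```python
# Negated sphere function: a benchmark usually minimized, negated here so
# its global optimum becomes a maximum (value 0 at the origin).
def f(x):
    return -sum(xi * xi for xi in x)

# Box constraints x_min <= x <= x_max defining the decision space Omega
# (illustrative values).
x_min, x_max = [-5.0, -5.0], [5.0, 5.0]

print(f([0.0, 0.0]))  # -> 0.0, the global maximum value
```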
These search methods all simulate biological phenomena. In contrast, some heuristic optimization algorithms based on physical principles have been developed, for example, the simulated annealing (SA) algorithm [7], the electromagnetism-like mechanism (EM) algorithm [8], the central force optimization (CFO) algorithm [9], the gravitational search algorithm (GSA) [10], and charged system search (CSS) [11]. SA simulates the annealing process of solid materials. EM is based on Coulomb's force law for electric charges. GSA and CFO utilize the laws of Newtonian mechanics. CSS is based on both Coulomb's law and Newtonian mechanics. Unlike the other algorithms, CFO is a deterministic method: there is no randomness in CFO, which is what draws our attention to this algorithm in the present paper. CFO, introduced by Formato in 2007 [9], is a novel deterministic heuristic optimization algorithm based on gravitational kinematics. In order to improve the CFO algorithm, Formato and other researchers have developed many versions of it [12-23]. In [12, 13], Formato proposed the PR-CFO (Pseudo-Random CFO) algorithm, which improves the implementation in three areas: initial probe distribution, repositioning factor, and decision space adaptation. Formato then presented an algorithm known as PF-CFO (Parameter Free CFO) in [14, 15]; PF-CFO refines PR-CFO in the selection of parameters. Mahmoud proposed an efficient global hybrid optimization algorithm combining the CFO algorithm and the Nelder-Mead (NM) method, called CFO-NM, in [16]. An extended CFO (ECFO) algorithm was presented by Ding et al. in [17] by adding historical information and defining an adaptive mass; the convergence of ECFO was proved there based on a second-order difference equation.
In the aforementioned CFO algorithms, two update equations are used: one for a probe's acceleration and the other for its position. In the position update equation, which is derived from the laws of motion, the velocity is set to zero. But the velocity influences the exploring ability of the CFO algorithm. Therefore, in this paper, we introduce the velocity into the position update equation, which leads us to build a velocity update equation as in the CSS algorithm. Since the weight, which balances global and local search ability, is an important parameter in many heuristic algorithms, we also introduce a weight into the position update equation. If the weight is too large, the probes may move erratically and overshoot good solutions; if it is too small, the probes' movement is limited and the optimal solution may never be reached. An appropriate, dynamically changing weight can therefore improve the performance of a heuristic algorithm. However, in most heuristic algorithms the changing weight is selected empirically according to the characteristics of the problem, without theoretical analysis. The gravitational constant has an effect similar to that of the weight in the CFO algorithm. Hence, this paper further investigates the weight and gravitational constant settings by employing the geometry-velocity stability theory of discrete time-varying dynamic systems. Based on the above discussion, an adaptive CFO (ACFO) algorithm is proposed in this paper.
To the best of our knowledge, there has been no research on the stability analysis of the CFO algorithm until now. In this paper, the stability of the ACFO algorithm is analyzed based on the theory of discrete time-varying dynamic systems, and based on this stability analysis we derive the weight and gravitational constant settings.
The rest of this paper is organized as follows. Section 2 presents the basics of the CFO algorithm and reviews the state of the art concerning the algorithm. In Section 3, we propose the adaptive central force optimization algorithm. Numerical results testing the performance of the proposed algorithm are given in Section 4. Finally, we draw some conclusions about the proposed algorithm.

Central Force Optimization
CFO solves problem (1) by moving probes through the decision space (DS) along trajectories computed from a gravitational analogy. The DS is defined by

Ω = {x ∈ R^{N_d} | x_min^j ≤ x^j ≤ x_max^j, j = 1, ..., N_d}.

In CFO, a group of probes represents potential solutions, and each probe p is associated with a position vector x_p(t) and an acceleration vector a_p(t) at time step t. The position of each probe is initialized by a variable initial probe distribution formed by deploying N_p/N_d probes uniformly on each of the probe lines parallel to the coordinate axes and intersecting at a point along Ω's principal diagonal, where N_p is the total number of initial probes. The initial acceleration vectors are usually set to zero. During the search process, the acceleration and position of probe p are updated as

a_p(t) = G Σ_{k=1, k≠p}^{N_p} U(M_k(t) − M_p(t)) [M_k(t) − M_p(t)]^α (x_k(t) − x_p(t)) / ‖x_k(t) − x_p(t)‖^β,

x_p(t+1) = x_p(t) + (1/2) a_p(t) Δt²,

where G is the gravitational constant, α and β are positive exponents, M_p(t) is the fitness of probe p, and U(·) is the unit step function (U(z) = 1 for z ≥ 0 and U(z) = 0 otherwise). Probes that fly outside the DS are repositioned inside it using the repositioning factor F_rep, where x_min^j and x_max^j are the minimum and maximum values of the jth component of the decision variable. F_rep starts at an arbitrary initial value F_rep^init < 1 and is incremented by an arbitrary amount ΔF_rep at each iteration; if F_rep ≥ 1, it is reset to the starting value. In order to improve convergence speed, the DS is adaptively shrunk around the best probe x_best; its boundary coordinates are reduced as

x_min^j ← x_min^j + (x_best^j − x_min^j)/2,  x_max^j ← x_max^j − (x_max^j − x_best^j)/2.

The termination criterion is that the iterations reach their maximum limit N_t. We also terminate the CFO algorithm early if the difference between the average best fitness over N_s steps (including the current step) and the current best fitness is less than 10^−6.
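The acceleration and position updates above can be sketched as follows. This is a minimal illustration of one CFO step for a single probe, assuming NumPy arrays for positions and fitnesses; the values of G, α, β, and Δt are illustrative defaults, not the paper's tuned settings.

```python
import numpy as np

def cfo_step(X, M, p, G=2.0, alpha=2.0, beta=2.0, dt=1.0):
    """One CFO update for probe p.

    X: (Np, Nd) array of probe positions; M: (Np,) array of fitness values.
    Returns the new position of probe p (zero initial velocity, per CFO).
    """
    a = np.zeros(X.shape[1])
    for k in range(X.shape[0]):
        if k == p:
            continue
        diff = M[k] - M[p]
        if diff > 0:  # unit step U(M_k - M_p): only better probes attract
            d = np.linalg.norm(X[k] - X[p])
            a += G * diff**alpha * (X[k] - X[p]) / d**beta
    # position update from the laws of motion: x += 0.5 * a * dt^2
    return X[p] + 0.5 * a * dt**2
```

For example, with two probes at (0, 0) and (1, 0) and fitnesses 0 and 1, the worse probe is pulled toward the better one, while the best probe feels no force and stays put.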
In order to improve the CFO algorithm, Formato proposed modifications to CFO, namely, PR-CFO [12, 13]. The steps of the PR-CFO algorithm [13] are as follows:

For N_p/N_d = (N_p/N_d)_start to (N_p/N_d)_max with step size 2
For γ = γ_start to γ_stop by Δγ (a.1) compute the initial probe distribution with distribution factor γ; (a.2) compute the initial fitness matrix and select the best probe fitness; (a.3) assign the initial probe accelerations; (a.4) set the initial F_rep = F_rep^init.

End If

Next γ (h) reset Ω's boundaries to their starting values before shrinking. Next N_p/N_d

PR-CFO was further modified to create an algorithm known as PF-CFO (Parameter Free CFO) [14, 15]. This version is almost identical to PR-CFO but reduces the number of parameters that must be chosen by fixing a wide array of internal parameters at specific values [19]. The parameter values borrowed from [19] that are used in the PF-CFO algorithm can be seen in Table 1.
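The PR-CFO outer loop structure listed above can be sketched as a nested sweep over the probes-per-dimension ratio (in steps of 2) and the distribution factor γ. Here `run_cfo` is a hypothetical stand-in for steps (a.1)-(a.4) and the inner CFO iteration; it is not part of the original listing.

```python
def pr_cfo_sweep(run_cfo, ratio_start, ratio_max,
                 gamma_start, gamma_stop, dgamma):
    """Sweep Np/Nd and gamma, returning the best fitness found.

    run_cfo(ratio, gamma) is assumed to run one full CFO pass and
    return its best fitness (a hypothetical callback for illustration).
    """
    best = None
    ratio = ratio_start
    while ratio <= ratio_max:                 # For Np/Nd = start to max, step 2
        gamma = gamma_start
        while gamma <= gamma_stop + 1e-12:    # For gamma = start to stop by dgamma
            fitness = run_cfo(ratio, gamma)
            if best is None or fitness > best:
                best = fitness
            gamma += dgamma
        ratio += 2
    return best
```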

Adaptive Central Force Optimization Algorithm
Qian and Zhang [23] proposed an adaptive central force optimization algorithm in which the time step Δt in (3) is updated based on each probe's fitness value compared with the average fitness value. In this paper, we instead introduce an adaptive weight in the position update equation, an adaptive gravitational constant in the acceleration update equation, and a velocity update formula. The weight and gravitational constant are updated based on the stability analysis of a discrete time-varying dynamic system. In the ACFO algorithm, the position, acceleration, and velocity of probe p are updated as follows:

x_p^j(t+1) = w_p(t) x_p^j(t) + v_p^j(t+1) Δt,

v_p^j(t+1) = v_p^j(t) + a_p^j(t) Δt,

a_p^j(t) = G_p(t) Σ_{k=1, k≠p}^{N_p} U(M_k(t) − M_p(t)) [M_k(t) − M_p(t)]^α (x_k^j(t) − x_p^j(t)) / r_pk^β,

where w_p(t) is the weight; G_p(t) is the gravitational constant at probe p's position at iteration t; Δt = 1; α and β are parameters; j (j = 1, 2, ..., N_d) is the coordinate number; r_pk is the Euclidean distance between two probes p and k; and R is a radius constant. The resulting coefficient functions are clearly nonnegative. By (8), (9), and (10), the position and velocity update equations can be rewritten as (13); equations (13) can be written in matrix form as (14), and with (15), equation (14) can be expressed as a discrete time-varying dynamic system (16). Lemma 1 (see [24]). Let X(t+1) = g(t, X(t)) be a discrete time-varying dynamic system. If g(t, X(t)) satisfies the condition ‖g(t, X(t))‖_s ≤ v(t) ‖X(t)‖_s + ε under a certain vector norm ‖·‖_s, then the system is geometry-velocity stable in the bounded set D = {X | ‖X‖_s < ε/(1 − c)}, where ε is a constant and 0 ≤ v(t) ≤ c < 1.
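The ACFO position and velocity updates described above can be sketched for a single probe as follows. The acceleration is taken as given (it comes from the gravitational sum); the function signature and example values are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def acfo_step(x, v, a, w, dt=1.0):
    """One ACFO update for a probe.

    x, v, a: Nd-vectors (position, velocity, acceleration);
    w: the adaptive weight w_p(t). Returns (new position, new velocity).
    """
    v_new = v + a * dt          # velocity update driven by the acceleration
    x_new = w * x + v_new * dt  # position update scaled by the adaptive weight
    return x_new, v_new
```

Note the contrast with plain CFO: the velocity persists between iterations instead of being reset to zero, and the weight w damps the previous position's contribution.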
Cui and Zeng presented a selection of the parameters of the PSO algorithm based on Lemma 1 in [24]. We now analyze the stability of the ACFO algorithm and give a selection of the weight and gravitational constant based on Lemma 1.
From the above discussion, in order to guarantee the geometry-velocity stability of system (16), the parameters w_p(t) and G_p(t) can be selected using rand(0, 1), a random number in the interval [0, 1]. However, the CFO algorithm is a deterministic method and should not contain any randomness. Therefore, we take the parameters w_p(t) and G_p(t) to be two constants between 0 and 1.
The specific iterative steps of the ACFO algorithm are as follows.
For γ = γ_start to γ_stop by Δγ (a.1) compute the initial probe distribution with distribution factor γ; (a.2) compute the initial fitness matrix and select the best probe fitness; (a.3) assign the initial probe accelerations and velocities; (a.4) set the initial F_rep = F_rep^init and G_init.
In the ACFO algorithm, the initial acceleration vectors are set to zero and the initial velocity vectors to v⃗_p = Σ_{j=1}^{N_d} 0.01 e⃗_j, where e⃗_j is the unit vector along the j-axis.

Numerical Experiments
In this section, the performance of the ACFO algorithm is compared with that of existing algorithms, GSO, GA, PSO, PR-CFO, PF-CFO, CFO-NM, and EPSO, using the suite of twenty-three benchmark functions provided in [25]. In the ACFO algorithm, the internal parameters are γ_start = 0 and Δγ = 0.1. The other internal parameters are the same as those of the PF-CFO algorithm, listed in Table 1, except that N_t = 500. We choose the remaining parameters to be 0.9, 1, and 0.01, respectively. In our experiments, the code was written in MATLAB 7.0 and run on a PC with 2.00 GB of RAM, a 2.10 GHz CPU, and the Windows 7 operating system. The stopping condition is that the iterations reach their maximum limit N_t. We also stop the ACFO algorithm early if the difference between the average best fitness over 30 steps (including the current step) and the current best fitness is less than 10^−6. In Table 2, f, N_d, f_min, and N_eval stand for the test function, the dimension of the decision space, the optimum minimum value for each function, and the total number of function evaluations, respectively. The statistical data in Table 2 for PR-CFO and PF-CFO are reproduced from [13] and [15], respectively. The best fitness is the optimum maximum (note that each benchmark function is negated). The set of twenty-three benchmark functions is divided into unimodal functions (f_1 to f_7), multimodal functions (f_8 to f_13), and low-dimensional multimodal functions (f_14 to f_23). Table 3 summarizes the obtained optimum minimum results, compared with those of other optimization algorithms such as GA, PSO, GSO, CFO-NM, and ECFO. The statistical data for CFO-NM and ECFO are from [16, 17], while the other statistical data are from [5]. In Tables 2 and 3, the star * denotes that the numerical result is the best among all the compared algorithms. In Table 3, the symbol "-" means that the problem was not calculated in the original reference.
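The early-stopping rule used above can be sketched directly: stop when the average best fitness over the last 30 steps (including the current step) differs from the current best fitness by less than 10^−6. The function name is illustrative.

```python
def should_stop(best_history, window=30, tol=1e-6):
    """Return True when the best-fitness curve has stagnated.

    best_history: list of best fitness values, one per iteration.
    """
    if len(best_history) < window:
        return False  # not enough history yet
    recent = best_history[-window:]
    return abs(sum(recent) / window - best_history[-1]) < tol

# A completely stagnant run triggers the stop:
print(should_stop([1.0] * 30))  # -> True
```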
From Table 2, it is clearly seen that the ACFO algorithm yields significantly better performance than the PR-CFO algorithm on benchmark functions f_1-f_5, but ACFO has a worse result on f_7 and the same result on f_6 compared to PR-CFO. Comparing the ACFO and PF-CFO algorithms, we find that ACFO performs better than PF-CFO on f_3 and f_5 and obtains the same results as PF-CFO on f_1, f_2, f_4, f_6, and f_7. It should be noted that both the ACFO and PF-CFO algorithms obtain the optimum minimum values of f_1, f_2, f_4, and f_6.
The benchmark functions f_8-f_13 are multimodal functions with many local minima. From Table 2, we can see that the ACFO algorithm outperforms the PR-CFO and PF-CFO algorithms except on functions f_8, f_12, and f_13; PR-CFO and PF-CFO are superior to ACFO on benchmark function f_8 only by the very small margins of 6.683e−07 and 6.285e−07, respectively.
The remaining benchmark functions f_14-f_23 are low-dimensional multimodal functions. From the comparison shown in Table 2, it can be seen that the best fitness values generated by the ACFO, PR-CFO, and PF-CFO algorithms are almost the same on f_14-f_23.
From Table 2, we can also see that the ACFO algorithm is superior to the PR-CFO and PF-CFO algorithms in the total number of function evaluations except on functions f_6 and f_17. In Table 3, we can clearly see that the ACFO algorithm outperforms the GA, PSO, GSO, CFO-NM, and ECFO algorithms on test functions f_1-f_7; the only exception is f_7, on which the ECFO algorithm is superior to ACFO. For test functions f_8-f_13, ACFO performs better than the GA and PSO algorithms except on f_12. In addition, ACFO outperforms GSO, CFO-NM, and ECFO on functions f_9-f_11. For functions f_14-f_23, ACFO also generates better results than GA and PSO; the only exception is f_15, on which ACFO yields a result similar to PSO's. Comparing ACFO and GSO, ACFO outperforms GSO on functions f_20-f_23 and has results similar to GSO's on functions f_14-f_19. ACFO also has results similar to CFO-NM's on functions f_16-f_18. Overall, these comparisons show that the ACFO algorithm performs better than the other algorithms.
Figures 1-7 show the convergence curves of the PR-CFO, PF-CFO, and ACFO algorithms on f_1-f_7. The vertical axis is the logarithm of (1 + best function value), and the horizontal axis is the number of iterations. From Figures 1-6, we can clearly see that the ACFO algorithm tends to find the global optimum faster than the other algorithms and hence has a higher convergence rate. According to Figure 7, the ACFO and PR-CFO algorithms have similar convergence rates, but ACFO has a better convergence rate than PF-CFO.

Conclusion
This paper presents the ACFO algorithm, which enhances the convergence capability of the CFO algorithm. ACFO introduces a weight and a velocity term into the equation that generates the probe's position. Based on the stability theory of discrete time-varying dynamic systems, we define an adaptive weight and an adaptive gravitational constant. To test its performance, ACFO is compared with improved CFO algorithms and other state-of-the-art algorithms. The simulation results show that ACFO outperforms the other algorithms.

Figure 7: Convergence curves of the three algorithms on f_7.

Table 1: The values of the parameters used in the PF-CFO algorithm.