An Improved Chicken Swarm Optimization Algorithm for Solving Multimodal Optimization Problems

To solve the premature convergence problem of the standard chicken swarm optimization (CSO) algorithm in dealing with multimodal optimization problems, an improved chicken swarm optimization (ICSO) algorithm is proposed by drawing on ideas from the bacterial foraging algorithm (BFA) and the particle swarm optimization (PSO) algorithm. First, to improve the depth search ability of the algorithm, and considering that the chicks have the weakest optimization ability in the whole chicken swarm, the replication operation of BFA is introduced: in the role update process of the chicken swarm, the chicks are replaced by the same number of chickens with the strongest optimization ability. Then, to maintain population diversity, the elimination-dispersal operation is introduced to disperse the chicks that have performed the replication operation to random positions in the search space with a certain probability. Finally, the PSO algorithm is integrated to improve the global optimization ability of the algorithm. Experimental results on the CEC2014 benchmark function test suite show that the proposed algorithm performs well on most test functions, and its optimization accuracy and convergence performance are better than those of BFA, the artificial fish swarm algorithm (AFSA), the genetic algorithm (GA), and the PSO algorithm. In addition, the ICSO algorithm is also applied to the welded beam design problem, and the experimental results indicate that it has clear advantages over the comparison algorithms. Its main limitation is that it is not suitable for large-scale optimization problems.


Introduction
The multimodal optimization problem is a complex function optimization problem with one or more local extrema [1]. In practical applications, there are many multimodal optimization problems, such as parameter estimation and identification of models [2,3], engineering structure optimization, welded beam design [4], and medical diagnosis [5]. It is difficult to find the global optimum because the multimodal optimization problem contains many local extrema. Therefore, it is of great importance to study efficient and well-designed algorithms [6,7].
The swarm intelligence optimization algorithm is a kind of bionic random search algorithm that solves complex optimization problems by imitating ecosystem mechanisms in nature. Because of its strong global search ability, high efficiency, and insensitivity to initial values, it has attracted wide attention from researchers [8][9][10]. At present, hundreds of swarm intelligence algorithms have emerged, such as the genetic algorithm (GA) [11], the particle swarm optimization (PSO) algorithm [12], the bacterial foraging algorithm (BFA) [13], and the artificial fish swarm algorithm (AFSA) [14]. In a swarm intelligence optimization algorithm, there are no individuals with centralized control, and the interaction between individuals is extremely simple, yet the swarm can solve complex problems in a short time, which makes such algorithms very suitable for solving complex optimization problems in practice. Moreover, a swarm intelligence optimization algorithm does not need gradient information of the optimization problem, so it belongs to the class of nongradient optimization algorithms. Therefore, it has a wide range of applications [15,16]. At present, it has been applied in optimization calculation [17,18], workshop scheduling [19,20], image engineering [21], network structure optimization [22], the vehicle routing problem [23], control of teleoperating systems, and other fields [24].
As a kind of swarm intelligence algorithm, the chicken swarm optimization (CSO) algorithm was proposed by Meng et al. in 2014 [25]. Due to its good stability and global search ability, the algorithm has attracted extensive attention since it was proposed and has been successfully applied in several fields [26,27]. Liang et al. combined the CSO algorithm with the pulse-coupled neural network for image segmentation. The experimental results show that this method has obvious advantages in convergence speed and segmentation accuracy [28]. Cristin et al. combined the CSO algorithm with a deep neural network for the classification of brain tumors and achieved good performance in terms of accuracy, specificity, sensitivity, and so on [29].
Although the CSO algorithm has shown good performance in solving many benchmark problems and practical problems, it also has some inherent shortcomings, such as premature convergence and falling into local extrema. Therefore, researchers have proposed many improved algorithms. At present, the improvements to the CSO algorithm can be classified into three main categories: (1) Combination with other swarm intelligence algorithms.
For example, Li et al. improved the search ability of the CSO algorithm by integrating operations of the gray wolf optimization algorithm and the PSO algorithm into the CSO algorithm. The experimental results show that the algorithm is superior to other basic swarm intelligence algorithms in accuracy, convergence speed, etc. [30]. Sampath fine-tuned the solutions of the CSO algorithm by introducing the differential evolution algorithm to avoid premature convergence and applied the proposed algorithm to routing problems [31]. (2) Modification of the position update formulas of the chicken swarm. For example, Gu et al. improved the CSO algorithm by introducing an adaptive update factor and designing an inertia weight and successfully applied it to parameter estimation of the Richards model [32]. Considering that an imbalance between the diversity and intensification of the population may affect the performance of the CSO algorithm, Lin et al. proposed an improved CSO algorithm by modifying the position update formulas of the roosters and hens. Experiments show that this algorithm has obvious advantages over other swarm intelligence algorithms in terms of optimization accuracy and robustness [33]. (3) Extension to multiobjective optimization, which has been validated on some multiobjective benchmark problems and the electric vehicle charging station configuration problem [34].
The performance of the aforementioned swarm intelligence optimization algorithms has been improved to a certain extent, but some disadvantages remain. For example, the literature [18] introduced the differential evolution strategy and quantum behavior into the bird swarm algorithm; although the convergence speed of the algorithm was enhanced, the problem of premature convergence still existed. In the literature [30], several improvement factors are introduced into the position update formulas of the chicken swarm, which improves the ability of the algorithm to jump out of local extrema to a certain degree, but the optimization accuracy still needs further improvement.
To solve the abovementioned problems, in this paper, an improved chicken swarm optimization (ICSO) algorithm is proposed which combines the idea of PSO with the replication and elimination-dispersal operations of BFA. More specifically, our contributions are as follows: (1) The replication operation of BFA is applied to the chicks with the weakest optimization ability so that they inherit the optimal food sources in the whole chicken swarm, which helps enhance the depth search ability of the algorithm. (2) The chicks are dispersed to random positions in the search space with a certain probability by the elimination-dispersal operation of BFA, which helps improve the population diversity. (3) The idea of PSO is integrated to improve the global search ability of the algorithm. The experimental results on the CEC2014 standard function test suite preliminarily show that this algorithm has obvious advantages over other swarm intelligence algorithms in optimization accuracy and convergence performance [35]. In addition, compared with other comparison algorithms, it also obtains competitive results in solving the welded beam design problem.

CSO Algorithm
The CSO algorithm is a meta-heuristic optimization algorithm that simulates the foraging behavior of roosters, hens, and chicks in nature. The characteristics of the algorithm are as follows: (1) Correspondence with the optimization problem. The algorithm regards several randomly generated positions in the solution space of the optimization problem as chickens, and the food source of each chicken is measured by the fitness function value of the corresponding optimization problem. (2) Hierarchical order, that is, the role assignment of the chicken swarm. The whole chicken swarm is composed of four roles, namely, the roosters, hens, chicks, and mother hens. Their role assignment is based on the content of food sources. The roosters have the best food sources, the hens take second place, the food sources of the chicks are the worst, and the mother hens are randomly selected from the hens. (3) Subgroup division. During the whole foraging process, the chicken swarm is divided into several subgroups. The number of subgroups is determined by the number of roosters, because each subgroup is composed of a randomly selected rooster, several hens, and chicks, and there is at least one hen in each subgroup. (4) Relation of dependence. In the foraging process of the chicken swarm, the chicks follow their mothers (mother hens) and the hens follow the roosters in their subgroups to forage for food. They can also steal the good food sources found by other subgroups at random. (5) Information exchange. The hierarchical order of the chicken swarm and the mother-child relationship between the mother hens and the chicks are updated after several iterations. Information exchange between subgroups is realized through continuous role assignment. (6) Parallel optimization.
The whole chicken swarm can realize parallel optimization through the division of labor and the cooperation mechanism between subgroups, quickly find the best food sources, and then obtain the solution to the optimization problem. The formulas used by chickens with different roles in foraging are given below.
The formula used by the roosters in foraging is as follows:

X_{i,j}(t + 1) = X_{i,j}(t) × (1 + Randn(0, σ²)), j ∈ [1, dim], (1)

σ² = 1, if f_i ≤ f_k; σ² = exp((f_k − f_i)/(|f_i| + ε)), otherwise, k ∈ [1, rNum], k ≠ i, (2)

where X_{i,j}(t) represents the position of the i-th rooster in the t-th iteration, and dim is the dimension of the optimization problem. Randn(0, σ²) is a normal distribution with a mean value of 0 and a standard deviation of σ². ε is a very small number that can be expressed by the computer, which is used to avoid a zero denominator in the formula. f_k represents the food source content of any other rooster different from the i-th rooster. rNum represents the number of roosters.
The formula used by the hens in foraging is as follows:

X_{i,j}(t + 1) = X_{i,j}(t) + S1 × Rand × (X_{r1,j}(t) − X_{i,j}(t)) + S2 × Rand × (X_{r2,j}(t) − X_{i,j}(t)), (3)

S1 = exp((f_i − f_{r1})/(|f_i| + ε)), (4)

S2 = exp(f_{r2} − f_i), (5)

where X_{i,j}(t) is the position of the i-th hen in the t-th iteration. Rand is a random number function with a value range of (0, 1). X_{r1,j}(t) is the position of the rooster in the same subgroup as the i-th hen. X_{r2,j}(t) is the position of a chicken randomly selected from the whole chicken swarm and different from the i-th hen, with r1 ≠ r2.

The formula used by the chicks in foraging is as follows:

X_{i,j}(t + 1) = X_{i,j}(t) + FL × (X_{m,j}(t) − X_{i,j}(t)), (6)

where X_{i,j}(t) represents the position of the i-th chick, and X_{m,j}(t) is the position of the chick's mother. FL ∈ (0, 2) is a following coefficient with which the chick follows its mother to search for food. The corresponding basic flowchart is shown in Figure 1.
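Under the definitions above, the three foraging updates can be sketched in Python as follows. This is a minimal illustration; the function names and the way fitness values and partner indices are supplied are our own conventions, not from the original paper.

```python
import numpy as np

def rooster_update(x_i, f_i, f_k, eps=1e-30, rng=np.random):
    # sigma^2 depends on the comparison with another randomly chosen rooster k;
    # it is used as the spread of the Gaussian, following the paper's notation.
    sigma2 = 1.0 if f_i <= f_k else np.exp((f_k - f_i) / (abs(f_i) + eps))
    return x_i * (1.0 + rng.normal(0.0, sigma2, size=x_i.shape))

def hen_update(x_i, f_i, x_r1, f_r1, x_r2, f_r2, eps=1e-30, rng=np.random):
    s1 = np.exp((f_i - f_r1) / (abs(f_i) + eps))  # learn from the subgroup's rooster
    s2 = np.exp(f_r2 - f_i)                       # steal from a random chicken
    return (x_i + s1 * rng.rand() * (x_r1 - x_i)
                + s2 * rng.rand() * (x_r2 - x_i))

def chick_update(x_i, x_mother, fl=1.0):
    # FL in (0, 2) controls how closely the chick follows its mother
    return x_i + fl * (x_mother - x_i)
```

With fl = 1 the chick jumps exactly to its mother's position; smaller values interpolate between its own position and the mother's.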

The Proposed Algorithm
Aiming at the premature convergence of the standard CSO algorithm in solving multimodal optimization problems, an ICSO algorithm is proposed in this paper. Considering that the chicks have the weakest optimization ability in the whole chicken swarm, the algorithm improves the foraging behavior of the chicks by drawing on the reproduction and elimination-dispersal operations of BFA. At the same time, considering that the PSO algorithm has good global search ability, a hybrid CSO algorithm is constructed on the basis of the individual division of labor and cooperation mechanism of the CSO algorithm to enhance the global search ability by integrating the PSO algorithm.

CSO Algorithm with Reproduction and Elimination-Dispersal Operations (RECSO Algorithm).
In the standard CSO algorithm, the chicks are the weakest group. Therefore, in order to improve the depth search ability of the algorithm, this paper introduces the reproduction and elimination-dispersal operations of BFA into the chicks' foraging behavior, yielding the RECSO algorithm. First, in the role update process of the chicken swarm, the replication operation of BFA is introduced to replace the chicks with the same number of chickens with the strongest optimization ability. This behavior accelerates the depth optimization speed of the chicken swarm. At the same time, in order to maintain population diversity, the elimination-dispersal operation is introduced to disperse the chicks that have performed the replication operation to random positions in the search space with a certain probability. This operation helps the chicken swarm avoid falling into local extrema. The specific reproduction and elimination-dispersal operations are as follows:

Reproduction Operation.
According to the fitness function values of the chicken swarm, the chicks with poor optimization ability inherit the positions of the chickens with the strongest optimization ability:

X_{i,j}(t) = X_{i−(rNum+hNum),j}(t), i ∈ [rNum + hNum + 1, pop],

where rNum is the number of roosters and hNum is the number of hens.
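A sketch of this replication step, under the assumption that the swarm is ranked best-first by fitness (minimization) and the chicks occupy the last pop − rNum − hNum positions:

```python
import numpy as np

def reproduction(positions, fitness, r_num, h_num):
    """Chicks (the worst-ranked individuals) inherit the positions of the
    best-ranked chickens, one for one. Minimization is assumed."""
    order = np.argsort(fitness)            # best (smallest fitness) first
    ranked = positions[order].copy()
    n_chicks = len(positions) - r_num - h_num
    # each chick copies the position of the correspondingly ranked best chicken
    ranked[r_num + h_num:] = ranked[:n_chicks]
    return ranked
```

The returned array is sorted best-first; an implementation keeping the original ordering would scatter the copies back through the inverse permutation.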

Elimination-Dispersal Operation.
The chicks are dispersed to random positions in the search space with probability P_ed, where P_ed = 0.25. The concrete procedure is as follows:

for i = rNum + hNum + 1 to pop
    if Rand < P_ed
        X_i = lb + Rand × (ub − lb);
    end.
end.
Here, pop is the population size. lb and ub are the lower and upper bounds of search range, respectively.
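The dispersal loop above can be sketched in Python as follows, again assuming the chicks occupy the tail indices of the population array (our own convention):

```python
import numpy as np

def elimination_dispersal(positions, r_num, h_num, lb, ub, p_ed=0.25, rng=np.random):
    """With probability p_ed, scatter each chick to a uniformly random
    point in [lb, ub] to preserve population diversity."""
    out = positions.copy()
    dim = out.shape[1]
    for i in range(r_num + h_num, len(out)):   # chicks occupy the tail indices
        if rng.rand() < p_ed:
            out[i] = lb + rng.rand(dim) * (ub - lb)
    return out
```

Roosters and hens are never touched; only the chicks that just performed replication are candidates for dispersal.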

CSO Algorithm Based on PSO (CSO-PSO Algorithm).
To improve the global optimization ability of the CSO algorithm, in this section, we construct a CSO-PSO algorithm that enhances the ability of the algorithm to jump out of local extrema by integrating PSO into the CSO algorithm. The flowchart of the CSO-PSO algorithm is shown in Figure 2.
The main steps are described as follows: (1) Population initialization. It mainly involves the parameter settings and the determination of initial individuals for the CSO and PSO algorithms. (2) Fitness evaluation. The initial swarm optimal value is recorded on the bulletin board. (3) Subgroup division. The initial population is divided into two parts of the same size: subgroup 1 and subgroup 2. (4) CSO. We assign roles to subgroup 1 according to the CSO algorithm and perform the foraging behavior of the chicken swarm to search for the global optimal value. (5) PSO. The velocity and position of particles in subgroup 2 are updated according to the PSO algorithm to search for the global optimal value. (6) Information exchange. We merge subgroup 1 and subgroup 2 to realize information exchange. (7) Updating the swarm optimal value. The swarm optimal value is updated according to the subgroup optimal values obtained in steps (4) and (5). (8) We repeat steps (3)-(7) until the maximum number of iterations is reached, and the optimal value is output.
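The steps above can be sketched as one hybrid iteration. The cso_step callable stands in for the chicken swarm foraging of step (4), and w, c1, c2 are conventional PSO settings; none of these names or values come from the paper.

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=2.0, c2=2.0, rng=np.random):
    # Standard PSO velocity and position updates for subgroup 2.
    r1, r2 = rng.rand(*x.shape), rng.rand(*x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v

def hybrid_iteration(population, velocity, pbest, gbest, cso_step, fitness_fn):
    """One CSO-PSO iteration: split the swarm into two equal subgroups,
    evolve subgroup 1 by CSO (cso_step is supplied by the caller),
    subgroup 2 by PSO, then merge and update the shared best solution."""
    half = len(population) // 2
    sub1, sub2 = population[:half], population[half:]
    sub1 = cso_step(sub1)                                    # step (4): CSO
    sub2, velocity[half:] = pso_step(sub2, velocity[half:],  # step (5): PSO
                                     pbest[half:], gbest)
    merged = np.vstack([sub1, sub2])                         # step (6): merge
    best = merged[np.argmin([fitness_fn(x) for x in merged])]
    if fitness_fn(best) < fitness_fn(gbest):                 # step (7): update
        gbest = best
    return merged, velocity, gbest
```

Repeating this iteration until the iteration budget is exhausted realizes steps (3)-(8).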

ICSO Algorithm.
In view of the premature convergence problem of the standard CSO algorithm in dealing with multimodal optimization problems, an ICSO algorithm is proposed in this section. Considering that the chicks have the weakest optimization ability in the whole chicken swarm, the RECSO algorithm in Section 3.1 is introduced to improve the depth search ability of the algorithm. At the same time, the CSO-PSO algorithm in Section 3.2 is introduced to improve the global search ability of the algorithm. The flowchart of the ICSO algorithm is shown in Figure 3. The green part is the improvement strategy of the reproduction and elimination-dispersal operations introduced in Section 3.1, and the yellow part is the improvement strategy of the PSO algorithm integrated in Section 3.2.
The corresponding detailed steps are as follows: (1) Parameter settings. These mainly involve the maximum number of iterations M, the population size pop, the dimension of the solution space dim, and the limited number of role updates G. (a) Judgment of the role update condition. The role update condition of the whole algorithm is mod(t, G) == 1, where t is the current iteration number and mod is the remainder function. If the condition is false, we jump to step (d); otherwise, we judge whether it is the first iteration of the algorithm; if so, we go to step (c), and if not, we go to step (b). (b) Reproduction and elimination-dispersal operations. We perform the reproduction and elimination-dispersal operations described in Section 3.1 on the chicks. (c) Role assignment and subgroup division.
According to the fitness function of the current chicken swarm, we update the hierarchical order and mother-child relationship of the whole chicken swarm. After that, we divide the subgroups and determine the number of subgroups according to the number of roosters. Each rooster belongs to a different subgroup. The hens and chicks are randomly assigned to different subgroups, but it is necessary to ensure that there is at least one hen in each subgroup. (d) Foraging behavior. According to (1), (3), and (6), the foraging behaviors are performed by chickens with different roles. (e) Update of the optimal food source. At the end of each iteration, the optimization situation of chickens with different roles changes accordingly. We calculate the food source content of the current chicken swarm according to the fitness function and record the optimal food source and its corresponding position by comparing them with the previous situation. (7) Information interaction. We merge subgroups 1 and 2 to realize the information interaction. (8) Update of the swarm optimal value. The swarm optimal value is updated according to the subgroup optimal values obtained in steps (e) and (6). (9) Judgment of the algorithm's termination conditions. If the maximum number of iterations specified by the algorithm is reached, the optimal value is output and the program ends; otherwise, we jump to step (4).

In the aforementioned parameter settings, the parameters of CSO, BFA, and AFSA are set according to the literature [13,25] and [14], where CSO, BFA, and AFSA were proposed. It is worth noting that N_re and N_ed are set to 10 in BFA to ensure that the maximum number of iterations is 10000. The parameter settings of PSO and GA are derived from the comparison algorithms mentioned in the literature [28,32], respectively.
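The control flow of step (a) can be sketched as a small dispatch helper; the branch names are our own labels for the steps in the text, not identifiers from the paper.

```python
def icso_schedule(t, G):
    """Return which branch of the ICSO role-update logic fires at iteration t:
    role assignment on the first update, reproduction and elimination-dispersal
    followed by assignment on later updates, and plain foraging otherwise."""
    if t % G == 1:                 # role update condition mod(t, G) == 1
        return "assign" if t == 1 else "reproduce_then_assign"
    return "forage"
```

For example, with G = 10 the roles are reassigned at iterations 1, 11, 21, ..., and the BFA-style operations run at every reassignment except the very first.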

Simulation Experiment
In the ICSO algorithm, if the value of P_ed is large, the chicks easily fall into random exhaustive search; if the value of P_ed is small, it is not conducive to maintaining population diversity, which reduces the local search ability of the algorithm. Therefore, in this section, to give an appropriate value of P_ed, we select four typical functions f_3, f_6, f_20, and f_27 from the CEC2014 function test suite for experiments, where f_3 is a unimodal function, f_6 is a simple multimodal function, f_20 is a hybrid function, and f_27 is a composition function. At the same time, we run the ICSO algorithm on these four functions 30 times independently and calculate the mean values. The experimental results are shown in Table 2.
In Table 2, F*_i represents the theoretical global optimal value corresponding to each function, where i = 3, 6, 20, 27. The value of P_ed is taken from 0.05 to 0.85 with an interval of 0.2. |D_i| represents the absolute value of the difference between the theoretical global optimal value and the actual mean value obtained on each function, where i = 0.05, 0.25, ..., 0.85. It is easy to see from Table 2 that when P_ed = 0.25, the value of |D_i| is the smallest, that is, the actual mean optimal values obtained by the ICSO algorithm on the four functions are the closest to the theoretical ones. Furthermore, when the value of P_ed changes from 0.25 to 0.85 with an interval of 0.2, the value of |D_i| is larger than |D_0.25|. Therefore, in the experiments, we choose P_ed = 0.25.

The Effectiveness Test of ICSO, RECSO, and CSO-PSO.
To verify the effectiveness of the three improved algorithms proposed in this paper, namely, ICSO, RECSO, and CSO-PSO, compared with the standard CSO algorithm, in this section these four algorithms are tested on the CEC2014 function test suite, and an experimental comparison is made in terms of optimization accuracy and convergence performance. The experimental parameter settings are shown in Section 4.1.

The Effectiveness Test for Optimization Accuracy.
To verify the effectiveness of the three improved algorithms (ICSO, RECSO, and CSO-PSO) in terms of optimization accuracy, in this section we run ICSO, RECSO, CSO-PSO, and CSO on all 30 functions of the CEC2014 test suite independently 30 times to obtain their mean values. The results are shown in Table 3, where the symbols ">", "=", and "<" indicate that the experimental results of the comparison algorithms are superior, equal, and inferior to the CSO algorithm, respectively. The optimal results are shown in bold.
It can be clearly seen from Table 3 that the ICSO algorithm obtains the largest number of optimal values, followed by CSO-PSO, and the CSO algorithm is the worst. More specifically, the results of the ICSO algorithm on 28 functions are better than those of the CSO algorithm, and the results on 2 functions are the same as those of the CSO algorithm. The results of the RECSO algorithm on 19 functions are better than those of the CSO algorithm, its results on 6 functions are worse, and its results on the other 5 functions are the same. The results of the CSO-PSO algorithm are similar to those of the ICSO algorithm: on 28 functions its results are better than those of the CSO algorithm, and on 2 functions they are the same. However, in terms of the number of optimal values obtained, the ICSO algorithm is far superior to the CSO-PSO algorithm. This shows the effectiveness of the three improvement strategies in terms of optimization accuracy compared with the CSO algorithm, and the performance of the ICSO algorithm is the best among the four algorithms.
[Table 1 (excerpt): f_5 shifted and rotated Ackley's function (500), f_6 shifted and rotated Weierstrass function (600), f_7 shifted and rotated Griewank's function (700), f_8 shifted Rastrigin's function (800), f_9 shifted and rotated Rastrigin's function (900), f_10 shifted Schwefel's function (1000), f_11 shifted and rotated Schwefel's function (1100), f_12 shifted and rotated Katsuura function (1200), f_13 shifted and rotated HappyCat function (1300), f_14 shifted and rotated HGBat function (1400), f_15 shifted and rotated expanded Griewank's plus Rosenbrock's function (1500), f_16 shifted and rotated expanded Scaffer's F6 function (1600), followed by the hybrid functions.]

To verify the effectiveness of the abovementioned three improved algorithms in terms of convergence performance compared to the CSO algorithm, Figure 4 shows the average convergence curves of the four algorithms, each independently run 30 times on the 30 functions. To show the convergence effect of each algorithm more clearly, the average fitness function values are logarithmically processed in Figure 4. In addition, the convergence curves of some functions, such as those of f_4, f_8, and f_13, contain locally magnified subgraphs. It can be seen from Figure 4 that the ICSO algorithm converges best on functions f_1, f_5, f_9, f_10, f_11, and others. To sum up, the convergence performance of the ICSO algorithm is the best on 27 functions, and it is better than that of the CSO algorithm on all 30 functions. The CSO-PSO algorithm has the best convergence performance on one function, and its convergence performance on 28 functions is better than that of the CSO algorithm. The RECSO algorithm has the best convergence performance on two functions, and only its convergence on functions f_8, f_13, and f_17 is slightly inferior to that of the CSO algorithm.
This fully shows the effectiveness of the three improved algorithms in terms of convergence performance compared with the CSO algorithm, and the ICSO algorithm has the best convergence performance.
It is concluded that the reason the ICSO algorithm is superior to the other three algorithms may be that it integrates the ideas of BFA and PSO, which improves not only the depth search ability of the algorithm but also its breadth search ability.

The Superiority Comparison among Several Swarm Intelligence Algorithms.
To verify the superiority of the ICSO algorithm proposed in this paper, in this section, we compare the performance of seven algorithms, namely, ICSO, CSO, ICSOII proposed in the literature [30] (named ICSOII to distinguish it from the ICSO algorithm), BFA, PSO, AFSA, and GA from the aspects of optimization accuracy and convergence performance.

The Superiority Comparison in Optimization Accuracy.
To verify the superiority of the ICSO algorithm in terms of optimization accuracy, this section presents the experimental results of the abovementioned seven algorithms on the CEC2014 function test suite. The data in Table 4 are the mean values of 30 independent runs of each algorithm on each function. The bold data in Table 4 are the optimal values. It can be seen from Table 4 that, for functions f_1 and f_2, the ICSO algorithm directly reduces the order of magnitude of the optimization accuracy from 5 and 7 to 3 and 2, respectively. In addition, in terms of the number of optimal values found, CSO and GA each find 2 optimal values, AFSA and PSO find 3 and 9 optimal values, respectively, and ICSOII and BFA each find 6. The number of optimal values found by the ICSO algorithm is 18, which shows the superiority of the ICSO algorithm in optimization accuracy.

The Superiority Comparison in Convergence Performance.

To verify the superiority of the ICSO algorithm in convergence performance, in this section we give the average convergence curves of the abovementioned seven algorithms on the 30 test functions of the CEC2014 function test suite (each algorithm is independently run 30 times on each test function). The parameter settings in this section are shown in Section 4.1.2. As mentioned in Section 4.2.2, the ordinates in Figure 5 are the logarithms of the average fitness function values, and the subgraphs in some convergence curves are locally magnified renderings, so as to show the convergence effect of each algorithm more clearly.
It can be seen from Figure 5 that the convergence performance of the ICSO algorithm is the best on most functions. On functions f_2, f_3, f_8, f_9, and f_20, the convergence performance of the ICSO and PSO algorithms is very close, with the PSO algorithm slightly better, and both outperform the other algorithms. On functions f_13 and f_26, the convergence performance of the ICSO and CSO algorithms is the best, with the ICSO algorithm slightly inferior to the CSO algorithm, ranking second. On function f_16, ICSOII has the best convergence performance, and the ICSO algorithm ranks second. Only on functions f_4, f_12, f_18, f_23, and f_28-f_30 is the convergence performance of the ICSO algorithm not as good as that of BFA, AFSA, and GA, ranking 3rd or 4th. From this, we can see the superiority of the ICSO algorithm proposed in this paper in terms of convergence performance.

Friedman Test of Algorithms.
To compare the performance of the various algorithms more rigorously, this section uses the Friedman test to evaluate the performance of the abovementioned 7 algorithms (ICSO, ICSOII, PSO, CSO, GA, BFA, and AFSA) from a statistical point of view. The Friedman test is a nonparametric test method, often used to compare algorithms due to its simple operation and lax requirements on the test data [32,33,36]. For a minimization problem, the smaller the average ranking of the algorithm, the better its performance. Table 5 shows the Friedman test results of the 7 algorithms on the 30 functions.
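As a sketch of how average rankings of this kind can be computed, the following implements the average-rank computation and the (tie-free) Friedman statistic in NumPy; the error matrix and its values below are purely illustrative, not data from Table 5.

```python
import numpy as np

def friedman_ranks(errors):
    """Average rank of each algorithm across benchmark functions
    (rows: functions, columns: algorithms; smaller error = better rank).
    No tie correction; a minimal sketch of the test used in the paper."""
    ranks = np.argsort(np.argsort(errors, axis=1), axis=1) + 1.0
    return ranks.mean(axis=0)

def friedman_statistic(errors):
    # Chi-square-distributed statistic: 12N/(k(k+1)) * sum(R_j^2) - 3N(k+1),
    # with N functions, k algorithms, and R_j the average rank of algorithm j.
    n, k = errors.shape
    r = friedman_ranks(errors)
    return 12.0 * n / (k * (k + 1)) * np.sum(r ** 2) - 3.0 * n * (k + 1)
```

For example, a 6-function, 3-algorithm error matrix yields one average rank per algorithm, and the statistic can then be compared against a chi-square threshold with k − 1 degrees of freedom.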
It can be seen from Table 5 that the average ranking of the ICSO algorithm is 1.90, the best overall: 0.87 lower than that of ICSOII, 2.70 lower than that of the CSO algorithm, and 1.38 lower than that of the PSO algorithm, which fully demonstrates the effectiveness of the improvement strategies in this paper. To sum up, the reason the ICSO algorithm is superior to CSO, PSO, and BFA may be that it greatly enhances the ability of the algorithm to jump out of local extrema through cooperation between the chicken swarm and the particle swarm to achieve information interaction, and it improves the depth search ability of the algorithm by integrating the replication and elimination-dispersal operations of BFA. The reason the ICSO algorithm is superior to ICSOII, GA, and AFSA may be that the ICSO algorithm has a mechanism of subgroup division and multiswarm cooperation based on the chicken swarm and the particle swarm, which realizes parallel optimization through group cooperation. Although the ICSOII algorithm has subgroup division, its population cooperation is limited to cooperation within the chicken swarm, so its optimization ability is weaker than that of the ICSO algorithm.

Experimental Comparison between ICSO and ICSOII.
To further compare the performance of the ICSO algorithm proposed in this paper with the ICSOII algorithm proposed in the literature [30], we set the parameters of the algorithm according to the literature [30] in this section. The statistical results of the two algorithms, each run 51 times independently on the CEC2014 function test suite, are shown in Table 6. The experimental data of the ICSOII algorithm come from its corresponding literature.
It can be clearly seen from Table 6 that the number of optimal values obtained by the ICSOII algorithm is 18, and it reaches the theoretical optimal values on 10 functions, while the number of optimal values obtained by the ICSO algorithm is 21, and it reaches the theoretical optimal values on 12 functions.

Experimental Comparison between ICSO and a State-of-the-Art Algorithm.
To further verify the performance of the ICSO algorithm, DMSDL-QBSA, a state-of-the-art algorithm proposed in the literature [18], is also compared with the ICSO algorithm in this section. To make the experimental comparison fairer and more reasonable, the parameter settings of the ICSO algorithm are the same as those of DMSDL-QBSA, that is, the population size is set to 30, and the maximum number of iterations is set to 100000. Other parameter settings can be seen in Section 4.1.2.
The experimental data, including the maximum (Max), minimum (Min), mean (Mean), and variance (Var) values, are shown in Table 7, where the optimal results are shown in bold.

Welded Beam Design Problem.
To verify the performance of the ICSO algorithm in solving practical optimization problems, the welded beam design problem is considered, which has been described in detail in the literature [4,37] and [38]. The problem is a minimization problem whose parameters are P = 6000 lb, L = 14 in., E = 30 × 10^6 psi, G = 12 × 10^6 psi, τ_max = 13600 psi, σ_max = 30000 psi, and δ_max = 0.25 in.
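For reference, the cost objective of the classical welded beam formulation (weld thickness h, weld length l, bar height t, bar thickness b) can be sketched as follows; the full constraint set on shear stress, bending stress, buckling load, and end deflection is given in the literature [4,37,38] and is omitted here.

```python
def welded_beam_cost(h, l, t, b):
    """Objective of the classical welded beam design problem: total cost of
    the weld material plus the bar stock. The constants 1.10471 and 0.04811
    are the standard per-volume cost coefficients of this benchmark."""
    return 1.10471 * h ** 2 * l + 0.04811 * t * b * (14.0 + l)
```

Evaluating it at a commonly reported near-optimal design, roughly (h, l, t, b) = (0.2057, 3.4705, 9.0366, 0.2057), gives a cost of about 1.72, consistent with the best values listed for this benchmark.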
The comparison results of the optimal solutions obtained by different algorithms are shown in Table 8. The statistical results are shown in Table 9, where "Worst," "Mean," "Best," and "SD" stand for the worst, mean, best, and standard deviation values obtained over 30 independent runs, respectively. In addition, the optimal results are shown in bold. It is worth noting that the experimental results of the comparison algorithms are extracted from their corresponding literature.
For HFPSO [4] and EPSO [37], because the optimal solutions of the four design parameters are not given in the literature [4,37], they are not listed in Table 8. It can be seen from Tables 8 and 9 that ICSO and MBA [38] have obvious advantages over the other two algorithms in terms of the worst value, mean value, standard deviation, etc. Although the stability of the ICSO algorithm is slightly inferior to that of MBA [38], it has higher optimization accuracy, which preliminarily shows that ICSO can be used to solve the welded beam design problem.

Conclusion and Future Directions
To overcome the premature convergence problem of the standard CSO algorithm when solving complex optimization problems, an ICSO algorithm is proposed in this paper. For the chicks with the weakest optimization ability, we introduce the reproduction and elimination-dispersal operations of BFA to improve the depth search ability of the CSO algorithm. In addition, in order to improve the global convergence speed of the algorithm, the idea of PSO is integrated to construct a hybrid CSO algorithm. The experimental results show that the ICSO algorithm proposed in this paper can significantly improve optimization accuracy and convergence performance. The disadvantage of the proposed algorithm is that as the dimension of the optimization problem increases, the optimization ability of the algorithm decreases, which makes it unsuitable for large-scale optimization problems. Therefore, in future research work, how to dynamically adjust the limited number of role updates in the chicken swarm according to the number of iterations, and how to improve the individual position update formula for the hens with relatively weak search ability so as to further improve the optimization ability, still need further study. In addition, we will also consider applying the ICSO algorithm to practical problems, such as path planning in logistics distribution, workshop scheduling, and land use forecasting [39].

Data Availability
All data generated or analyzed during this study are included in this published article. Color versions of one or more of the figures in this paper are available from the corresponding authors upon reasonable request.