Research Article: An Adaptive Fuzzy Chicken Swarm Optimization Algorithm

The chicken swarm optimization (CSO) algorithm is a new swarm intelligence optimization (SIO) algorithm that has been widely used in many engineering domains. However, the CSO algorithm has two apparent problems: slow convergence speed and difficulty in reaching globally optimal solutions. To attack these two problems, in this paper we propose an adaptive fuzzy chicken swarm optimization (FCSO) algorithm. The proposed FCSO uses a fuzzy system to adaptively adjust the number of chickens and the random factors of the CSO algorithm, achieving an optimal balance between the exploitation and exploration capabilities of the algorithm. We integrate the cosine function into the FCSO to compute the position updates of roosters and improve the convergence speed. We compare the FCSO with eight commonly used, state-of-the-art SIO algorithms in terms of performance in both low- and high-dimensional spaces. We also verify the FCSO algorithm with the nonparametric Friedman statistical test. The results of experiments on the 30 black-box optimization benchmarking (BBOB) functions demonstrate that our FCSO outperforms the other SIO algorithms in both convergence speed and optimization accuracy. To further test the applicability of the FCSO algorithm, we apply it to four typical engineering problems with constraints on the optimization processes. The results show that the FCSO achieves better optimization accuracy than the standard CSO algorithm.


Introduction
There are many optimization problems in scientific and engineering domains [1][2][3][4], and traditional optimization methods (TOMs) [5], such as the gradient descent method and Newton's method, cannot solve these optimization problems very well for the following three reasons. (1) Most of the TOMs require that the objective function be convex and continuously differentiable. These strict requirements mean that the TOMs cannot be applied widely. (2) Due to their low computational efficiency, it is difficult for the TOMs to deal with NP-hard problems. (3) It is also difficult for the TOMs to avoid falling into the trap of local optimal solutions. Compared to the TOMs, swarm intelligence optimization (SIO) algorithms (SIOAs) can handle NP-hard problems well owing to the following obvious advantages. (1) Flexibility: the whole population can adapt to a changing environment quickly. (2) Robustness: even if a few individuals fail, the whole population can still work normally. (3) Self-organization: the whole population requires relatively little supervision or top-down control. (4) High efficiency: the SIOAs pay more attention to improving optimization efficiency, and they can acquire satisfactory optimal or suboptimal solutions in an acceptable time.
In recent decades, many classic SIOAs have been proposed, such as the genetic algorithm (GA) [6], the particle swarm optimization (PSO) algorithm [7], and the artificial fish school (AFS) algorithm [8]. Among others, the chicken swarm optimization (CSO) algorithm [9] is a new SIO method proposed in 2014, which simulates the social relationship structure and foraging behaviour in a chicken swarm. Many variants of the CSO algorithm have been proposed recently, aiming to improve it. Wu et al. [10] proved the convergence of CSO using a Markov chain and gave an improved version of the CSO algorithm, which modifies the update formula of chicks so that they learn from both their mother hens and the rooster in their group. Qu et al. [11] used an adaptive distribution to replace the Gaussian distribution in the update formula of roosters in order to balance the global and local searching abilities. Li et al. [12] introduced several improvement factors learned from both the grey wolf optimizer (GWO) and particle swarm optimization (PSO) to extend the searching capability. Li et al.'s method also integrates a duplicate-removal operator to enhance the diversity of the chicken population. Torabi and Safi-Esfahani [13] combined CSO with an improved raven roosting optimization (RRO) algorithm in order to balance the global and local searching capabilities. Fu et al. [14] modified the update equation of roosters and introduced a mutation operator to address the problem of easily falling into local optimal solutions. Furthermore, the CSO algorithm has also been extended to solve the constrained optimization problem [15] (see Section 6.1), the 0-1 knapsack problem [16], and the multiobjective optimization problem [17].
Although research on CSO has made some progress, CSO, like the other SIOAs, still has two obvious drawbacks: it easily falls into local optima, and its convergence speed is slow [11]. One important reason for these drawbacks is that the CSO algorithm cannot adaptively adjust its parameters according to the current optimization results. Generally speaking, it is almost impossible to precisely adjust the parameters of SIOAs for different population situations, but we can observe or detect a rough trend of how the parameters should be adjusted in different situations. For example, the more iterations have elapsed, the smaller the parameters should become, and vice versa. Hence, it would be interesting if we could use relative fuzzy concepts such as "faster" iterations or "smaller" parameters to describe the parameter adjustment process of the CSO algorithm. It is well known that the fuzzy system [18] is a powerful tool for imitating the way the human brain recognizes and judges fuzzy phenomena. In consequence, we propose a fuzzy chicken swarm optimization (FCSO) algorithm that incorporates the fuzzy system mechanism into the algorithm, which is able to adaptively adjust the parameters of CSO and thus to overcome the problems of local optima and slow convergence.
Our contributions discussed in this paper are briefly summarized as follows. (1) To our knowledge, the fuzzy system has been introduced into the CSO algorithm for the first time; it dynamically adjusts the number of chickens and the random factors of CSO according to the optimization speed, chicken aggregation, and iteration times, in order to adaptively balance the exploitation and exploration abilities. (2) We improve the position update strategy of roosters to overcome the randomness of the original strategy and make the local exploration more accurate as the number of iterations increases. (3) Through modifying the design of the defuzzifier, we enhance the defuzzification method of the fuzzy system incorporated in the FCSO algorithm to make the algorithm more flexible. (4) In terms of the key evaluation criteria, convergence speed and optimization accuracy, the proposed algorithm outperforms eight other state-of-the-art SIOAs in both high- and low-dimensional spaces, and we provide a comprehensive and in-depth analysis of these prominent algorithms. The rest of this paper is organized as follows. The basic CSO algorithm is described in Section 2. In Section 3, we discuss the principle of the proposed FCSO algorithm; the experimental process, the test data used, and the evaluation criteria are discussed in Section 4, where we also present comprehensive experimental results and their analysis. In Section 5, the Friedman and Nemenyi tests are performed to validate the significance of the proposed method against its counterparts. Four classical engineering design problems are discussed in Section 6, which shows the optimization ability of the FCSO algorithm in practical engineering problems. We conclude the paper in Section 7, indicating our contributions to this research area and our future work in this direction.

Basic CSO Algorithm
In the basic CSO algorithm, there are three kinds of roles, roosters, hens, and chicks, each having different behaviour specifications. In the following, we give the basic assumptions of the CSO algorithm. (1) The CSO algorithm divides a chicken swarm into a few groups, each of which has one rooster, several hens, and a small number of chicks. (2) The identities of roosters, hens, and chicks are determined by their fitness values: the best individuals are selected as roosters, the worst are the chicks, and the other individuals are the hens. Each hen randomly chooses one rooster as her mate and becomes a member of his group, and each chick randomly selects one hen as its mother. (3) In the whole population, the individual identities, the spouse relationships, and the mother-children relationships remain unchanged for G generations (G is the iterative cycle) and are updated after every G generations. (4) In each group of the whole population, hens follow their spouse rooster to find food, and they randomly compete for food with other individuals within the group. The individuals with better fitness values are more likely to obtain food.
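As an illustration of assumption (2) above, the role assignment by fitness ranking (for a minimization problem) can be sketched as follows; the function name, the fixed random seed, and the use of NumPy are our illustrative choices, not part of the original algorithm description.

```python
import numpy as np

def assign_roles(fitness, rn, hn, cn):
    """Rank chickens by fitness (minimization): the best rn become roosters,
    the worst cn become chicks, and the remaining hn are hens. Each hen picks
    a random rooster as mate; each chick picks a random hen as mother."""
    order = np.argsort(fitness)          # ascending: best fitness first
    roosters = order[:rn]
    hens = order[rn:rn + hn]
    chicks = order[-cn:]
    rng = np.random.default_rng(0)       # fixed seed, for reproducibility only
    mate = {int(h): int(rng.choice(roosters)) for h in hens}
    mother = {int(c): int(rng.choice(hens)) for c in chicks}
    return roosters, hens, chicks, mate, mother
```

The identities and relationships produced here would then be held fixed for G generations, as stated in assumption (3).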
Each chicken is described by its position. Let RN, HN, CN, and MN represent the numbers of roosters, hens, chicks, and mother hens, respectively, and let x_{i,j}^t be the position of the i-th chicken in the j-th dimension at the t-th iteration, where i ∈ {1, ..., N}, j ∈ {1, ..., D}, and t ∈ {1, ..., T}, and N, D, and T represent the total number of chickens, the number of dimensions, and the maximum number of iterations, respectively. A rooster, a hen, and a chick each have their own position update formulas.
For a rooster, its position update is defined as follows:

x_{i,j}^{t+1} = x_{i,j}^t × (1 + Randn(0, σ²)),    (1)

σ² = 1 if f_i ≤ f_k; σ² = exp((f_k − f_i)/(|f_i| + ε)) otherwise.    (2)

Here, Randn(0, σ²) is a random number following a Gaussian distribution with an expectation of zero and variance σ², ε is a very small constant, k is the index of another rooster chosen randomly, and f_i and f_k are the fitness values of the i-th and k-th roosters, respectively. The position update of a hen is defined as follows:

x_{i,j}^{t+1} = x_{i,j}^t + C_1 × Rand × (x_{r1,j}^t − x_{i,j}^t) + C_2 × Rand × (x_{r2,j}^t − x_{i,j}^t),    (3)

C_1 = exp((f_i − f_{r1})/(|f_i| + ε)),    (4)

C_2 = exp(f_{r2} − f_i).    (5)

Here, C_1 and C_2 are the learning factors, Rand is a random number following a uniform distribution over [0, 1], r_1 is the index of the rooster that is the spouse of the i-th hen, and r_2 is the index of a rooster or a hen selected randomly, with r_1 ≠ r_2. The position update of a chick is defined as follows:

x_{i,j}^{t+1} = x_{i,j}^t + FL × (x_{m,j}^t − x_{i,j}^t),    (6)

where x_{m,j}^t is the position of the chick's mother hen and FL is a random factor in the range [0, 2]. The basic CSO algorithm is shown in Algorithm 1.
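The three position updates, equations (1)-(6), can be sketched in Python as follows (a minimal NumPy sketch; the shared random generator and the `eps` default are our illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(42)  # illustrative shared generator

def rooster_update(x_i, f_i, f_k, eps=1e-10):
    # Eqs. (1)-(2): sigma^2 = 1 if f_i <= f_k, else exp((f_k - f_i)/(|f_i| + eps))
    sigma2 = 1.0 if f_i <= f_k else np.exp((f_k - f_i) / (abs(f_i) + eps))
    return x_i * (1.0 + rng.normal(0.0, np.sqrt(sigma2), size=x_i.shape))

def hen_update(x_i, x_r1, x_r2, f_i, f_r1, f_r2, eps=1e-10):
    # Eqs. (3)-(5): follow the spouse rooster r1 and a random neighbour r2
    c1 = np.exp((f_i - f_r1) / (abs(f_i) + eps))
    c2 = np.exp(f_r2 - f_i)
    return (x_i + c1 * rng.random() * (x_r1 - x_i)
                + c2 * rng.random() * (x_r2 - x_i))

def chick_update(x_i, x_m, fl):
    # Eq. (6): move toward the mother hen with random factor FL in [0, 2]
    return x_i + fl * (x_m - x_i)
```

Note that with FL = 1 a chick lands exactly on its mother's position, and a rooster at the origin stays there regardless of the Gaussian draw, since the noise in equation (1) is multiplicative.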

The Principle of the FCSO Algorithm
A major principle of our approach is to apply fuzzy concepts to the adjustment of the parameters of the CSO algorithm, aiming to reflect human knowledge about manipulating the parameters intuitively and thereby achieve better convergence and accuracy. In other words, first, we fuzzify the parameter values to allow a knowledge-based judgement of how to modify the parameters.
Then, we use the knowledge library of the parameter scales to determine which values to choose and execute the adjustment of the parameters. Finally, we defuzzify the fuzzy values for the parameters and continue running the algorithm. The proposed FCSO algorithm adopts the fuzzy system to adaptively adjust the parameters of CSO under different population situations. The parameters include the random factors Rand in formula (3) and FL in formula (6) and the number of chickens of the whole population, N.
This dynamic mechanism, applying the fuzzy method to adaptively adjust CSO's parameters in an iterative process, aims to overcome the defects of CSO discussed in Section 1 to some extent. The key concepts that describe the proposed method and algorithm are presented in this section. After providing an overview of the framework of the algorithm, including the fuzzy system, in Section 3.1, we define and discuss in detail the key input variables for the fuzzy system in Section 3.2. In Sections 3.3 and 3.4, we describe two key mechanisms of the fuzzy system: one is the fuzzifiers and defuzzifiers, and the other is the rule base and inference engine. Finally, we present the proposed FCSO algorithm in Section 3.5.

Overall Description of the FCSO Algorithm.
The FCSO algorithm adopts the fuzzy system to adjust the number of chickens and the random factors adaptively. The adopted fuzzy system consists of four components [18]: a fuzzifier, a defuzzifier, a fuzzy rule base, and a fuzzy inference engine. The input variables (discussed in Section 3.2) are extracted to monitor the running status of the proposed algorithm; they are optimization speed, chicken aggregation, and iteration times, and they are fuzzified with the Gaussian fuzzifier. The fuzzy rule base is composed of fuzzy IF-THEN rules, and the fuzzy inference engine (FIE) performs reasoning from input variables to output variables according to the fuzzy IF-THEN rules. The input variables are fuzzified by the fuzzifier before reasoning, and the output variables are defuzzified by the defuzzifier after reasoning. The output variables (discussed in Section 3.3) are the random factors Rand and FL and the number of chickens N. These output variables are transmitted to the CSO algorithm to control its running, and the updated monitoring indicators (input variables) are fed into the fuzzy system iteratively to adaptively adjust the CSO parameters and thus overcome the shortcomings of the CSO algorithm; the framework of FCSO is described in Figure 1. The ◊ symbol in Figure 1 indicates that the input variables nv and nt derive the output variables Rand and FL through the fuzzy system, and the ○ symbol indicates that nv and aggr adjust the output variable N; the details are described in Section 3.4.

Input Variables of the Fuzzy System.
According to the principle of the CSO algorithm, the random factors Rand and FL and the number of chickens N are important factors which influence CSO's global searching ability and convergence speed. In order to adjust these parameters, some monitoring indicators of the CSO algorithm need to be extracted as the input variables of the fuzzy system. Generally speaking, if the fitness values (assuming a minimization problem) of all the chickens are optimized too fast, the whole population may be falling into a local optimum, so optimization speed is defined as one indicator, described in the following equations:

v_G = (1/G) Σ_{l=1}^{G} (1/N) Σ_{j=1}^{N} (f_j^{l−1} − f_j^l),    (7)

nv = 1 / (1 + e^{−10 v_G}),    (8)

where G is the iterative cycle (see basic assumption (3) of CSO in Section 2), N is the total number of chickens, v_G represents the average optimization speed over one iterative period G, f_j^{l−1} and f_j^l are the fitness values of the j-th chicken in the (l − 1)-th and l-th iterations within G, respectively, and nv is the normalization of v_G by the sigmoid function.
The purpose of the constant 10 in the sigmoid function [19] is to make nv spread evenly over [0, 1]. Too large an optimization speed may make CSO fall into a local optimal solution, and too small a speed means that CSO converges slowly. Another monitoring indicator is the aggregation of chickens, which is defined in equations (9)∼(11):

μ = (1/N) Σ_{i=1}^{N} x_i,    (9)

aggr = (1/N) Σ_{i=1}^{N} ||x_i − μ||,    (10)

naggr = aggr/AGGR if aggr < AGGR; naggr = 1 if aggr ≥ AGGR,    (11)

where x_i = (x_{i,1}, x_{i,2}, ..., x_{i,D}) is the i-th chicken's position, μ = (μ_1, μ_2, ..., μ_D) is the centroid of the positions of the N chickens, aggr represents the average distance between x_i and μ, and AGGR is the initial average distance between x_i and μ in the initialization step. Generally speaking, the initial positions should cover the whole solution space as much as possible, so AGGR should be larger than aggr, and naggr is the normalization of aggr, as described in equation (11). If aggr is too small, the chicken swarm is too concentrated, which weakens the global searching ability of the CSO algorithm. On the contrary, if aggr is too large, the chicken population is too scattered, which leads to slow convergence speed. The third indicator is the current iteration count t. If t is large, chickens' positions need to be updated finely in the late stage of the CSO algorithm. The normalization of t is defined as

nt = t/T.    (12)
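The three monitoring indicators can be computed as in the following sketch (NumPy; the function names are ours, and the sigmoid scaling by 10 follows the text):

```python
import numpy as np

def optimization_speed(fitness_history):
    """fitness_history: (G+1, N) array of per-chicken fitness over one cycle.
    v_G averages the per-iteration improvement f^{l-1} - f^l over chickens
    and iterations; nv squashes it with a sigmoid scaled by 10."""
    diffs = fitness_history[:-1] - fitness_history[1:]
    v_g = diffs.mean()
    return 1.0 / (1.0 + np.exp(-10.0 * v_g))

def aggregation(positions, aggr0):
    """Mean distance of chickens to the swarm centroid, normalized by the
    initial spread AGGR and clipped at 1."""
    mu = positions.mean(axis=0)
    aggr = np.linalg.norm(positions - mu, axis=1).mean()
    return min(aggr / aggr0, 1.0)

def iteration_ratio(t, t_max):
    """nt = t / T."""
    return t / t_max
```

Improving fitness (positive differences) pushes nv toward 1, stagnation leaves it near 0.5, and worsening fitness pushes it toward 0, which matches the intuition that large nv signals suspiciously fast convergence.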

Fuzzifiers and Defuzzifiers of the Fuzzy System.
In this paper, all three input variables are fuzzified by the Gaussian fuzzifier into three fuzzy sets each: low, medium, and high. For the optimization speed nv, the membership functions of low, medium, and high are defined as follows:

μ_low(nv) = exp(−(nv − 0)² / (2σ²)),    (13)

μ_medium(nv) = exp(−(nv − 0.5)² / (2σ²)),    (14)

μ_high(nv) = exp(−(nv − 1)² / (2σ²)).    (15)

Here, σ² is the variance, and in this work, σ = 0.04. The parameters p1 and p2 are the two intersection points of the three Gaussian curves whose centers μ are 0, 0.5, and 1, respectively, as described in Figure 2.
The two parameters also apply to equations (16) to (21).
Similarly, the membership functions of low, medium, and high for the aggregation of chickens are defined in equations (16)∼(18), and those for the iteration times nt in equations (19)∼(21); they take the same form as equations (13)∼(15). For the output variables of the fuzzy system, in order to expand the position updating span, the random factors Rand and FL share the value range [0, 3] in the FCSO algorithm, and the number of chickens N can take 0.7∼2 times the initial number of chickens. For these three output variables, too few fuzzy sets affect the accuracy of optimization, but too many sets lead to complex rule mappings that affect algorithm efficiency. We experimented with quantizing them into 3, 5, and 9 fuzzy sets, and the results showed that five fuzzy sets were a suitable choice: very low, low, medium, high, and very high. Rand is fuzzified into five fuzzy sets according to Gaussian membership functions of the same form as equations (13)∼(15), with centers corresponding to μ = 0, 0.25, 0.5, 0.75, and 1 (equations (22)∼(26)); FL adopts similar membership functions to Rand, so they are not repeated in this article.
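The Gaussian fuzzification above can be sketched as follows; the set centers follow the stated μ values (0/0.5/1 for three sets, 0/0.25/0.5/0.75/1 for five), and the five-set version assumes the output variable has first been scaled to [0, 1]:

```python
import numpy as np

def gaussian_mf(x, center, sigma=0.04):
    # Gaussian membership degree of x in the set centered at `center`
    return float(np.exp(-(x - center) ** 2 / (2.0 * sigma ** 2)))

def fuzzify(x, centers):
    # Map a normalized input in [0, 1] to a membership degree per fuzzy set
    return {name: gaussian_mf(x, c) for name, c in centers.items()}

THREE = {"low": 0.0, "medium": 0.5, "high": 1.0}
FIVE = {"very_low": 0.0, "low": 0.25, "medium": 0.5,
        "high": 0.75, "very_high": 1.0}
```

With σ = 0.04, adjacent curves intersect halfway between their centers (the points p1, p2, ... in Figures 2 and 3), where the two membership degrees are equal.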

Here, σ² is the variance of Rand with σ = 0.04, and p1, p2, p3, and p4 are the intersections among the five Gaussian curves centered at μ = 0, 0.25, 0.5, 0.75, and 1, respectively, as described in Figure 3.
In this work, we fuzzified the multiplier of the initial number of chickens into five fuzzy sets according to membership functions of the same form (equations (27)∼(31)). The current number of chickens is calculated by multiplying the defuzzified multiplier n by the initial number of chickens.
where σ² is the variance of the chicken number and the parameters σ, μ, p1, p2, p3, and p4 have the same settings as for Rand, as described in Figure 4. For the defuzzification of the output variables Rand, FL, and N, this work adopts the center average defuzzifier (CADefz) [18], which is formally defined as follows:

Defz = ( Σ_{l=1}^{M} c^l h^l ) / ( Σ_{l=1}^{M} h^l ),    (32)

where M represents the number of fuzzy sets and c^l and h^l, l = 1, ..., M, are the center and the height of the l-th fuzzy set, respectively. We use the following equation (33) to identify the center of a fuzzy set in the standard CADefz:

c^i ∈ S_α^i = { v | μ_FIEout(v) ≥ α }.    (33)

Here, μ_FIEout(v) is the membership function derived from the output of the fuzzy inference engine (FIE). The value of c^i is randomly selected from the α-cut set S_α^i, and we use the following equation (34) to identify h^i:

h^i = max_{ov_i ∈ S_α^i} μ_i(ov_i).    (34)

Here, h^i is defined as the maximum value of μ_i(ov_i), where ov_i is within the α-cut set S_α^i. In this work, the threshold value α in equation (33) equals 0.5 for Rand and FL, and for the number of chickens N, it is set to 0.7, approximately the membership values at p1, p2, p3, and p4.
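A minimal sketch of the center average defuzzifier and the randomized α-cut center selection described above (the helper names and the discretized candidate list are ours):

```python
import random

def center_average_defuzzify(centers, heights):
    """Center average defuzzifier, eq. (32): sum(c_l * h_l) / sum(h_l)."""
    return sum(c * h for c, h in zip(centers, heights)) / sum(heights)

def alpha_cut_center(candidates, memberships, alpha, rng=random.Random(0)):
    """Pick a center at random from the alpha-cut {v : mu(v) >= alpha},
    mirroring the randomized center selection of eq. (33). `candidates`
    is a discretized list of output values with their membership degrees."""
    cut = [v for v, m in zip(candidates, memberships) if m >= alpha]
    return rng.choice(cut)
```

The random choice within the α-cut (rather than a fixed peak) is what makes the paper's defuzzifier variant more flexible than the textbook CADefz.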

Fuzzy Rule Base and Fuzzy Inference Engine.
There are two functions that the fuzzy system performs: the first (as discussed in Section 3.4.1) uses the optimization speed nv and iteration times nt to deduce the random factors Rand and FL, and the second (as discussed in Section 3.4.2) uses the optimization speed nv and chicken aggregation naggr to derive the number of chickens N.

Fuzzy System for Adjusting Random Factors.
The random factors Rand and FL in the basic CSO algorithm are completely random. On the one hand, at the early stage of running the CSO algorithm, too small random factors mean that each moving step of the position update (see equations (3) and (6)) is very tiny, which makes it easy for the algorithm to fall into a local optimum and damages its global exploration capability. On the other hand, at the later stage, improper random factors (for example, large random factors) lead to too large moving steps, which hinder local refinement and affect the convergence speed. Thus, we attempt to balance the global and local searching abilities by adaptively adjusting the random factors according to the optimization speed nv and the current iteration times nt, as described in Figure 1.
When the FCSO algorithm runs in the early stage of iteration (small iteration times), it should have a wide searching scope in order to find the global optimum as far as possible. Thus, according to equations (3) and (6), the random factors should have a large range, which maps to the fuzzy sets very high, high, and medium. On the contrary, when the FCSO algorithm runs in the later stage of iteration (large iteration times), it should focus on local refined exploration, so the random factors should concentrate on a small range, which maps to the fuzzy sets very low, low, and medium. Similarly, low optimization speed means slow convergence, which generally indicates that the FCSO algorithm has fallen into a local optimal solution; enlarging the random factors in equations (3) and (6) can help FCSO jump out of the local optimum, so Rand and FL map to the fuzzy sets very high, high, and medium. On the contrary, high optimization speed means rapid convergence, and narrowing the random factors can help FCSO explore the global optimum finely, so they map to the fuzzy sets very low, low, and medium. When the iteration times or the optimization speed is medium, the random factors map to the fuzzy sets low, medium, and high. According to the above discussion, we derive the mapping rules in Table 1.
The fuzzy rule base is composed of fuzzy IF-THEN rules, each formulated as IF <FP1> THEN <FP2>, where FP1 and FP2 are fuzzy propositions. The rule base for adjusting the random factors contains a set of rules as in the following equation:

R_rf = { R_rf^r | r = 1, 2, ..., 9 },    (35)

where R_rf represents the whole rule set and R_rf^r is the r-th rule. For example, for the random factor Rand, the first rule R_rf^1 can be as follows: IF <nt is high and nv is high> THEN <Rand is very low>, as in Table 1.

Fuzzy System for Adjusting the Number of Chickens.
The number of chickens in the basic CSO algorithm is fixed, while in this work we adopt the fuzzy system to dynamically change the number of chickens according to the optimization speed and chicken aggregation defined in Section 3.2. Low optimization speed means slow convergence. In order not to get stuck in a local optimum, we should raise the population diversity by increasing the chicken number N, so we map the chicken number to the fuzzy sets very high, high, and medium. On the contrary, high optimization speed indicates rapid convergence. Reducing the chicken number alleviates this trend, so it maps to the fuzzy sets very low, low, and medium for the number of chickens. For the chicken aggregation, high aggregation means that individuals are too concentrated, which easily leads to a local optimal solution. It is then necessary to increase the number of chickens to maintain individual diversity, so when the chicken density is high, the number of chickens maps to the fuzzy sets very high, high, and medium. On the contrary, low aggregation indicates that we can decrease the number of chickens to enhance local exploration, so the number of chickens maps to the fuzzy sets very low, low, and medium. A medium optimization speed or aggregation maps to the fuzzy sets high, medium, and low for the number of chickens.
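Once the fuzzy system has produced a defuzzified multiplier, the current population size follows by scaling the initial count, clamped to the paper's stated range of 0.7∼2 times; this small helper (the name and the rounding are ours) sketches that step:

```python
def current_chicken_count(n_multiplier, n_init, lo=0.7, hi=2.0):
    """Scale the initial swarm size by the defuzzified multiplier,
    clamped to 0.7-2 times the initial number of chickens."""
    m = min(max(n_multiplier, lo), hi)
    return round(m * n_init)
```

When the count grows, new chickens would be initialized and ranked into roles at the next regrouping; when it shrinks, the worst-fitness individuals are the natural candidates to drop.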

According to the above discussion, the fuzzy rule base for adjusting the number of chickens is defined as follows:

R_n = { R_n^r | r = 1, 2, ..., 9 },    (36)

where R_n represents the whole rule set and R_n^r is the r-th rule. For example, R_n^1 can be defined as follows: IF <nv is high and naggr is high> THEN <N is medium>, as described in Table 2. In the fuzzy inference engines FIE_rf and FIE_n, the fuzzy logic principle is used to combine the fuzzy IF-THEN rules in R_rf or R_n into a mapping from the input variables to the output variables. In the FCSO algorithm, the fuzzy inference engine is of the Mamdani type [18], one of the most commonly used inference engines, which defines the fuzzy implication relation by taking the minimum over the Cartesian product of two fuzzy sets.
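A minimal Mamdani-style sketch of how such rules combine: each rule fires with the minimum of its antecedent memberships, and the fired consequents are blended by a center average. The rule entries and output-set centers below are illustrative, inferred from the discussion in Section 3.4.1, since Table 1 is not reproduced here.

```python
import numpy as np

IN3 = {"low": 0.0, "medium": 0.5, "high": 1.0}        # input-set centers
OUT5 = {"very_low": 0.0, "low": 0.75, "medium": 1.5,  # assumed centers on [0, 3]
        "high": 2.25, "very_high": 3.0}
RULES = {("high", "high"): "very_low",   # (nt, nv) -> Rand, excerpt only
         ("high", "low"): "medium",
         ("low", "low"): "very_high",
         ("low", "high"): "medium",
         ("medium", "medium"): "medium"}

def mu(x, center, sigma=0.04):
    return np.exp(-(x - center) ** 2 / (2.0 * sigma ** 2))

def infer_rand(nt, nv):
    """Mamdani min for rule firing strength, center average to defuzzify."""
    num = den = 0.0
    for (a, b), out in RULES.items():
        w = min(mu(nt, IN3[a]), mu(nv, IN3[b]))       # min over the product
        num += w * OUT5[out]
        den += w
    return num / den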

The FCSO Algorithm.
Roosters play an important role in the CSO algorithm and affect the optimization results of hens and chicks. In the basic CSO algorithm, the position update of a rooster is highly random, which strongly influences the convergence speed of the CSO algorithm, and CSO easily falls into a local optimum in the late iterations [15]. In this paper, we adopt the cosine function to adjust the rooster position; the periodic variation of the cosine function affects the update step length of a rooster's position in the later iteration stage, which can help CSO jump out of local optimal solutions. The modified update is defined as follows:

x_{i,j}^{t+1} = x_{i,j}^t × (1 + cos(πt/(2T)) × Randn(0, σ²)).    (37)

The update range of a rooster position narrows as the iterations increase, following the trend of the cosine function, which is conducive to the convergence of the CSO algorithm in the later running stage. The FCSO algorithm is given in Algorithm 2.
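The cosine-damped rooster update can be sketched as follows; note that the exact form of the modified equation is a reconstruction from the surrounding description (the damping factor cos(πt/(2T)) shrinks from 1 to 0 as t approaches T):

```python
import numpy as np

rng = np.random.default_rng(7)  # illustrative generator

def rooster_update_cos(x_i, sigma2, t, t_max):
    """Cosine-damped rooster update: cos(pi*t/(2*T)) falls from 1 at t=0
    to ~0 at t=T, narrowing the Gaussian search step in late iterations.
    The exact form is a reconstruction of the paper's modified equation."""
    damp = np.cos(np.pi * t / (2.0 * t_max))
    return x_i * (1.0 + damp * rng.normal(0.0, np.sqrt(sigma2), x_i.shape))
```

At t = T the multiplicative noise is damped to essentially zero, so the rooster's position is nearly frozen, which is the late-stage refinement behaviour the text describes.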

Experimental Results and Analysis
In this section, we first introduce the experimental environment and then make a comprehensive comparison on optimization accuracy and convergence speed. We compare the proposed FCSO with eight related algorithms along two directions: fundamental biointelligence algorithms and biointelligence algorithms equipped with fuzzy systems. One direction is to compare the FCSO with four fundamental biointelligence algorithms, including the basic CSO algorithm [4], the genetic algorithm (GA) [1], the particle swarm optimization (PSO) algorithm [2], and the artificial fish school (AFS) algorithm [3]. The other direction is to compare the FCSO with four biointelligence algorithms equipped with the fuzzy logic mechanism, including the fuzzy GA (FGA) algorithm [20], the fuzzy artificial fish school (FAFS) algorithm [21], the fuzzy particle swarm optimization (FPSO) algorithm [22], and the improved CSO (ICSO) algorithm [10].

Description of the Experimental Environment.
For fairness of comparison, each algorithm runs 10 times independently; the number of individuals in each algorithm is set to 50, and the iteration counts are set to 500 for the 10-dimensional space and 1000 for the 100-dimensional space, respectively. The parameters of the compared algorithms are consistent with the corresponding original works [1-4, 10, 19-21] in order to ensure the best performance of each algorithm; they are described in Table 3. In the experiments, the algorithms are compared on two qualities: accuracy and convergence speed. In terms of accuracy, we compute the minimum (MIN), maximum (MAX), mean (MEAN), and standard deviation (STD) of the fitness values over the 10 independent experiments, owing to the randomness of a single experiment's result. For convergence speed, we record the average of the best fitness at each iteration over the 10 independent experiments. All 9 compared algorithms are implemented in the Python programming language, and the software and hardware environments are as follows: the CPU is an Intel(R) Core(TM) i7-4800MQ @ 2.70 GHz, memory is 16.00 GB, the operating system is Windows 10 64-bit, the development platform is PyCharm, and the Python interpreter version is 3.7.
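The accuracy statistics over the independent runs can be computed as in the following straightforward sketch (the function name is ours):

```python
import numpy as np

def summarize_runs(best_fitness_per_run):
    """MIN/MAX/MEAN/STD of the best fitness over independent runs,
    matching the accuracy criteria used in the comparison."""
    a = np.asarray(best_fitness_per_run, dtype=float)
    return {"MIN": a.min(), "MAX": a.max(),
            "MEAN": a.mean(), "STD": a.std()}
```

Averaging over repeated runs is what makes the comparison meaningful for stochastic optimizers, since any single run of an SIOA can be unrepresentatively lucky or unlucky.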

Description of Benchmark Functions.
In this work, 30 black-box optimization benchmarking (BBOB) functions are adopted as fitness functions, which were proposed in the 2017 IEEE Congress on Evolutionary Computation (CEC 2017) [23]. The optimum values of the 30 BBOB functions (F0∼F29) range from 100 to 3000 with a step of 100, and they include unimodal functions, multimodal functions, hybrid functions, and composition functions. These BBOB functions are adopted to compare the accuracy and convergence speed of the algorithms.

Algorithm 2 (FCSO). Input: N, T, RN, HN, CN, MN, G, and α. Output: the best optimized solution. Steps: (1) initialize positions for chickens randomly; (2) compute the fitness value for each chicken, select the global best position of the population and the local best position of every chicken, and initialize the iteration count t = 1; then, in each iteration, (a) compute nv, nt, and naggr according to equations (8), (11), and (12), (b) perform fuzzification on nv, nt, naggr, Rand, FL, and N according to equations (13)∼(31), and (c) perform fuzzy inference according to the fuzzy rule bases R_rf in equation (35) and R_n in equation (36).

Figures S1∼S30 and S31∼S60 show the comparison results of convergence speed in the low- and high-dimensional spaces, respectively. The results in Tables A1 and A2 of Supplementary Material A are summarized in Table 4; the bold numbers represent the winners among the SIOAs according to the specific criterion, and the corresponding winning functions are shown in the brackets that follow. As described in Table 4, for the 10-dimensional space, FCSO achieves remarkably better results than the other eight SIOAs. Specifically, FCSO obtained 15, 17, 14, and 9 best results on the MAX, MIN, MEAN, and STD criteria, respectively. The second best algorithm is FAFS, with 7, 6, 8, and 4, respectively; the corresponding values for FGA are 5, 4, 4, and 1, and the other six algorithms have no advantages in the 10-dimensional space. In the 100-dimensional space, FCSO also performs better than the other eight algorithms. It obtained 12, 11, 12, and 10 best results on the MAX, MIN, MEAN, and STD criteria, respectively. The corresponding results of FGA are 7, 4, 6, and 5, respectively, and the best results of FAFS are 4, 8, and 5 on the MAX, MIN, and MEAN criteria, respectively; the other six algorithms still have no advantages in the 100-dimensional space. In addition, the fuzzy-based SIOAs obtained better results than the other compared SIOAs, which indicates that the fuzzy control mechanism can adaptively adjust parameters for complex optimization functions.

Results' Comparison on Accuracy and Stability.
In detail, in the 10-dimensional space, FCSO simultaneously obtained the best results on 12 functions (F1, F4, F6, F7, F13, F14, F18, F20, F22, F23, F24, and F26) on the MAX, MIN, and MEAN criteria. The second best algorithm, FAFS, only obtained the corresponding best results on 4 functions (F0, F12, F17, and F29), and FGA on 3 functions (F3, F8, and F19). The above results indicate that FCSO has the best accuracy compared with the other eight algorithms in the low-dimensional space. In addition, in the 10-dimensional space, FCSO and CSO have obvious advantages in stability over the other seven algorithms, because they obtained 9 and 6 best results on the STD criterion, respectively. In the 100-dimensional space, FCSO simultaneously obtained the best results on 11 functions (F2, F5, F6, F7, F8, F13, F15, F19, F20, F24, and F27) on the MAX, MIN, and MEAN criteria; the second best algorithm, FAFS, only obtained the best results on 3 functions (F0, F9, and F21) simultaneously on these criteria. FCSO also obtained the best results on 10 functions on the STD criterion, an obvious advantage over the other eight algorithms, which indicates that, in the 100-dimensional space, FCSO has the best stability among the nine algorithms. Based on the above analysis of the experimental results, we can conclude that FCSO has obvious advantages in accuracy compared with the other eight state-of-the-art SIOAs.

Efficiency Comparison and Analysis.
In terms of optimization efficiency, the convergence curves of the nine SIOAs in the 10- and 100-dimensional spaces are shown in Figures S1∼S30 and S31∼S60 of Supplementary Material B, respectively. In the 10-dimensional space, FCSO achieved the fastest convergence on all functions except F23 and F27, an obvious advantage over the other eight SIOAs. GA, AFS, and CSO converge slowly on most functions. In the 100-dimensional space, FCSO achieved the fastest convergence on all the functions, while GA, FGA, AFS, and FAFS are slow on almost all of them. In addition, the convergence speed of CSO in the high-dimensional space is faster than that in the low-dimensional space, which indicates that CSO-based algorithms are suitable for solving high-dimensional problems. Although FGA and FAFS perform well in the 10-dimensional space, they converge slowly in the 100-dimensional space. The proposed FCSO algorithm converges fast in both the 10- and 100-dimensional spaces on almost all the functions.

Empirical Analysis of Experimental Results.
According to the analysis in Sections 4.3.1 and 4.3.2, we can conclude that the FCSO algorithm has obvious advantages in accuracy, stability, and optimization efficiency compared with the other eight SIOAs. For accuracy, the basic CSO algorithm is no better than the other SIOAs, while the proposed FCSO performs very well because the fuzzy system (see Section 3) has been incorporated into the CSO algorithm to dynamically tune the parameters and thus adaptively balance the exploitation and exploration abilities. Compared with PSO, AFS, CSO, GA, and ICSO, which adopt fixed empirical constants as parameters, FCSO, FGA, and FAFS have obvious accuracy advantages because the fuzzy system can dynamically adjust the parameters according to the optimization problem. FCSO adopts the grouping and regrouping mechanisms, where roosters, hens, and chicks have different evolution strategies and the identities of individuals are changed periodically, so it obtains better solutions more easily than FGA and FAFS, whose evolutionary strategies are relatively single and static. For stability, FCSO obtained the best results on the STD criterion in both the 10- and 100-dimensional spaces; part of the reason lies in the intrinsic learning mechanisms of CSO (which obtained the best stability apart from FCSO), including the diverse learning strategies of the chicken swarm and the grouping and regrouping mechanism. Another important reason is the adaptive parameter tuning mechanism in FCSO, which can mitigate the influence of random position initialization. For optimization efficiency, as described in Section 3.5, the cosine function has been adopted to adjust rooster positions, which strongly influence the evolutionary process of CSO. The update step length of a rooster position is narrowed as the iterations increase, following the trend of the cosine function, which is conducive to the convergence of the CSO algorithm in its later running stage.
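The cosine-based step-length control described above can be sketched as follows. The paper's exact update expression (its Section 3.5) is not reproduced here, so both `cosine_step_factor` and the Gaussian perturbation form are illustrative assumptions, not the authors' formula:

```python
import math
import random

def cosine_step_factor(t, T):
    # Step-length factor that shrinks as iterations grow, following the
    # decreasing trend of cos on [0, pi/2]; t is the current iteration,
    # T the maximum number of iterations (hypothetical decay form).
    return math.cos(math.pi * t / (2 * T))

def update_rooster(pos, t, T, sigma=1.0):
    # Rooster position update sketch: a Gaussian perturbation scaled by
    # the cosine factor, so later iterations take smaller, more local steps.
    w = cosine_step_factor(t, T)
    return [x * (1.0 + w * random.gauss(0.0, sigma)) for x in pos]
```

With this shape, the factor starts at 1 and decays smoothly to 0 at the final iteration, matching the narrowing-step behavior the section attributes to the cosine function.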

Time Complexity Analysis for Compared Algorithms.
As described in Section 2, N, D, and T represent the number of chickens, the number of dimensions, and the maximum number of iterations, respectively. Supposing that TFZ represents the time complexity of the fuzzy system in all the compared fuzzy-based algorithms, the time complexity of each compared algorithm is summarized in Table 5.

Statistical Tests for Algorithm Comparison
In this work, we consider two statistical tests, the Friedman test [24] and the Nemenyi test [24]. A Friedman test is constructed to analyse the performance of the compared SIOAs. Table 6 provides the Friedman statistic FF and the corresponding critical value for each evaluation criterion. As shown in Table 6, the null hypothesis (that all of the compared algorithms perform equivalently) was clearly rejected for each evaluation criterion at a significance level of α = 0.05 for the experimental results in both the 10- and 100-dimensional spaces. Consequently, we proceed to conduct a post hoc test [24] to analyse the relative performance among the compared SIOAs. The Nemenyi test [24] is used to test whether each of the SIOAs, including the proposed FCSO, performed competitively against the others in both the 10- and 100-dimensional spaces. In the test, the performance of two SIOAs is considered significantly different if their average ranks differ by at least the critical difference CD = qα√(k(k + 1)/6N). For this test, qα equals 3.102 at the significance level α = 0.05, and thus CD takes the value 2.1934 (k = 9 and N = 30). Figure 5 shows the CD diagrams for each of the four evaluation criteria for the experimental results in the 10-dimensional space. Any compared SIOA whose average rank is within one CD of that of FCSO is connected to FCSO with a red line, as described in Figure 5. The algorithms that are unconnected to FCSO are considered to have a significantly different performance from it. In MAX (Figure 5(a)), for example, the average rank of FCSO was 1.9333, and the critical value would be 4.1267 after adding CD. Since PSO, GA, FPSO, CSO, FGA, AFS, and FAFS obtained 6.6667, 6.5, 6.2, 5.6, 5.0667, 4.8, and 4.1333 as their respective average ranks, they were significantly worse than FCSO.
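The critical-difference computation above can be reproduced directly; the values qα = 3.102, k = 9 algorithms, and N = 30 functions come from the text:

```python
import math

def nemenyi_cd(q_alpha, k, n):
    # Nemenyi critical difference: CD = q_alpha * sqrt(k*(k+1) / (6*N)),
    # where k is the number of compared algorithms and N the number of
    # benchmark problems.
    return q_alpha * math.sqrt(k * (k + 1) / (6.0 * n))

def significantly_worse(avg_rank, best_rank, cd):
    # Two algorithms differ significantly if their average ranks
    # differ by at least CD.
    return avg_rank - best_rank >= cd

cd = nemenyi_cd(3.102, 9, 30)  # ≈ 2.1934, as reported in the text
```

For example, with FCSO's average rank of 1.9333 on MAX, PSO's rank of 6.6667 exceeds 1.9333 + CD = 4.1267, so the difference is significant.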
From Table 5, we can see that FCSO obtained the best average ranks on all four criteria and clearly outperforms the other SIOAs. Figure 6 shows the CD diagrams for each of the four evaluation criteria for the experimental results in the 100-dimensional space. FCSO obtained the best average rank on the MAX and STD criteria and the second and third best average ranks on the MEAN and MIN criteria, respectively. Although FAFS obtained the best average ranks on the MIN and MEAN criteria, its results are not stable (it only ranked fifth on the STD criterion), while FCSO is the most stable algorithm in both the 10- and 100-dimensional spaces.

Optimization on Engineering Problems
The CSO algorithm has been widely applied in various scientific and engineering domains. In the wireless sensor network (WSN) field, Fouad et al. [25] used it to optimize the topological structure of the WSN, and Al Shayokh and Shin [26] and Yu et al. [27] applied the CSO algorithm to the localization problem of the WSN. In the cloud computing domain, CSO has been used to solve the deadlock-free migration of virtual machines (VMs) [28] and the dynamic task scheduling problem [29]. In the military field, CSO has been applied to optimize the reentry trajectory of warheads [30] and the ascent trajectory of hypersonic vehicles [31]. In the data mining domain, it has been used to optimize the K-means clustering algorithm [32], the adaptive neurofuzzy inference system [33], and the feature selection problem [34]. CSO has also been applied in many more fields, including architecture [35], transport [36], mechanical engineering [37], environmental protection [38], power [39], and robotics [40].
Many real-world engineering optimization problems (EOPs) are very complex in nature and carry many constraints. It is difficult for traditional optimization methods (TOMs) to deal with objective functions and constraints with multiple or sharp peaks due to their instability [1]. In contrast, SIOAs outperform TOMs in solving EOPs. In this section, we apply the proposed FCSO algorithm to four types of constrained EOPs to further evaluate its performance. These engineering optimization problems are often used to test the performance of SIOAs [11, 15]. In what follows, we first introduce the four engineering designs and their constraints, then present the experimental results of applying CSO and FCSO to these problems and briefly compare their performance.

Description of Engineering Optimization Problems.
In this section, four classic engineering design problems are discussed: tension/compression spring design [1], pressure vessel design [2], three-bar truss design [3], and cantilevered beam design [4]. The four engineering optimization problems are illustrated in Figure 7.
Problem 1. (Tension/compression spring design). The design, as shown in Figure 7(a), aims to minimize the weight of a tension/compression spring. The constraints required in the design include the minimum deflection, the shear stress, the surge frequency, and the limit on the outer diameter. There are three design variables, the average coil diameter x1, the wire diameter x2, and the number of active coils x3, which together define the following problem (named F30):
Problem 2. (Pressure vessel design). The design of a cylindrical vessel capped at both ends by hemispherical heads, as shown in Figure 7(b), is a hybrid constrained optimization problem whose aim is to minimize the total cost of welding, materials, and forming. The four variables describing this problem are the thickness of the cylindrical skin (Ts), the thickness of the spherical head (Th), the inner radius (R), and the length of the cylindrical segment of the vessel (L). The thickness of the cylinder can only take values that are integral multiples of 0.0625. This problem (named F31) is defined in formula (39), where the above four variables are represented by x1, x2, x3, and x4, respectively.
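The paper's formula for Problem 1 is not reproduced above, but the tension/compression spring design has a standard formulation in the literature; the following sketch uses that common form, with D, d, and N corresponding to the paper's x1 (coil diameter), x2 (wire diameter), and x3 (number of active coils). The specific constraint coefficients are from the literature formulation, not from this paper:

```python
def spring_weight(d, D, N):
    # Objective of the tension/compression spring problem in its common
    # literature form: f = (N + 2) * D * d^2 (spring weight).
    return (N + 2) * D * d * d

def spring_constraints(d, D, N):
    # Constraints in g_i(x) <= 0 form: deflection, shear stress,
    # surge frequency, and outer-diameter limit (literature coefficients).
    g1 = 1 - (D**3 * N) / (71785 * d**4)
    g2 = (4*D**2 - d*D) / (12566 * (D*d**3 - d**4)) + 1 / (5108 * d**2) - 1
    g3 = 1 - 140.45 * d / (D**2 * N)
    g4 = (D + d) / 1.5 - 1
    return [g1, g2, g3, g4]

def is_feasible(d, D, N):
    # A design is feasible when every constraint is satisfied.
    return all(g <= 0 for g in spring_constraints(d, D, N))
```

A near-optimal design in the literature is roughly d ≈ 0.0517, D ≈ 0.357, N ≈ 11.3, giving a weight around 0.0127.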
Problem 3. (Three-bar truss design). The objective of this design is to minimize the volume of a statically loaded three-bar truss. The truss is subjected to vertical and horizontal forces, and its volume is subject to stress constraints. As shown in Figure 7(c), the cross sections of bars A1 (or A3) and A2 are denoted as x1 and x2 in formula (40), which defines the problem (named F32), where 0 ≤ x1 ≤ 1, 0 ≤ x2 ≤ 1, l = 100 cm, P = 2 kN/cm2, and σ = 2 kN/cm2.
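Formula (40) is not reproduced above; the three-bar truss problem is commonly stated in the literature as below, and this sketch uses that standard form with the constants l, P, and σ taken from the text:

```python
import math

L_BAR, P, SIGMA = 100.0, 2.0, 2.0  # l (cm), P and sigma (kN/cm^2), from the text

def truss_volume(x1, x2):
    # Objective in its common literature form: f = (2*sqrt(2)*x1 + x2) * l,
    # the volume of the statically loaded three-bar truss.
    return (2 * math.sqrt(2) * x1 + x2) * L_BAR

def truss_constraints(x1, x2):
    # Stress constraints g_i(x) <= 0 on the three bars (literature form).
    s = math.sqrt(2)
    g1 = (s * x1 + x2) / (s * x1**2 + 2 * x1 * x2) * P - SIGMA
    g2 = x2 / (s * x1**2 + 2 * x1 * x2) * P - SIGMA
    g3 = 1.0 / (x1 + s * x2) * P - SIGMA
    return [g1, g2, g3]
```

The best-known solution in the literature is approximately x1 ≈ 0.7887, x2 ≈ 0.4082, with a volume near 263.9.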
Problem 4. (Cantilevered beam design). The cantilevered beam design problem is described in Figure 7(d); its goal is to determine the best combination of five different cross-section areas to minimize the volume of the cantilever beam. Each cross section has a width and a height, giving ten variables in total, denoted as hi and bi (i = 1, . . . , 5); see Figure 7(d). The constraints on the design are briefly as follows: the maximum allowable stress at the left end of each part is σmax = 14000 N/cm2, the material modulus of elasticity is E = 200 GPa, the length of each section li (i = 1, . . . , 5) is 100 cm, the maximum allowable deflection is ymax = 2.715 cm, and the height-to-width ratio of each cross section is restricted to be less than 20. This optimization problem (named F33) can be defined as follows:
Figure 6: Comparison of FCSO (control algorithm) against the other compared algorithms using the Nemenyi test for the experimental results in the 100-dimensional space.

Experimental Results' Analysis of Engineering Optimization Problems.
Previously, we presented the constraint settings for the four engineering design problems, which amount to seeking optimal solutions to problems F30, F31, F32, and F33. Because FCSO and CSO were proposed to handle unconstrained optimization problems, for the above four constrained engineering problems, after each evolution operation, the individuals in FCSO and CSO judge whether their solutions satisfy the constraints of the engineering problem; if not, they are re-evolved until the constraints are satisfied. The settings of the experimental environment are the same as described in Section 4.1. In this section, we only discuss the comparison of the CSO and FCSO algorithms. Four criteria, MAX, MIN, MEAN, and STD, are selected for performance comparison, and their results are derived from 10 independent runs, which comprehensively reflect optimization accuracy and stability. As shown in Table 7, the FCSO algorithm has an obvious advantage over the CSO algorithm on all four criteria for all four engineering problems.
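The re-evolve-until-feasible repair described above can be sketched as follows. Here `evolve` is a hypothetical placeholder for the actual rooster/hen/chick position-update rules, and the retry cap is an assumption (the paper does not state one):

```python
import random

def evolve(position, step=0.1):
    # Placeholder evolution operator (hypothetical): in CSO/FCSO this
    # would be the rooster, hen, or chick position-update rule.
    return [x + random.uniform(-step, step) for x in position]

def constrained_evolve(position, constraints, max_retries=1000):
    # Re-evolve an individual until the candidate satisfies every
    # constraint g_i(x) <= 0, as described for the four engineering
    # problems; fall back to the current position if no feasible
    # candidate is found within max_retries.
    for _ in range(max_retries):
        candidate = evolve(position)
        if all(g(candidate) <= 0 for g in constraints):
            return candidate
    return position
```

Starting each individual from a feasible position keeps the fallback safe: the swarm never leaves the feasible region even if every retry fails.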

Conclusions and Future Work
As has already been demonstrated, SIOAs possess a great advantage in solving NP-hard problems [28]. As a newly proposed SIO algorithm, CSO has been widely used in different applications in recent years owing to its good characteristics, for example, its high optimization accuracy [9]. However, generally speaking, the CSO algorithm inherently suffers from slow convergence speed, causing long execution times, and from easily falling into local optima, which makes it hard to produce a globally optimal solution. In this paper, we propose a fuzzy chicken swarm optimization method, FCSO, which improves the standard CSO algorithm by applying a fuzzy system to adaptively adjust the number of chickens and the random factors. The major contributions of this paper to the field of swarm intelligence include the following three aspects. Firstly, a fuzzy system has been introduced into the CSO algorithm for the first time to adaptively adjust the random factors and the number of chickens based on three monitored indicators: optimization speed, chicken aggregation, and iteration times. The introduced fuzzy system adaptively balances the exploration and exploitation capabilities of the CSO algorithm, overcoming its two major drawbacks. Secondly, the cosine function has been integrated into the position update of roosters, which mitigates the randomness of the original strategy and makes the local exploration more accurate as the iteration count increases. Thirdly, we designed a collection of comprehensive experiments in both low- and high-dimensional spaces with nine algorithms on 30 BBOB functions to show that the proposed FCSO has advantages in accuracy and convergence speed. Owing to these obvious advantages, FCSO can be applied to both constrained (as described in Section 6) and unconstrained optimization problems, especially problems requiring high stability and rapid convergence.
Although FCSO achieved better results than the other SIOAs in both the 10- and 100-dimensional spaces, it did not perform very well on some composition functions (for example, F26, F28, and F29) in the 100-dimensional space, which indicates that the fuzzy system needs to be further improved to adapt to high-dimensional complex optimization problems.
In the future, efforts are required to improve and extend the proposed method. First, in this work we only integrate one kind of cosine function into the rooster position update; cosine functions with different periods and other trigonometric functions should be studied in depth to determine more clearly the timing and extent of step-length adjustment for different optimization problems, since a proper step length can enhance optimization accuracy and convergence. Second, the membership functions in this work are constructed from experience, and the influence of different membership functions on the results of FCSO needs further discussion; for different optimization problems, it is necessary to study how to choose the most suitable membership functions, which can improve the performance of the fuzzy system. Third, this work mainly focuses on adaptively adjusting the parameters of the CSO algorithm; the learning strategies of FCSO can be further improved by hybridizing with other SIOAs, such as the differential evolution (DE) algorithm [41, 42], and the proposed FCSO can be applied to more practical optimization problems [43, 44].

Data Availability
All the test data used and the generated results are included within the manuscript.

Conflicts of Interest
The authors declare that they have no conflicts of interest.