A Complex-Valued Encoding Multichain Seeker Optimization Algorithm for Engineering Problems

This article proposes a complex-valued encoding multichain seeker optimization algorithm (CMSOA) for engineering optimization problems. The complex-valued encoding strategy and the multichain strategy are introduced into the seeker optimization algorithm (SOA). These strategies enhance the diversity of individuals, strengthen the local search, avert falling into local optima, and are effective global optimization strategies. This article chooses fifteen benchmark functions, four proportional integral derivative (PID) control parameter models, and six constrained engineering problems as tests. According to the experimental results, the CMSOA is effective on the benchmark functions, on PID control parameter optimization, and on constrained engineering problems. Compared to the particle swarm optimization (PSO), simulated annealing based on genetic algorithm (SA_GA), gravitational search algorithm (GSA), sine cosine algorithm (SCA), multiverse optimizer (MVO), and seeker optimization algorithm (SOA), the optimization ability and robustness of the CMSOA are better.

However, some optimization algorithms are still not very successful on many optimization problems. Typical issues include premature convergence, low optimization precision, convergence to a local optimum only, slow convergence speed, and insufficient robustness. To better overcome these issues, some improved algorithms have proven to be feasible optimization algorithms and have been used in practical engineering. For instance, evolutionary algorithms have been improved by adaptive parameter control methods [17]. A simulated annealing algorithm based on the particle swarm algorithm was adopted to optimize the extraction of multiple tests [18]. A whale optimization algorithm based on a hybrid algorithm framework with learning and complementary strategies was applied to function optimization and engineering design problems [19]. A multilayered gravitational search algorithm was used for function optimization and real-world problems [20]. An artificial bee colony algorithm was improved by scale-free networks [21]. A chaotic local search-based differential evolution algorithm was applied to function optimization and real-world optimization problems [22].
Also, complex-valued encoding heuristic algorithms have been proposed according to the characteristics of some algorithms.
These complex-valued encoding intelligent optimization algorithms have proven to be feasible optimization algorithms and have been used in practical engineering. For instance, the complex-valued encoding dragonfly algorithm optimized power systems [23]. A gray wolf optimization based on plural encoding optimized the filter model [24]. The complex-valued encoding satin bowerbird optimization algorithm solved the benchmark functions [25]. The complex-valued encoding-driven optimization optimized the 0-1 knapsack problem [26]. The complex-valued encoding symbiotic organism search algorithm was proposed for global optimization [27]. The complex-valued encoding flower pollination algorithm optimized constrained engineering optimization problems [28]. A comprehensive survey was offered for complex-valued encoding metaheuristic optimization algorithms [24]. Dai et al. proposed the SOA in 2006 [29]; the goal is to mimic the seekers' behavior and the way they exchange information in order to solve practical application optimization problems. Recently, the SOA has been used in many fields, such as unconstrained optimization problems [30], optimal reactive power dispatch [31], a challenging set of benchmark problems [32], the design of a digital filter [33], optimizing parameters of artificial neural networks [34], optimizing the model and structure of fuel cells [35], the novel human group optimizer algorithm [36], and several practical applications [37].
However, although the SOA converges faster than other algorithms in the initial stage of an optimization problem, once all individuals are near the best individual, the population loses diversity and falls into premature convergence.
To overcome the shortcomings of the SOA, there are various strategies for improving it, such as the best empirical parameter strategy, the dynamic adaptive Gaussian variation of the empirical parameter strategy, the Chebyshev chaos of order three strategy, the real coding double-link strategy, the complex-valued encoding strategy, and the complex-valued encoding multichain strategy. After improving the SOA with the above strategies and making an experimental comparison, this paper selects several improved strategies with better results to improve the SOA together. In this article, complex number coding and a multichain strategy are used to enhance the global optimization and the local search. We propose the complex-valued encoding multichain seeker optimization algorithm (CMSOA). The multichain strategy includes the complex-valued multichain and the stochastic complex multichain strategy. The CMSOA has been tested on fifteen benchmark functions, four PID control parameter optimizations, and six engineering optimizations taken from the literature. In comparison with the PSO, SA_GA, GSA, SCA, MVO, and SOA, the CMSOA finds better values for these problems, and its precision and robustness are better. The complex-valued encoding and the multichain methods enhance the diversity of individuals and avert premature convergence. The CMSOA overcomes the premature convergence of the SOA. The advantages of the CMSOA are summed up as follows: (1) the CMSOA is proposed to enhance the precision and robustness of optimization. (2) With the complex-coded multichain strategy, in complex-valued coding, the real part, imaginary part, and real number are used as parallel individual variables to solve the objective function problem. (3) The stochastic multichain strategy is introduced into the SOA.
According to the initial solution generation rule of complex number coding, the real part, the imaginary part, and the real number are randomly generated as parallel individual variables to solve the objective function. (4) The complex-coded strategy, the multichain strategy, and the stochastic multichain strategy improve the diversity of individuals, enhance the local search, and avert premature convergence.
The rest of the article is organized as follows. Section 2 presents the SOA and the algorithm improvement strategies. Section 3 describes the CMSOA. Section 4 shows the algorithm optimization experiments, the results, and the analyses. Finally, Section 5 gives some conclusions.

2. The Basic SOA and Algorithm Improvement Strategies

The SOA carries out in-depth research on human search behavior. It treats optimization as a search for an optimal solution by a search team in the search space, taking the search team as the population and the site of each searcher as a candidate solution. The "experience gradient" determines the search direction, and uncertainty reasoning determines the search step size; the search direction and search step size together update the searchers' positions in the search space and thereby drive the optimization.

2.1. Key Update Points for SOA.

The SOA has three main updating steps. In this section, i denotes the ith searcher individual and j the dimension index; s is the total number of individuals; D is the total number of dimensions of the variable; t denotes the current generation; and iter_max represents the maximum number of generations. x_ij(t) and x_ij(t + 1), respectively, represent the searcher's position at generations t and (t + 1).

2.1.1. Search Direction.

The forward direction of a search is determined by the experience gradient obtained from the individual's own movement and from evaluating the historical search positions of other individuals.
The egoistic direction f_i,e(t) and the altruistic direction f_i,a(t) are derived from these gradients. The searcher uses the method of a random weighted average to obtain the search direction.
where best is the historical optimal position in the neighborhood of the ith searcher; p_i,best is the historical optimal position of the ith searcher up to the current position; ψ1 and ψ2 are random numbers in [0, 1]; and ω is the inertia weight.
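The random weighted average above can be sketched as follows. This is an illustrative sketch, not the paper's exact formulas (1)-(2): the function name `search_direction` and the sign-based empirical directions are assumptions.

```python
import numpy as np

def search_direction(x_i, p_best_i, best, rng=np.random.default_rng()):
    """Sketch of one seeker's search direction (hypothetical names).

    The egoistic direction points toward the seeker's own best position
    p_best_i, the altruistic direction toward the neighborhood best, and
    the two are blended by a random weighted average with psi1, psi2 in [0, 1].
    """
    f_ego = np.sign(p_best_i - x_i)   # egoistic direction
    f_alt = np.sign(best - x_i)       # altruistic direction
    psi1, psi2 = rng.random(), rng.random()
    # random weighted average of the empirical directions
    return np.sign(psi1 * f_ego + psi2 * f_alt)
```

Each component of the result is -1, 0, or +1, indicating whether the seeker should move down, stay, or move up in that dimension.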

2.1.2. Search Step Size.
The SOA draws on fuzzy reasoning and its approximation ability. Through computer language, the SOA describes some of the natural human language that can simulate intelligent human reasoning during search. If the algorithm expresses a simple fuzzy rule, it adapts to the best approximation of the objective optimization problem. A greater fitness corresponds to a greater search step length, and a smaller fitness corresponds to a smaller search step length.
The Gaussian distribution function is adopted to describe the search step size.
where α and δ are parameters of the membership function. According to equation (3), the probability of the output variable exceeding [−3δ, 3δ] is less than 0.0111. Therefore, μ_min = 0.0111. Under normal circumstances, the optimal position of an individual has μ_max = 1.0 and the worst position has 0.0111. However, to accelerate convergence and give the optimal individual an uncertain step size, μ_max is set to 0.9 in this paper. The following function is selected as the fuzzy variable with a "small" objective function value: where μ_ij is determined by equations (4) and (5), and I_i is the rank of the current individual in the sequence x_i(t) arranged from high to low by function value. The function rand(μ_i, 1) returns a real number in the interval [μ_i, 1]. As can be seen from equation (4), this simulates the random search behavior of human beings.
The step size in the j-dimensional search space is determined by where δ_ij is a parameter of the Gaussian distribution function, which is defined by where ω is the inertia weight. As the number of generations increases, ω decreases linearly from 0.9 to 0.1. x_min and x_max are, respectively, the positions of the individuals with the minimum and maximum function values.
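A minimal sketch of the step-size computation described by equations (4)-(8), assuming the standard SOA forms of the linear "small" membership and the inverted Gaussian (the function and parameter names are hypothetical):

```python
import numpy as np

def step_size(rank, s, x_best, x_worst, omega, mu_max=0.9, mu_min=0.0111,
              rng=np.random.default_rng()):
    """Sketch of the SOA fuzzy step size for one seeker (hypothetical names).

    rank:    position I_i of the seeker when all s individuals are sorted
    s:       population size; omega: inertia weight (0.9 -> 0.1 linearly)
    x_best, x_worst: positions of the best/worst individuals (arrays)
    """
    # linear "small" membership: best individual -> mu_max, worst -> mu_min
    mu_i = mu_max - (rank - 1) * (mu_max - mu_min) / (s - 1)
    mu_ij = rng.uniform(mu_i, 1.0, size=x_best.shape)  # rand(mu_i, 1)
    delta = omega * np.abs(x_best - x_worst)           # Gaussian parameter
    # invert the Gaussian membership function to obtain the step length
    return delta * np.sqrt(-np.log(mu_ij))
```

The resulting step α_ij then feeds the position update of equation (9), x_ij(t + 1) = x_ij(t) + α_ij(t) f_ij(t).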

2.1.3. Individual Location Updates.
After obtaining the search direction and search step size of the individual, the location update is represented by

x_ij(t + 1) = x_ij(t) + α_ij(t) f_ij(t), (9)

where f_ij(t) and α_ij(t), respectively, represent the searcher's search direction and search step size at time t.

2.2. The Algorithm Improvement Strategies.

Five strategies for improving the algorithm are listed in this paper.

2.2.1. The Best Empirical Parameter Strategy. The first strategy is an empirical parameter change strategy. In the basic SOA, equation (8) is changed to equation (10), and the empirical value C is set to a fixed empirical value.
Through a large number of experimental tests, the empirical value is set to C = 0.2. The individual position update is still the same as equation (9).
where δ_ij is a parameter of the Gaussian membership function [38, 39] and x_min is the position variable of the minimum value of the function.

2.2.2. The Dynamic Adaptive Gaussian Variation of the Empirical Parameter. In the SOA, equation (8) is changed to equation (11), and the empirical value C_1 is changed to an adaptive empirical value that varies between 0.1 and 0.5 with the optimization generation according to equation (12). The individual position update is still the same as equation (9).
where δ_ij is a parameter of the Gaussian membership function [38, 39] and x_min is the position variable of the minimum value of the function.

2.2.3. The Chebyshev Chaos of Order Three.

The Chebyshev map of order ω is defined as x_ij(t + 1) = cos(ω · arccos(x_ij(t))), where, when ω ≥ 2, x_ij is chaotic and ergodic and has orthogonality. In this case, no matter how close different initial values are, the sequences derived from multiple iterations are not correlated with each other.
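The Chebyshev map above can be sketched as follows; the helper name `chebyshev_sequence` is hypothetical, and the iterate must start in [−1, 1] so that arccos is defined:

```python
import math

def chebyshev_sequence(x0, order=3, n=5):
    """Chebyshev chaotic map x_{k+1} = cos(order * arccos(x_k)), x in [-1, 1].

    For order >= 2 the sequence is chaotic and ergodic; two arbitrarily
    close initial values quickly produce uncorrelated sequences.
    """
    xs = [x0]
    for _ in range(n):
        xs.append(math.cos(order * math.acos(xs[-1])))
    return xs
```

Because every iterate stays in [−1, 1], the sequence can be rescaled to any variable interval [A_k, B_k] when used for population initialization.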

2.2.4. The Multichain Strategy/The Double-Chain Strategy.

The multichain strategy includes taking the real and imaginary parts of the complex number as separate parallel solutions and randomly generating parallel solutions according to the complex number coding law.
In this paper, the multichain strategy means that a single individual variable in the original SOA is converted into six parallel individual variables when the CMSOA optimizes a problem. In complex-valued coding, there are the real part X_R, the imaginary part X_I, and the real number X_K. In each iteration, X_R, X_I, and X_K are adjusted to meet the scope of X (X_min = A_k, X_max = B_k) and are taken as relative optimal solution variables, respectively, to solve the objective function. Secondly, a group of randomly generated variables X_R_Random, X_I_Random, and X_K_Random that meet the scope of X (X_min = A_k, X_max = B_k) is added in each optimization cycle according to formulas (9)-(11) and is likewise used to solve the objective function. At the end of each cycle, the respective optimal solutions are saved, and the global optimal value is saved as the current optimal value after comparing the optima of the chains. The next-generation optimal solution variables of X_R, X_I, and X_K are updated according to formulas (13)-(15), and those of X_R_Random, X_I_Random, and X_K_Random are generated randomly according to formulas (9)-(11). In other words, a single individual variable X in the original SOA is converted into six individual variables X_R, X_I, X_K, X_R_Random, X_I_Random, and X_K_Random when solved by the CMSOA, as shown in Figure 1. So, instead of solving for one main chain, six parallel chains are solved. The multichain strategy increases the diversity of individuals, enhances the local search, and averts premature convergence.
For the SOA with real number coding, the real number coding forms one chain, and the randomly generated real number population forms another chain. So, a double-chain is made up of a real number coding chain and a randomly generated real number chain.
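The six-chain bookkeeping described above can be sketched as follows. This is an illustrative sketch under assumed names (`multichain_candidates`, `best_of_chains`, bounds `a`, `b`); it shows only how the six candidate solutions are formed and compared, not the SOA update itself:

```python
import numpy as np

def multichain_candidates(z, x_k, a, b, rng=np.random.default_rng()):
    """Sketch of the six parallel chains used by the multichain strategy.

    z:    complex-valued individual (array); x_k: real-number individual
    a, b: lower/upper bounds of the original variable X
    Returns the six candidate solutions evaluated in one iteration.
    """
    clip = lambda x: np.clip(x, a, b)
    return [
        clip(z.real),                     # X_R: real-part chain
        clip(z.imag),                     # X_I: imaginary-part chain
        clip(x_k),                        # X_K: real-number chain
        rng.uniform(a, b, size=z.shape),  # X_R_Random
        rng.uniform(a, b, size=z.shape),  # X_I_Random
        rng.uniform(a, b, size=z.shape),  # X_K_Random
    ]

def best_of_chains(chains, objective):
    """Keep the chain with the best (smallest) objective value."""
    return min(chains, key=objective)
```

The chain with the lowest objective value becomes the current global optimum, which is exactly the comparison step described in the paragraph above.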

2.2.5. The Complex-Valued Encoding

(1) Initial Population Generation. In light of the variable interval [A_k, B_k], k = 1, 2, . . ., 2s − 1, 2s, the modules ρ_k, the phase angles θ_k, and the complex numbers are produced [40] as follows: (2) Individual Location Updates. The real part is updated by where f_R represents the search direction of the real part, α_R is the search step size of the real part, and X_R represents the location of the real part. The imaginary part is updated by where f_I represents the search direction of the imaginary part, α_I is the search step size of the imaginary part, and X_I represents the location of the imaginary part.
(3) Fitness Evaluation Method. When calculating fitness values using the SOA, the complex numbers are converted to real numbers. The formulas are as follows.
(1) Take the modulus of the complex number as the real number magnitude: (2) Define the sign according to the phase angle: where X_k is the real number. A double-chain represents a chromosome pair, and the individuals that make up the double-chain have the same length. The double-chain framework enhances the diversity of individuals and gives the algorithm better searching and calculation capacity. The CMSOA is based on a multiple-population evolution model: three populations evolve by the SOA, and three other populations are generated randomly. The populations use information-sharing mechanisms to realize coevolution. Algorithm 1 shows the primary process of the CMSOA.

Experimental Setup.

The algorithms used in the experiments were run under MATLAB R2016a. The computer is configured with an Intel(R) Core(TM) i7-7500U CPU @ 2.7 GHz 2.9 GHz and 8 GB of memory, and the operating system is Windows 10.

Algorithm Performance Comparison in Benchmark Functions.

To ensure that the comparison of these algorithms is fair, the population size of each algorithm is 30 and the number of generations is 1000. At the same time, to further ensure fairness and reduce the effect of randomness, the results of the seven algorithms over 30 independent runs were selected for comparison.

The Benchmark Functions.

In this field, it is common to evaluate the capability of algorithms on mathematical functions whose global optima are known. Fifteen benchmark functions from the literature are used as the comparative test platform [7, 10, 44-46]. Table 1 shows the functions used in the experiment. The variable dimension is set to one thousand.

Algorithm Performance Comparison of the SOA with Different Improvement Methods.

In this paper, the SOA is improved by six different methods: the parameter changing SOA (PCSOA), the parameter adaptive Gaussian transform SOA (PAGTSOA), the SOA based on the Chebyshev chaos of order three (CCSOA), the SOA based on real coding double-link (DSOA), the SOA based on complex-valued encoding (CSOA), and the complex-valued encoding multichain seeker optimization algorithm (CMSOA).
(1) Parameter Setting of the SOA with Different Improvement Methods. This section introduces the parameter settings of the improved SOAs used in the experiments. Dai et al. have done extensive research on the parameter settings of the SOA [33], and we carried out many practical tests and comparative studies of the parameters. The specific parameters of the improved SOAs are shown in Table 2. In the next section, we use these improved SOAs for experimental comparison and choose the relatively best improved algorithm to compare with other advanced intelligent algorithms.
(2) Improved Algorithm Performance Comparison on the Benchmark Functions. To test performance, each improved algorithm was run on the fifteen functions in Table 1; each algorithm and each function were run independently 30 times. The performance of the SOA and the six improved SOAs on the fifteen function optimizations was compared by the mean (Mean), standard deviation (Std.), best fitness (Best), program running time (Time), and best fitness rank (Rank) over the 30 runs. The best fitness reflects the optimization accuracy of the algorithm, the mean and standard deviation reflect its robustness, and the running time reflects its computational cost.
The results for functions f1-f15 are displayed in Table 3. Values in bold italics indicate the best results.
Based on Table 3, for the benchmark functions f1-f15, the comparison between the improved SOAs and the original SOA shows that the CMSOA achieves the best optimization results. The mean (Mean), standard deviation (Std.), best fitness (Best), and best fitness rank (Rank) of the CMSOA were the best over 30 independent runs. The total program running time of the CMSOA on f1-f15 ranks fifth among the seven algorithms compared, so its running time is longer than that of most of the other algorithms. From the perspective of optimization accuracy and robustness, the CMSOA has the best optimization performance among the improved SOAs in this paper. Section 4.2.3 compares the CMSOA with other intelligent optimization algorithms widely used today.
(3) Search History of the CMSOA. Figure 2 shows the graph of the optimized function f1, the convergence curves, the initial population's positions, and the search history; the search history of the seekers is marked with red + marks. Based on Figure 2, for the benchmark function f1, the convergence of the CMSOA is fast. From the search history, the seekers of the CMSOA move extensively towards promising regions of the search space, searching with changing step sizes and different directions over time; this increases the local search, escapes local optima, and avoids premature convergence.
Similarly, Figure 3 shows the graph of the optimized function f10, the convergence curves, the initial population's positions, and the search history; the search history of the seekers is marked with red + marks. Based on Figure 3, for the benchmark function f10, the convergence of the CMSOA is fast. From the search history, the seekers of the CMSOA move extensively towards promising regions of the search space, searching with changing step sizes and different directions over time; this increases the local search, escapes local optima, and avoids premature convergence.
Similarly, Figure 4 shows the graph of the optimized function f14, the convergence curves, the initial population's positions, and the search history; the search history of the seekers is marked with red + marks. Based on Figure 4, for the benchmark function f14, the convergence of the CMSOA is the fastest. From the search history, the seekers of the CMSOA move extensively towards promising regions of the search space, searching with changing step sizes and different directions over time; this increases the local search, escapes local optima, and avoids premature convergence.

The Algorithm Performance Comparison of Different Algorithms on the Benchmark Functions.

To test the performance of the CMSOA, it is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA using the fifteen benchmark functions [7, 10, 44-46] in Table 1, which have been widely used in testing.
(1) The Parameter Setting of Different Algorithms. In this section, the parameter settings of the PSO [47], SA_GA [48], GSA [6], SCA [8], MVO [9], SOA [29], and CMSOA are presented. According to references [6, 8, 23, 29, 47, 48], we performed many practical tests and comparative studies for the parameter settings. The parameters of the seven algorithms were set to suitable values based on practical experience. Table 4 lists the parameters used in the tests.

(2) The Result Comparison of Different Algorithms on the Benchmark Functions. This section uses the same fifteen functions as in Table 1, with the dimension of the variables expanded to 1000. The mean values, standard deviations, best fitness values, and best fitness ranks of the algorithms over 30 independent runs for functions f1-f15 are shown in Table 5. Values in bold italics indicate the best results.
Algorithm 1: The main process of the CMSOA.
(1) t ← 0.
(2) Initialization: generate the initial population based on formulas (15)-(17).
(3) Convert complex numbers into real numbers based on formulas (20) and (21).
(4) Determine the ranges of X_R_CMSOA,G, X_I_CMSOA,G, and X_CMSOA,G to satisfy the range of X.
(5) Evaluate each seeker and compute the fitness.
(6) While the stopping condition is not satisfied:
  (6.1) Run the CMSOA process:
    (6.1.1) Renew the real parts by formula (18) and the imaginary parts by formula (19).
    Determine the ranges of X_R_CMSOA,G, X_I_CMSOA,G, and X_CMSOA,G to satisfy the range of X.
    Apply the scout strategy to obtain the scout direction and scout step size.
    Convert complex numbers into real numbers according to formulas (20) and (21).
    Determine X_R_Random,G, X_I_Random,G, and X_Random,G to satisfy the range of X.

For the benchmark functions f1-f15, based on Table 5, except for f4, f7, f9, f10, f11, f14, and f15, the optimal value of the CMSOA is better than that of the other algorithms. For f9, the optimal value of the CMSOA reaches the theoretical best value, although it is inferior to that of the PSO and the GSA. For f7, the optimal value of the CMSOA is only worse than that of the PSO algorithm. The best fitness of the CMSOA for f4 is worse than that of the PSO, GSA, and SOA; for f15 it is only worse than that of the PSO; for f14 it is worse than that of the PSO and SOA; for f11 it is only worse than that of the SCA; and for f10 it is worse than that of the SOA and the MVO.
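Algorithm 1 can be sketched in Python as follows. This is a heavily simplified stand-in, not the paper's implementation: a random pull toward the best-so-far replaces the actual seeker direction/step updates of formulas (18) and (19), and all names (`cmsoa_sketch`, `chains`) are illustrative assumptions.

```python
import numpy as np

def cmsoa_sketch(objective, a, b, dim=5, pop=10, iters=50, seed=0):
    """Simplified sketch of Algorithm 1 (multichain bookkeeping only).

    Complex individuals supply the X_R / X_I chains, a real population
    supplies X_K, and three random chains are redrawn each generation.
    """
    rng = np.random.default_rng(seed)
    z = rng.uniform(a, b, (pop, dim)) + 1j * rng.uniform(a, b, (pop, dim))
    x_k = rng.uniform(a, b, (pop, dim))
    g_best, g_val = None, np.inf
    for _ in range(iters):
        chains = [np.clip(z.real, a, b), np.clip(z.imag, a, b),
                  np.clip(x_k, a, b),
                  rng.uniform(a, b, (pop, dim)),
                  rng.uniform(a, b, (pop, dim)),
                  rng.uniform(a, b, (pop, dim))]
        for c in chains:                       # compare the six chains
            vals = np.apply_along_axis(objective, 1, c)
            i = int(np.argmin(vals))
            if vals[i] < g_val:
                g_val, g_best = float(vals[i]), c[i].copy()
        # stand-in for the seeker update: random pull toward the best-so-far
        pull = rng.random((pop, dim))
        z = (np.clip(z.real + pull * (g_best - z.real), a, b)
             + 1j * np.clip(z.imag + pull * (g_best - z.imag), a, b))
        x_k = np.clip(x_k + pull * (g_best - x_k), a, b)
    return g_best, g_val
```

For instance, `cmsoa_sketch(lambda x: float(np.sum(x * x)), -5.0, 5.0)` returns the best solution found for the sphere function on [−5, 5]^5 and its objective value.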
For the benchmark functions f1-f15, based on Table 5, except for f2, f3, f4, f7, f9, f10, f11, f12, and f14, the standard deviation results of the CMSOA are better than those of the other algorithms. The standard deviation of the CMSOA for f9 is only worse than that of the PSO; for f7 it is worse than that of the PSO, GSA, and SOA; for f4 it is worse than that of the SCA, GSA, SA_GA, PSO, and MVO; for f3 it is worse than that of the MVO and SOA; and for f2 it is only worse than that of the SOA. For f2, the PSO, SA_GA, and SCA found no solution, and the GSA and MVO have an infinite standard deviation, so the CMSOA is better than these. For f14, the CMSOA is worse than the PSO and the GSA; for f11 and f12, the CMSOA is worse than the PSO, SA_GA, GSA, MVO, and SOA; and for f10, the standard deviation of the CMSOA is worse than that of the PSO, SA_GA, GSA, SCA, and MVO.
For the benchmark functions f1-f15, based on Table 5, except for f3, f4, f7, f9, f10, and f14, the mean values of the CMSOA are better than those of the other algorithms. For f9, the mean result of the CMSOA reaches the theoretical best value, although it is worse than that of the PSO. The mean of the CMSOA for f7 is worse than that of the PSO; for f4 it is worse than that of the PSO, GSA, and SOA; and for f3 it is only worse than that of the GSA and SOA. For f10, the CMSOA is worse than the MVO and SOA, and for f14, the CMSOA is only worse than the PSO.
According to the best fitness, mean rank, and overall rank results in Table 5, the CMSOA can find good solutions and has strong optimization ability and robustness on the benchmark functions.
(3) Convergence Curve Comparison of Algorithms on the Benchmark Functions. Figure 5 shows the convergence curves of the best fitness for the benchmark functions f1-f15 (D = 1000). As seen from Figure 5, compared with the other six algorithms, the convergence of the CMSOA is faster and its precision is better, except for f4, f7, f9, f10, f14, and f15.
Although the CMSOA for f9 is worse than the PSO in terms of convergence and precision, it reaches the theoretical best value. For f7, the CMSOA is only worse than the PSO, and for f4, the CMSOA is worse than the PSO, GSA, and SOA. For f15, the CMSOA is only worse than the SOA in terms of convergence and precision; for f14, the CMSOA is worse than the SOA and the PSO; and for f10, the CMSOA is worse than the MVO and SOA. Because the multichain strategy augments the individuals' diversity and local search intensity, the CMSOA has better optimization performance.

(4) ANOVA Test Comparison of Algorithms on the Benchmark Functions. Figure 6 shows the ANOVA of the global best values for the benchmark functions f1-f15 (D = 1000). As seen from Figure 6, the CMSOA is the most robust, except for f3, f4, f7, f10, f12, and f14. The ANOVA test results of the CMSOA for f7 are only worse than those of the PSO; for f4 they are worse than those of the PSO, GSA, and SOA; and for f3 they are only worse than those of the GSA and SOA. Therefore, the overall computational complexity of the CMSOA is almost the same as that of the basic SOA.

Run Time Comparison of Algorithms on the Benchmark Functions.

In this section, we recorded the running time of each algorithm under the same conditions: population size 30, 1000 generations, and 30 independent runs on the fifteen benchmark functions f1-f15 (D = 1000). The running times of the fifteen functions were then summed to obtain each algorithm's total time over 30 independent runs, together with the ranking of the total time, as shown in Table 6. As seen from Table 6, the PSO algorithm has the shortest program running time, followed by the SCA algorithm; the CMSOA ranks sixth, with a relatively long running time. At the bottom of the list is the SA_GA algorithm, which takes the longest.
To learn more about the running times of the seven algorithms on the fifteen functions, a bar chart (Figure 7) shows the total time of each algorithm over 30 independent runs. From Figure 7, the running time of the PSO is the shortest and that of the SA_GA is the longest; the running time of the CMSOA is less than half that of the SA_GA algorithm.
The exploration and exploitation of the algorithm are measured as follows:

Xpl% = (Div / Div_max) × 100,
Xpt% = (|Div − Div_max| / Div_max) × 100,

where median(x^j) is the median of dimension j over the whole swarm, x_i^j is dimension j of swarm individual i, n is the size of the swarm, Div_j is the average distance to the median over all individuals in dimension j, Div is the diversity of the swarm in an iteration, Div_max is the maximum diversity over all iterations, and Xpl% and Xpt% are the exploration and exploitation percentages for an iteration, respectively. Figure 8 shows the exploration and exploitation abilities of the CMSOA as the number of iterations increases on the benchmark functions f1-f15. As observed from the curves in Figure 8, the CMSOA maintains a good balance between the exploration and exploitation ratios as the number of iterations increases.
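The diversity measure and the two percentages can be sketched as follows, assuming the dimension-wise median diversity defined above (the helper names `diversity` and `xpl_xpt` are hypothetical):

```python
import numpy as np

def diversity(pop):
    """Dimension-wise diversity Div: mean |median_j - x_ij| averaged over j."""
    med = np.median(pop, axis=0)                  # median of each dimension j
    div_j = np.mean(np.abs(med - pop), axis=0)    # Div_j over the n individuals
    return float(np.mean(div_j))                  # Div for this iteration

def xpl_xpt(div_history):
    """Exploration (Xpl%) and exploitation (Xpt%) per iteration."""
    div = np.asarray(div_history, dtype=float)
    div_max = div.max()
    xpl = div / div_max * 100.0
    xpt = np.abs(div - div_max) / div_max * 100.0
    return xpl, xpt
```

At the iteration where diversity is maximal, Xpl% is 100 and Xpt% is 0; as the swarm contracts around the optimum, the two percentages trade off.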

Performance Profiles of Algorithms on the Benchmark Functions.

The average fitness was selected as the capability index.
The algorithmic capability is expressed in performance profiles, which are calculated by the following formulas:

r_f,g = μ_f,g / min{μ_f,g : g ∈ G},
ρ_g(τ) = (1 / n_f) · |{f ∈ F : r_f,g ≤ τ}|,

where g represents an algorithm, G is the algorithm set, f represents a function, F is the function set, n_g is the number of algorithms in the experiment, n_f is the number of functions in the experiment, μ_f,g is the average fitness after algorithm g solves function f, r_f,g is the capability ratio, ρ_g is the algorithmic capability, and τ is a factor of the best probability [53]. Figure 9 shows the capability ratios of the mean fitness for the seven algorithms on the benchmark functions f1-f15 (D = 1000). The results are displayed on a log2 scale. As shown in Figure 9, the CMSOA has the highest probability. When τ = 1, the CMSOA is about 0.6, which is better than the others. When τ = 4, the CMSOA is about 0.87, the PSO is 0.53, the SOA is 0.40, the GSA is 0.067, the MVO is 0.067, the SCA is 0.067, and the SA_GA is 0.067. When τ = 12, the CMSOA is 0.87, the PSO is 0.73, the SOA is 0.80, the GSA is 0.33, the MVO is 0.33, the SCA is 0.27, and the SA_GA is 0.2. The capability curve of the CMSOA lies above the others, and the CMSOA achieves about 0.87 when τ ≥ 4. Thus, the performance of the CMSOA is better than that of the other algorithms.
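The profile computation can be sketched as follows, assuming the Dolan-More-style ratio-to-best construction implied by the definitions above (the function name `performance_profile` is hypothetical):

```python
import numpy as np

def performance_profile(mu, taus):
    """Performance profiles from a mean-fitness table.

    mu:   array of shape (n_f, n_g); mu[f, g] = mean fitness of algorithm g
          on function f (smaller is better, assumed positive)
    taus: thresholds tau at which to evaluate rho_g(tau)
    Returns rho of shape (len(taus), n_g).
    """
    ratios = mu / mu.min(axis=1, keepdims=True)            # r_{f,g} >= 1
    return np.array([(ratios <= t).mean(axis=0) for t in taus])
```

ρ_g(τ) is the fraction of functions on which algorithm g is within a factor τ of the best algorithm, so a curve lying above the others (as the CMSOA's does in Figure 9) indicates the stronger method.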

Algorithm Performance Comparison in PID Controller Parameter Optimization Problems. In this section, we use four test control system models, optimizing their PID parameters, to test the capability of the CMSOA. For g1 ∼ g3, the population size of all algorithms is 20, the maximum number of iterations is 20, the step response time of g1 ∼ g2 is set to 10 s, and the step response time of g3 is set to 30 s. For g4, the population size of all algorithms is 50, the maximum number of iterations is 50, and the step response time is set to 50 s. Equations (28)–(31) show the test control system models whose PID parameters are optimized in our experiment. Figure 10 shows the process diagram for optimizing the test control system PID parameters by the CMSOA. Figure 11 shows the PID parameter optimization model structure of the test control system.
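The fitness evaluated in this process is a performance index of the closed-loop step response. As a hedged sketch of such an evaluation, the loop can be simulated as below with a hypothetical second-order plant G(s) = 1/(s² + 2s + 1) and an ITAE index; the paper's actual models g1 ∼ g4 are those of (28)–(31), and its exact performance index may differ.

```python
def pid_step_fitness(kp, ki, kd, t_end=10.0, dt=0.01):
    """ITAE of the unit-step response of a PID loop (forward-Euler sketch).

    Hypothetical plant: y'' + 2 y' + y = u, i.e. G(s) = 1/(s^2 + 2s + 1).
    Smaller ITAE means a faster, better-damped response.
    """
    y = dy = 0.0              # plant output and its derivative
    integ = prev_e = 0.0      # integrator state, previous error
    itae = 0.0
    for k in range(int(t_end / dt)):
        t = k * dt
        e = 1.0 - y                               # error against unit step
        integ += e * dt
        deriv = (e - prev_e) / dt if k else 0.0   # avoid derivative kick at t = 0
        u = kp * e + ki * integ + kd * deriv      # PID control law
        prev_e = e
        ddy = u - 2.0 * dy - y                    # plant dynamics
        dy += ddy * dt
        y += dy * dt
        itae += t * abs(e) * dt                   # time-weighted absolute error
    return itae
```

A seeker's position would encode (kp, ki, kd), and the optimizer minimizes this function over the step response time.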

Result Comparison of Algorithms in PID Controller Parameter Optimization. To test the capability of the CMSOA, it is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the results are given in Table 7. The values in bold and italics indicate the best results.
For the PID controller parameter optimization problems, according to Table 7, except for g3 and g4, the CMSOA achieves the best fitness. The best fitness value of the CMSOA for the g3 model is only worse than that of the SA_GA algorithm; the best fitness value of the CMSOA for the g4 model is only worse than that of the PSO algorithm. Except for g2 and g3, in terms of standard deviation, the CMSOA is better than the others, and on those two models it is only worse than the SA_GA. In terms of mean, the CMSOA is better than the others. According to the best fitness value, mean rank, and overall rank results from Table 7, the CMSOA can find solutions and has very strong robustness for the PID controller parameter optimization problems.

The Convergence Curve Comparison of Algorithms in PID Controller Parameter Optimization. Figure 12 shows the fitness curves of PID controller parameter optimization for g1 ∼ g4. As shown in Figure 12, when the CMSOA is compared with the other six algorithms, its convergence is fast and its precision is the best; the CMSOA can find the optimal value. Figure 13 shows the ANOVA of the global best values of PID controller parameter optimization for g1 ∼ g4. As seen from Figure 13, the CMSOA is the most robust compared to the other algorithms.

The Unit Step Responses of PID Controller Parameter Optimization. Figure 14 shows the unit step responses of PID controller parameter optimization for g1 ∼ g4. As seen from Figure 14, when the CMSOA optimizes the PID controller parameters of the unit step models for g1 ∼ g4, the unit step responses stabilize very quickly and accurately. Therefore, the CMSOA is an effective and feasible PID parameter optimization solution for the control system models.

Algorithm Performance Comparison in Constrained Engineering Optimization Problems. We use six engineering problems to further test the capability of the CMSOA. These engineering problems are very popular in the literature. The penalty function method is used to handle the constraints. The parameter settings for all of the heuristic algorithms still follow Table 4 of Section 4.2.3. The formulations of these problems are available in the Appendix.
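The constrained problems below thus reduce to unconstrained fitness evaluation through a penalty term. The paper does not spell out its exact penalty form, so the following is a generic static-penalty sketch with an assumed coefficient:

```python
def penalized_fitness(objective, constraints, x, r=1.0e6):
    """Static penalty method for constraints of the form g_i(x) <= 0.

    objective: callable returning the raw objective value at x.
    constraints: iterable of callables g_i; a positive g_i(x) is a violation.
    r: penalty coefficient (an assumed value; any sufficiently large
       constant works for this sketch).
    """
    violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
    return objective(x) + r * violation
```

Feasible points keep their true objective value, while infeasible ones are pushed up by r times the squared violation, so the minimizer is steered back toward the feasible region.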

Welded Beam Design Problem.
This is a fabrication cost minimization problem, which has four parameters and seven constraints. The parameters of the structural system are shown in Figure 15 [9]. Some related works come from the literature: GSA [6], MFO [7], MVO [9], coevolutionary particle swarm optimization (CPSO) [54], and harmony search (HS) [55]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA, and the best-obtained values are provided in Table 8.
In Table 8, the CMSOA is better than the GSA, MFO, MVO, GA, CPSO, and HS algorithms. The CMSOA is also better than the PSO, SA_GA, GSA, SCA, MVO, and SOA. Therefore, the CMSOA is an effective and feasible solution to the problem.

Pressure Vessel Design Problem.
This is also a fabrication cost minimization problem, with four parameters and four constraints. The parameters of the structural system are shown in Figure 16 [9]. Some related works come from the literature: the MFO [7], the evolution strategies (ES) [56], the differential evolution (DE) [57], the ant colony optimization (ACO) [58], and the GA [59]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA, and the best-obtained values are provided in Table 9.
In Table 9, the CMSOA is better than the MFO, ES, DE, ACO, and GA. The CMSOA is also better than the PSO, SA_GA, GSA, SCA, MVO, and SOA. Therefore, the CMSOA is an effective and feasible solution to the problem.
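For reference, the commonly used formulation of this problem in the literature minimizes the material and welding cost of the vessel; a sketch of the objective and constraints, as usually stated (the paper's own formulation is in the Appendix), is:

```python
import math

def pressure_vessel(x):
    """Standard pressure vessel design problem (as commonly formulated).

    x = [Ts, Th, R, L]: shell thickness, head thickness,
    inner radius, and cylindrical length.
    Returns (cost, constraints); feasible when every g_i <= 0.
    """
    x1, x2, x3, x4 = x
    cost = (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3 ** 2
            + 3.1661 * x1 ** 2 * x4 + 19.84 * x1 ** 2 * x3)
    g = [
        -x1 + 0.0193 * x3,                 # shell thickness vs. radius
        -x2 + 0.00954 * x3,                # head thickness vs. radius
        -math.pi * x3 ** 2 * x4
        - (4.0 / 3.0) * math.pi * x3 ** 3
        + 1296000.0,                       # minimum enclosed volume
        x4 - 240.0,                        # maximum length
    ]
    return cost, g
```

Combined with a penalty method, this becomes a single fitness function; well-known near-optimal designs in the literature cost around 6059.7.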

Cantilever Beam Design Problem.
This is a problem with five parameters whose only constraints are the ranges of the variables. The parameters of the structural system are shown in Figure 17 [7]. Some related works come from the literature: the MFO [7], the cuckoo search algorithm (CS) [60], the generalized convex approximation (GCA) [61], the method of moving asymptotes (MMA) [61], and the symbiotic organism search (SOS) [62]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA, and the best-obtained values are provided in Table 10.
In Table 10, the CMSOA is better than the MFO, CS, GCA, MMA, and SOS. The CMSOA is also better than the PSO, SA_GA, GSA, SCA, MVO, and SOA. Therefore, the CMSOA is an effective and feasible solution to the problem.

Gear Train Design Problem.
This is a gear ratio minimization problem, which has four variables constrained only by their ranges. Figure 18 shows the schematic diagram [63]. Some related works come from the literature: the MFO [7], the MVO [9], the CS [60], the artificial bee colony (ABC) [64], and the mine blast algorithm (MBA) [64]. In this paper, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA, and the best-obtained values are provided in Table 11.
In Table 11, the CMSOA proves to be better than the MFO, MVO, CS, ABC, and MBA. Except for the SA_GA, GSA, and PSO, the CMSOA is better than the SCA, MVO, and SOA. The optimal fitness value of the CMSOA reaches the theoretical best value, although it is worse than those of the SA_GA, GSA, and PSO.
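The standard objective for this problem, as used across the cited works, is the squared gap between the required gear ratio 1/6.931 and the ratio produced by the four integer tooth counts:

```python
def gear_train(x):
    """Gear train design objective (standard formulation).

    x = [nA, nB, nD, nF]: integer tooth counts, each in [12, 60].
    Minimizes the squared error between the target ratio 1/6.931
    and the achieved ratio nB * nD / (nA * nF).
    """
    n_a, n_b, n_d, n_f = x
    return (1.0 / 6.931 - (n_b * n_d) / (n_a * n_f)) ** 2
```

Since the variables are integers, a real-valued seeker position is typically rounded before evaluation; the best known design (49, 16, 19, 43) reaches an objective of roughly 2.7e-12.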

Three-Bar Truss Design Problem.
This is a weight minimization problem under stress constraints, which has two variables within bounded ranges. The schematic diagram of the components [63] is shown in Figure 19 [9].
Some related works come from the literature: MFO [7], MVO [9], CS [60], MBA [64], and differential evolution with dynamic stochastic selection (DEDSS) [65]. In this paper, the problem is solved by the CMSOA. The CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA, and the best-obtained values are provided in Table 12.
In Table 12, except for the MVO and PSO, the CMSOA is better than the others. The optimal fitness value of the CMSOA reaches the theoretical best value, although it is worse than those of the MVO and the PSO. Therefore, the CMSOA can solve the problem.

I-Beam Design Problem.
This is a vertical deflection minimization problem that has four variables and one constraint. Figure 20 shows the design diagram [7]. Some related works come from the literature: MFO [7], CS [60], SOS [62], the adaptive response surface method (ARSM) [66], and the improved adaptive response surface method (IARSM) [66]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA, and the best-obtained values are provided in Table 13.
In Table 13, except for the MFO, GSA, SOA, and SA_GA, the CMSOA is better than the others. The fitness of the MFO is the best. Although the minimum vertical deflection of the CMSOA is not as good as those of the GSA, SOA, and SA_GA, it is very close to the other relative optimal values. Therefore, the CMSOA is an effective and feasible solution to the I-beam design optimization problem.
In brief, the CMSOA proves to be better than the other algorithms in most practical studies. The CMSOA can solve these practical problems.

Conclusion
This article presents the CMSOA, which combines a complex-valued encoding method and a multichain strategy. The CMSOA is tested in four stages from different perspectives: benchmark functions, PID control parameters, and constrained engineering problems. Besides, the CMSOA was compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA. In the first phase, the SOA is improved in six different ways: the parameter-changing SOA (PCSOA), the parameter adaptive Gaussian transform SOA (PAGTSOA), the SOA based on the Chebyshev chaos of order three (CCSOA), the SOA based on real-coded double chains (DSOA), the SOA based on complex-valued encoding (CSOA), and the complex-valued encoding multichain seeker optimization algorithm (CMSOA). Each improved algorithm was run on the fifteen functions. The result is that the CMSOA is feasible on the benchmark functions. In this phase, we consider, over 30 independent runs, the CMSOA mean values, standard deviation values, best fitness values, best fitness value ranks, the convergence curves of functions f1, f10, and f14, and the population's position search history on functions f1, f10, and f14.
In the second phase, fifteen benchmark function optimization problems are used to further test the CMSOA. The CMSOA is compared to the PSO, SA_GA, GSA, SCA, MVO, and SOA for verification. The CMSOA is feasible on the benchmark functions. The second phase also considers, over 30 independent runs, the CMSOA mean values, standard deviation values, best fitness values, best fitness value ranks, convergence curves, and the variance tests for the global minimum values. In the benchmark function optimization problems, the optimal solution curves obtained by the CMSOA are in good agreement with the theoretical optimal solution curves, and the accuracy of the CMSOA is better. The ANOVA of the global best values on the benchmark functions is studied, and the CMSOA is the most robust algorithm. Based on the complexity analysis, the CMSOA is shown to be an efficient algorithm. Based on the run time comparison of the seven algorithms on the benchmark functions, the CMSOA requires relatively more running time and is not optimal in this respect. The exploration and exploitation abilities of the CMSOA on the benchmark functions are studied, and the CMSOA maintains a good balance between them as the number of iterations increases. From the results of the performance ratios of the average solution for the seven algorithms, the optimization probability of the CMSOA is the highest.
In the third test phase, four PID control parameter optimization problems are used to further test the CMSOA in practice. The problems were a parameter optimization model of a second-order PID controller without time delay, a parameter optimization model of a PID controller with first-order micro-delay, a parameter optimization model of a first-order PID controller with significant time delay, and a parameter optimization model of a high-order PID controller without time delay. The third test phase also considered, over 30 independent runs, the CMSOA mean values, standard deviation values, best fitness values, and best fitness value ranks, the convergence curves, and the ANOVA. From the results of the PID parameter optimization problems, compared with the other six algorithms, the CMSOA is effective and feasible on the practical problems.
Finally, in the last test phase, six engineering problems further tested the CMSOA. The CMSOA was compared with various algorithms. The results show that the CMSOA is the most competitive algorithm for the practical optimization problems.
According to the comparative analysis of the experiments, the conclusion is as follows: (i) we use the complex-valued encoding and the multichain strategy for each seeker to enlarge the search region and avoid convergence to local optima.

Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.