An Elastic Collision Seeker Optimization Algorithm for Constrained Engineering Optimization Problems

To improve the seeker optimization algorithm (SOA), an elastic collision seeker optimization algorithm (ECSOA) is proposed. The ECSOA evolves some individuals in three situations: completely elastic collision, completely inelastic collision, and non-completely elastic collision. These strategies enhance the individuals' diversity and avert falling into local optima. The ECSOA is compared with the particle swarm optimization (PSO), the simulated annealing and genetic algorithm (SA_GA), the gravitational search algorithm (GSA), the sine cosine algorithm (SCA), the multiverse optimizer (MVO), and the seeker optimization algorithm (SOA); fifteen benchmark functions, four PID control parameter models, and six constrained engineering optimization problems were selected for the experiments. According to the experimental results, the ECSOA is effective on the benchmark functions, the PID control parameter optimization, and the constrained engineering optimization problems, and it shows better optimization ability and robustness.

However, some optimization algorithms are still not very successful on optimization problems, suffering from low optimization precision, premature convergence, entrapment in local optima, slow convergence speed, and insufficient robustness. To better overcome these issues, a number of improved algorithms have proven to be feasible optimization algorithms and have been used in practical engineering. For instance, the Harris hawks optimization algorithm, salp swarm algorithm, grasshopper optimization algorithm, and dragonfly algorithm are used for the structural design optimization of vehicle components [18]. An adaptive inertia weight factor in the traditional PSO optimizes path planning [19]. A PSO based on Gaussian and quantum behavior optimizes constrained engineering problems [20]. Least squares support vector machines based on Gaussian functions are proposed [21]. A Levy flight discrete bat algorithm is adopted to solve the Euclidean traveling salesman problem [22]. The cuckoo optimization algorithm is used in reverse logistics to design a network for COVID-19 waste management [23]. A chaotic cuckoo optimization algorithm based on Levy flight, backward learning, and an interference operator is used to classify the optimal feature subspace [24]. An elite symbiotic organisms search algorithm with a mutually beneficial factor is adopted to optimize functions [25]. An artificial bee colony with dynamic Cauchy mutation is adopted for feature selection [26]. A new elastic collision optimization algorithm is applied to sensor cloud resource scheduling [27]. Dai et al. proposed the SOA in 2006 [28]; its goal is to mimic seekers' behavior and the way they exchange information so as to solve practical optimization problems.
In the past decade, the SOA has been used in many fields, such as unconstrained optimization problems [29], optimal reactive power dispatch [30], challenging sets of benchmark problems [31], digital filter design [32], optimizing the parameters of artificial neural networks [33], optimizing the model and structure of fuel cells [34], a novel human group optimizer algorithm [35], and several practical applications [36]. In the initial stage of dealing with optimization problems, the SOA converges faster than other algorithms. However, when all individuals are near the best individual, the population loses diversity and falls into premature convergence.
In this article, we propose an elastic collision seeker optimization algorithm (ECSOA), which evolves some individuals in three situations: completely elastic collision, completely inelastic collision, and non-completely elastic collision. These strategies enhance the individuals' diversity and avert premature convergence. The ECSOA is compared to seven improved SOAs, based on changing the algorithm parameters, the adaptive transformation of empirical value parameters, the Levy motion of some individuals, reverse learning, the addition of a mutually beneficial factor, and the Cauchy mutation. This article chose fifteen benchmark functions for the test. According to the experimental results, the convergence speed and accuracy of the ECSOA are higher. The improved strategy enables the SOA to maintain the individuals' diversity, avert falling into local optima, and make up for the SOA's tendency toward prematurity. Finally, compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA, the ECSOA has been implemented and tested on a complete set of fifteen well-known benchmark functions, four PID control parameter optimization models, and six constrained engineering optimization problems taken from the literature. According to the experimental results, the ECSOA is feasible on the benchmark functions, the PID parameter optimization problems, and the constrained engineering optimization problems; it finds better values for these problems and successfully overcomes the SOA's tendency to converge prematurely to local optima. The ECSOA has better optimization performance and robustness and is an improvement over the original SOA. The advantages of the ECSOA are summed up as follows: (1) An ECSOA is proposed to enhance the precision and robustness of the optimization process.
(2) The elastic collision strategies, namely, the completely elastic collision, the completely inelastic collision, and the non-completely elastic collision, can improve the diversity of individuals, enhance local search, and avert premature convergence.
The rest of the article is structured as follows. Section 2 presents the SOA and the algorithm improvement strategies. Section 3 describes the ECSOA. Section 4 shows the algorithm optimization experiments, results, and analyses. Lastly, Section 5 gives some conclusions.

Basic SOA and Algorithm Improvement Strategies

The SOA carries out an in-depth search mimicking human search behavior. It treats optimization as a search for an optimal solution by a search team in the search space, taking the search team as the population and the site of each searcher as a candidate solution. Using an "experience gradient" to determine the search direction and uncertain reasoning to determine the search step size, the searchers' positions in the search space are updated through the search direction and step size to attain the optimal solution.

Key Update Points for SOA.
The SOA has three main updating steps.

Search Direction.
The forward orientation of the search is defined by the experience gradient obtained from the individuals' own movement and the evaluation of other individuals' historical search positions.
The egoistic direction f_{i,e}(t), altruistic direction f_{i,a}(t), and preemptive direction f_{i,p}(t) of the ith individual in any dimension can be obtained.
(1) The searcher uses the method of a random weighted average to obtain the search direction, where g_{i,best} is the historical optimal position in the neighborhood of the ith searcher; p_{i,best} is the historical optimal position of the ith searcher itself; ψ1 and ψ2 are random numbers in [0, 1]; and ω is the inertia weight.

Search Step Size.
The SOA draws on the approximation ability of fuzzy reasoning. Through computer language, the SOA describes some of the natural human language that can simulate human intelligent reasoning in search behavior. If the algorithm expresses a simple fuzzy rule, it adapts to the best approximation of the objective optimization problem. A greater fitness corresponds to a greater search step size, while a smaller fitness corresponds to a smaller search step size.
The Gaussian distribution function is adopted to describe the search step size, where α and δ are parameters of the membership function. According to (3), the probability of the output variable exceeding [−3δ, 3δ] is less than 0.0111; therefore, µ_min = 0.0111. Under normal circumstances, the optimal position of an individual has µ_max = 1.0, and the worst has 0.0111. However, to accelerate the convergence speed and give the optimal individual an uncertain step size, µ_max is set to 0.9 in this paper. The following function is selected as the fuzzy variable with a "small" target function value, where µ_ij is determined by (4) and (5), I_i is the rank of the current individual x_i(t) when the sequence is sorted from high to low by function value, and rand(µ_i, 1) is a random real number in the interval [µ_i, 1]. It can be seen that (4) simulates the random search behavior of human beings.
The step size in the jth dimension of the search space is determined by the following equation, where δ_ij is a parameter of the Gaussian distribution function, defined by the subsequent equation, in which ω is the inertia weight; as the number of generations increases, ω decreases linearly from 0.9 to 0.1. x_min and x_max are, respectively, the variables with the minimum and maximum function values.
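The fuzzy step-size rule above can be sketched in Python for illustration. This is a minimal sketch under stated assumptions: a linear membership schedule over fitness ranks, a fixed inertia weight ω, and inversion of the Gaussian membership function; the paper's exact Eqs. (4)-(7) may differ in detail.

```python
import numpy as np

def soa_step_size(fitness, x, mu_max=0.9, mu_min=0.0111, omega=0.5):
    """Sketch of the SOA fuzzy step-size rule (assumed forms of Eqs. (4)-(7)).

    fitness: shape (S,)   -- objective values (minimization)
    x:       shape (S, D) -- current positions
    """
    S, D = x.shape
    # Rank individuals from best (smallest fitness) to worst.
    order = np.argsort(fitness)
    rank = np.empty(S, dtype=int)
    rank[order] = np.arange(S)                 # I_i: 0 = best, S-1 = worst
    # Linear membership: the best gets mu_max, the worst gets mu_min.
    mu = mu_max - rank / (S - 1) * (mu_max - mu_min)
    # Randomize within [mu_i, 1] to mimic rand(mu_i, 1).
    mu_ij = np.random.uniform(mu[:, None], 1.0, size=(S, D))
    # delta_ij from the spread between the best and worst individuals.
    delta = omega * np.abs(x[order[0]] - x[order[-1]])      # shape (D,)
    # Invert the Gaussian membership mu = exp(-a^2/(2 d^2)) => a = d*sqrt(-2 ln mu).
    return delta * np.sqrt(-2.0 * np.log(mu_ij))
```

Better-ranked individuals receive larger membership values and hence, on average, smaller random steps, which matches the fitness-step relationship described above.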

Individual Location Updates.
After obtaining the search direction and search step size of the individual, the location update is represented by (8).
Here, i is the ith searcher individual; j represents the individual dimension; f_ij(t) and α_ij(t), respectively, represent the searcher's search direction and search step size at time t; and x_ij(t) and x_ij(t + 1), respectively, represent the searcher's position at time t and t + 1.
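The location update of Eq. (8) can be written compactly as follows. The boundary clamping is an assumption (a common choice), since the paper's boundary handling is not specified in this excerpt.

```python
import numpy as np

def update_positions(x, direction, step, x_min, x_max):
    """Sketch of Eq. (8): x_ij(t+1) = x_ij(t) + f_ij(t) * alpha_ij(t).
    Positions leaving the search interval are clamped back inside."""
    return np.clip(x + direction * step, x_min, x_max)
```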

Algorithm Improvement Strategies.
Six strategies for improving the algorithm are listed in this paper.

Dynamic Adaptive Gaussian Variation of Empirical Parameters.

In the SOA, (8) is changed to (10), and the empirical value C1 is changed to an adaptive empirical value that varies between 0.1 and 0.5 with the optimization generation according to (11). The individual position update is still the same as (9).
Here, i represents the ith individual, j represents the individual dimension, δ_ij is a parameter of the Gaussian membership function [20,21], t is the current generation, iter_max represents the maximum number of generations, and d represents the dimension of the optimized object.
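The adaptive empirical value can be sketched as follows. Since Eq. (11) itself is not reproduced in this excerpt, a linear schedule between 0.1 and 0.5 is assumed here purely for illustration; the paper's actual curve may differ.

```python
def adaptive_c1(t, iter_max, c_min=0.1, c_max=0.5):
    """Adaptive empirical value C1 in [0.1, 0.5] as a function of the
    generation t (linear schedule assumed; Eq. (11) may use another form)."""
    return c_min + (c_max - c_min) * t / iter_max
```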

The Levy Movement.
A Levy movement [22,24] is a random search path alternating between short walks and occasionally long walks following the Levy distribution. The position update equation of the Levy motion is as follows, where i represents the ith individual and j represents the individual dimension; Γ(β) = (β − 1)!; t is the current generation; d is the dimension of the optimized object; r1, r2 ∈ rand(0, 1); and β is a real constant, set to 1.5 in this paper. After judging whether the fitness value is good or bad based on the newly generated individual position vector in (14), the original individual is replaced by the better one.

The Refraction Reverse Learning. The refraction reverse learning is based on the refraction principle [23,24]. The value of the boundary point in the refraction reverse learning is (a + b)/2 of the search interval [a, b]. As shown in Figure 1, the calculation of sin α and sin β is shown in (13) and (14).
When applied to the SOA, the probability of mutation is 0.8. The individual positions undergo refraction reverse learning according to (19) to obtain new individual positions. In the formula, i is the ith individual, and j is the individual dimension. After judging whether the fitness value is good or bad based on the newly generated individual position vector in (19), the original individual is replaced by the better one.
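A Levy-distributed step with β = 1.5, as used in the Levy movement above, is commonly realized with Mantegna's algorithm. The following is an illustrative sketch of that standard construction; the paper's exact update equations may differ.

```python
import math
import random

def levy_step(beta=1.5):
    """One Levy-distributed step via Mantegna's algorithm, a standard way
    to realize Levy flights with stability exponent beta."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0.0, sigma)   # numerator: N(0, sigma^2)
    v = random.gauss(0.0, 1.0)     # denominator: N(0, 1)
    return u / abs(v) ** (1 / beta)
```

Most draws are small, but the heavy tail occasionally produces very long jumps, which is what lets a Levy-mutated individual escape a local optimum.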

The Mutually Beneficial Factor.
An individual x_h is randomly selected, and x_m is determined by (20) to determine the mutually beneficial factor C [25].
Here, i represents the ith individual, j represents the individual dimension, ψ represents a random number in (0, 1), x_gbest represents the j-dimensional component of the current optimal position of the entire population, C is the mutual benefit factor, and R is the benefit parameter, randomly selected as 1 or 2. After judging whether the fitness value is good or bad based on the newly generated individual position vector in (21), the original individual is replaced by the better one.
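The update can be sketched in the spirit of the symbiotic organisms search idea that [25] builds on. This is an assumed form: the mutual vector x_m as the midpoint of x_i and x_h, and the benefit parameter R chosen from {1, 2}; the paper's Eqs. (20)-(21) may differ in detail.

```python
import random

def mutual_benefit_update(x_i, x_h, x_gbest):
    """Sketch of the mutually beneficial factor update (assumed form)."""
    x_m = [(a + b) / 2 for a, b in zip(x_i, x_h)]   # mutual vector (Eq. (20), assumed)
    R = random.choice((1, 2))                       # benefit parameter: 1 or 2
    psi = random.random()                           # random number in (0, 1)
    # Move toward the global best, adjusted by the benefit-weighted mutual vector.
    return [xi + psi * (g - R * m) for xi, g, m in zip(x_i, x_gbest, x_m)]
```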

2.2.5. The Cauchy Variation.

In this paper, the inverse Cauchy function can mutate the population with a certain probability.
The Cauchy inverse function [26] is shown in (22). Referring to (22), we can write the new position of the individual as (23); that is, the new position of the individual is obtained by the Cauchy mutation. After judging whether the fitness value is good or bad based on the newly generated individual position vector in (23), the original individual is replaced by the better one.
where F^(-1) is the Cauchy inverse function and r1 and r2 are random values within [0, 1].
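The mutation can be sketched through the standard inverse Cauchy CDF. The CDF inverse below is the textbook form; treating Eq. (23) as a per-component Cauchy perturbation is an assumption made for illustration.

```python
import math
import random

def cauchy_inverse(p, x0=0.0, gamma=1.0):
    """Inverse of the Cauchy CDF (the assumed form of Eq. (22))."""
    return x0 + gamma * math.tan(math.pi * (p - 0.5))

def cauchy_mutate(x):
    """Assumed form of the Cauchy mutation (Eq. (23)): each component is
    perturbed by a Cauchy-distributed value drawn through the inverse CDF."""
    return [xi + cauchy_inverse(random.random()) for xi in x]
```

Like the Levy step, the Cauchy distribution is heavy-tailed, so mutated individuals occasionally make large jumps out of local optima.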

Elastic Collision Variation.
For the individual x_ij (x_ij is a solution distributed in the solution space of the optimization problem and can be abstractly represented as a unit-mass object at a certain position in the space), let δ = {x′} (x′ ∈ P(t) ∧ x′ ≠ x_ij); x_ij and x′ move toward each other at the velocities f(x_ij) and f(x′), respectively, and collide after ∆t; after ∆t, x_ij reaches the new position x_ij,new. The derivation is as follows. For the completely elastic (CE) collision, according to the laws of conservation of momentum and energy [27],

Mathematical Problems in Engineering
Similarly, the relations for the completely inelastic (CI) collisions and the non-completely elastic (NCE) collisions can be obtained. The individual updating mechanism is as follows, where i represents the ith individual, j represents the individual dimension, ε ∈ (0, 1) and ξ ∈ (0, 0.5) are update coefficients, G_ij(t) is the historical optimal solution of x_ij(t), and B_ij(t) is the optimal solution of the population. r1 ∈ (0, 1), r2 ∈ (0, 1), and α ∈ (0, 0.5) are random numbers. After judging whether the fitness value is good or bad based on the newly generated individual position vector in (27), the original individual is replaced by the better one.
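For equal (unit) masses, the three collision modes have simple closed forms under momentum conservation. The following sketch shows only the velocity exchange; how these velocities enter the position update of Eq. (27) is not reproduced here, and the restitution model for the NCE case is an assumption.

```python
def collide(v1, v2, kind="CE", e=0.5):
    """Post-collision velocities of two unit-mass bodies [27].

    CE  : completely elastic     -> equal masses exchange velocities
    CI  : completely inelastic   -> the bodies move on with one shared velocity
    NCE : non-completely elastic -> restitution coefficient e in (0, 1)
    """
    if kind == "CE":
        return v2, v1
    if kind == "CI":
        v = (v1 + v2) / 2.0
        return v, v
    # NCE: standard restitution model; e = 1 recovers CE, e = 0 recovers CI.
    v1n = ((1 - e) * v1 + (1 + e) * v2) / 2.0
    v2n = ((1 + e) * v1 + (1 - e) * v2) / 2.0
    return v1n, v2n
```

Total momentum v1 + v2 is preserved in all three modes, while kinetic energy is fully preserved only in the CE case; this is the physical picture behind the three ECSOA variation strategies.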

ECSOA
The ECSOA evolves some individuals via the CE collision, the CI collision, and the NCE collision to improve the diversity of individuals and strengthen local search. Algorithm 1 shows the primary process of the ECSOA.

Experimental Setup.
The algorithms used in the experiments in this paper were run under MATLAB R2016a. The computer is configured with an Intel® Core™ i7-7500U CPU @ 2.70 GHz (2.90 GHz) processor, 8 GB of memory, and the Windows 10 operating system.

Algorithm Performance Comparison on Benchmark Functions.

To ensure a fair comparison of these algorithms, the population number of each algorithm is 30, and the number of generations is 1000. At the same time, to further ensure the fairness of the comparison and reduce the effect of randomness, the results of the seven algorithms over 30 independent runs were selected for comparison.

Benchmark Functions.
In this field, it is common to evaluate the capability of algorithms on mathematical functions whose global optima are known. Fifteen benchmark functions from the literature are used as the comparative test platform [7,10,37-39]. Table 1 shows the functions used in the experiment. The dimension of the variables is set to one hundred.

Performance Comparison of SOAs with Different Improvement Methods.

In this paper, the SOA is improved by seven different methods: the parameter-changing SOA (PCSOA), the parameter adaptive Gaussian transform SOA (PAGTSOA), the SOA based on Levy variation (LVSOA), the SOA based on the refraction reverse learning mechanism (RRLSOA), the SOA based on the mutually beneficial factor strategy (MBFSOA), the SOA based on Cauchy variation (CVSOA), and the elastic collision seeker optimization algorithm (ECSOA).
(1) Parameter Setting of the SOAs with Different Improvement Methods. This section introduces the parameter settings of the improved SOAs used in the experiments in this paper. Dai et al. have done a lot of research on the parameter settings of the SOA [32], and we performed many practical tests and comparative studies on the parameters. The specific parameters of the improved SOAs are shown in Table 2. In the next section, we use these improved algorithms for experimental comparison and choose the relatively best improved algorithm to compare with other advanced intelligent algorithms.
(2) Improved Algorithms' Performance Comparison on the Benchmark Functions. The SOA is improved in seven different ways: the SOA based on parameter change (PCSOA), the SOA based on parameter adaptive Gaussian transform (PAGTSOA), the SOA based on Levy variation (LVSOA), the SOA based on the refraction reverse learning mechanism (RRLSOA), the SOA based on the mutual benefit factor strategy (MBFSOA), the SOA based on Cauchy variation (CVSOA), and the SOA based on elastic collision (ECSOA). To test the performance, each improved algorithm was run on the fifteen functions in Table 1. Each algorithm on each function was run independently 30 times. The performance of the SOA and the seven improved SOAs on the fifteen functions was compared in terms of the mean (Mean), standard deviation (Std.), best fitness (Best), program running time (Time), and best fitness rank (Rank) of the 30 runs. The best fitness reflects the optimization accuracy of the algorithm, the mean and standard deviation reflect the robustness of the algorithm, and the running time reflects the cost of the program. The results for functions f1-f15 are displayed in Table 3. Boldface indicates the better optimal result.
Based on Table 3, for the benchmark functions f1-f15, the comparison between the seven improved SOAs and the original SOA shows that the optimization result of the ECSOA is the best. The mean (Mean), standard deviation (Std.), best fitness (Best), and best fitness rank (Rank) of the ECSOA were the best after 30 independent runs. The total f1-f15 program running time (Time) of the ECSOA ranks fourth among the eight algorithms compared in this paper: the running time of the ECSOA is longer than that of the SOA, PCSOA, and PAGTSOA and shorter than that of the LVSOA, RRLSOA, MBFSOA, and CVSOA. From the perspective of optimization accuracy and robustness, the ECSOA has the best optimization performance among the improved SOAs in this paper. Section 4.2.3 compares the ECSOA with the other intelligent optimization algorithms that are widely used at present.

Performance Comparison of Different Algorithms in Benchmark Functions.
To test the performance of the ECSOA, it is compared to the PSO, SA_GA, GSA, SCA, MVO, and SOA, using the fifteen benchmark functions [7,10,37-39] in Table 1, which have been widely used in tests.
(2) The Results Comparison of Different Algorithms on the Benchmark Functions. The mean values, standard deviations, best fitness values, and best fitness ranks of the algorithms over 30 independent runs and the optimization results for functions f1-f15 are shown in Table 5. Boldface indicates the better optimal result.
Based on Table 5, for the best values, the standard deviations, and the means on the benchmark functions, the ECSOA is better than the others. According to the optimal fitness in Table 5, the ECSOA has strong optimization ability and strong robustness on the benchmark functions. Figure 2 shows the fitness curves of the best values for the benchmark functions f1-f15 (D = 100). As seen from Figure 2, the ECSOA converges faster and attains better precision. Figure 3 shows the ANOVA tests for the benchmark functions f1-f15 (D = 100). As seen from Figure 3, the ECSOA shows better robustness than the other algorithms and the improved SOAs. Therefore, the ECSOA is a feasible solution for the optimization of benchmark functions.

Algorithm 1: The primary process of the ECSOA.
(1) t = 0.
(2) Parameter initialization.
(3) Population initialization: generate an initial population.
(4) Evaluate each seeker: compute the fitness and determine the optimal solution P_best,G.
(5) While the stopping condition is not satisfied:
(5.1) Running process of the ECSOA:
  (1) The search direction of each searcher is generated according to (2).
  (2) The search step size is generated according to (6).
  (3) Generate a new position x_ECSOA,G according to (9); the range of x_ECSOA,G is checked and modified to lie within (x_min, x_max).
  (4) Calculate the fitness and determine the optimal solution.
(5.2) If rand < P_m, the elastic collision variation is carried out on some new positions according to (26) to obtain a new x_ECSOA,G; the range of x_ECSOA,G is checked and modified to lie within (x_min, x_max). (The other improvement strategies, such as the empirical value parameter adaptive transformation formula (10), the refraction reverse learning formula (19), the Levy variation formula (14), the mutually beneficial factor formula (21), and the Cauchy variation formula (23), are updated according to the corresponding formula.) Then calculate the fitness and determine the optimal solution. End if.
(6) End while; output the optimal solution.

Complexity Analysis.
The computational complexity of the SOA is O(NDM), where N represents the total number of individuals, D represents the number of dimensions, and M represents the maximum number of generations. For the ECSOA [42], if the number of generations is large (M ≫ N, D), the computational complexity is also O(NDM). Therefore, the overall computational complexity of the ECSOA is almost the same as that of the basic SOA.

(Parameter settings: SOA [28]: maximum membership degree value U_max = 0.95, minimum membership degree value U_min = 0.0111, maximum inertia weight W_max = 0.9, minimum inertia weight W_min = 0.1 (the same as Table 2). ECSOA: U_max = 0.95, U_min = 0.0111, W_max = 0.9, W_min = 0.1, empirical value W = 0.2, elastic collision probability P_e = 0.8 (the same as Table 2).)

Statistical Testing of Algorithms on the Benchmark Functions.

Using Wilcoxon's rank-sum test [43], we can discover significant differences between two algorithms. This test uses a significance level of p < 0.05. Table 6 shows the results of the statistical testing, where N/A marks the best algorithm. From Table 6, the ECSOA performs well on all fifteen functions. Therefore, the ECSOA is better than the other algorithms.
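The rank-sum test used above can be sketched with a self-contained normal approximation (in practice a statistics package would be used). The implementation below assigns average ranks to ties and omits the continuity correction; the synthetic sample data are illustrative, not the paper's results.

```python
import math

def rank_sum_test(a, b):
    """Two-sided Wilcoxon rank-sum test via the normal approximation."""
    data = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    ranks = [0.0] * len(data)
    i = 0
    while i < len(data):                      # assign average ranks to ties
        j = i
        while j + 1 < len(data) and data[j + 1][0] == data[i][0]:
            j += 1
        for k in range(i, j + 1):
            ranks[k] = (i + j) / 2.0 + 1.0    # ranks are 1-based
        i = j + 1
    w = sum(r for r, (_, g) in zip(ranks, data) if g == 0)   # rank sum of sample a
    n1, n2 = len(a), len(b)
    mean = n1 * (n1 + n2 + 1) / 2.0
    var = n1 * n2 * (n1 + n2 + 1) / 12.0
    z = (w - mean) / math.sqrt(var)
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p
```

A p-value below 0.05 indicates that the two sets of 30 final fitness values come from significantly different distributions, which is how Table 6 separates the algorithms.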

Run Time Comparison of Algorithms on the Benchmark Functions.

In this subsection, the running time of the algorithms for each function is recorded under the same conditions: a population number of 30, 1000 generations, and 30 independent runs of the fifteen benchmark functions f1-f15 (D = 100). Then, the running times of the fifteen functions are summed to obtain the total of the 30 independent running times of each algorithm, along with the ranking of the total time, as shown in Table 7. As seen from Table 7, the SCA has the shortest program running time, followed by the PSO. The ECSOA ranks fifth, with a relatively long program running time. At the bottom of the list is the SA_GA, which takes the most running time.

To learn more about the program running times of the seven algorithms on the fifteen functions, the bar chart in Figure 4 shows the total time of each algorithm after 30 independent runs. From Figure 4, the running time of the ECSOA is less than that of the SA_GA and GSA; the SCA's is the least and the SA_GA's is the most; the ECSOA's is less than one-sixth that of the SA_GA but nearly four times that of the SCA, which is relatively large.

Performance Profiles of Algorithms on the Benchmark Functions.

The average fitness is selected as the capability index. The algorithmic capability is expressed in performance profiles, calculated by the following formulas, where g represents an algorithm; G is the set of algorithms; f is a function; F is the set of functions; n_g is the number of algorithms in the experiment; n_f is the number of functions in the experiment; µ_f,g is the average fitness obtained by algorithm g on function f; r_f,g is the performance ratio; ρ_g is the algorithmic capability; and τ is a factor of the best probability [44]. Figure 5 shows the performance ratios of the average values for the seven algorithms on the benchmark functions f1-f15 (D = 100), displayed on a log2 scale. As shown in Figure 5, the ECSOA has the highest probability. When τ = 1, the ECSOA is about 0.8, better than the others. When τ = 8, the ECSOA is the winner on the given test functions: the ECSOA is 1, the PSO is 0.67, the SA_GA is 0.067, the SCA is 0.3, the GSA is 0.3, the MVO is 0.2, and the SOA is 0.33. Regarding the performance curves, the ECSOA is the best; the ECSOA achieves 100% when τ ≥ 8. Thus, the performance of the ECSOA is better than that of the other algorithms.
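The performance profile described above can be computed directly from the table of average fitness values. This is a sketch in the Dolan-More style referenced by [44], assuming positive average fitness values so the ratios are well defined.

```python
def performance_profile(mu, tau):
    """Performance profile rho_g(tau) = |{f : r_fg <= tau}| / n_f,
    with performance ratio r_fg = mu_fg / min_g mu_fg.

    mu: dict mapping algorithm name -> list of average fitness values,
        one per function (lower is better, assumed positive)."""
    algs = list(mu)
    n_f = len(next(iter(mu.values())))
    rho = {}
    for g in algs:
        wins = 0
        for f in range(n_f):
            best = min(mu[a][f] for a in algs)   # best average fitness on f
            if mu[g][f] / best <= tau:           # performance ratio r_fg
                wins += 1
        rho[g] = wins / n_f
    return rho
```

At τ = 1, ρ_g is the fraction of functions on which algorithm g is the outright best; as τ grows, ρ_g rises toward 1 for every algorithm.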

Algorithm Performance Comparison in PID Controller Parameter Optimization Problems.

In this subsection, we use four control system PID parameter optimization models to test the capability of the ECSOA. For g1-g3, the population number of all algorithms is 20 and the maximum number of generations is 20; the g1-g2 step response time is set to 10 s, and the g3 step response time is set to 30 s. For g4, the population number of all algorithms is 50, the maximum number of generations is 50, and the step response time is set to 50 s. Equations (30)-(33) show the test control system models for optimizing PID parameters used in our experiments. Figure 6 shows the process diagram for optimizing the test control systems' PID parameters by the ECSOA. Figure 7 shows the PID parameter optimization model structure of the control system. The results are shown in Table 8. Boldface indicates the better optimal result.

Results Comparison of Algorithms in the PID Controller Parameter Optimization.

For the PID controller parameter optimization problems, according to Table 8, as to the best fitness, the ECSOA is better than the others except on g3 and g4: the best fitness of the ECSOA for the g3 model is worse only than that of the SA_GA, and the best fitness of the ECSOA for the g4 model is worse only than that of the PSO. As to the standard deviation results, for the g1 model, the ECSOA is worse only than the SA_GA, SCA, and MVO; for the g2 and g3 models, the ECSOA is worse only than the SA_GA; and for the g4 model, the ECSOA is worse only than the MVO. As to the mean test results, the ECSOA is better than the others except on g1 and g4: for the g1 model, the ECSOA is worse only than the SCA, and for the g4 model, the ECSOA is worse only than the MVO. According to the best fitness, mean rank, and overall rank results from Table 8, the ECSOA can find solutions and has very strong robustness for the PID controller parameter optimization problems.

Convergence Curves Comparison of Algorithms in PID Controller Parameter Optimization.

Figure 8 shows the fitness curves of the PID controller parameter optimization for g1-g4. The comparison between the seven algorithms in Figure 8 shows that the ECSOA converges fast and has the best precision. The ECSOA can find the optimal value.

ANOVA Tests Comparison of Algorithms in PID Controller Parameter Optimization.

Figure 9 shows the ANOVA of the global best values of the PID controller parameter optimization for g1-g4. As seen from Figure 9, the ECSOA is the most robust algorithm.

Unit Step Function PID Controller Parameter Optimization.

Figure 10 shows the unit step responses of the PID controller parameter optimization for g1-g4. As seen from Figure 10, when the ECSOA is used to optimize the PID controller parameters of g1-g4, the unit step responses stabilize very quickly and accurately. Therefore, the ECSOA is an effective and feasible solution for the control system PID parameter optimization models.


Algorithm Performance Comparison in Constrained Engineering Optimization Problems.

We use six constrained engineering problems to further test the capability of the ECSOA. These constrained engineering problems are very popular in the literature. The penalty function method is used to handle the constraints. The parameter settings of all the heuristic algorithms still follow Table 4 of Section 4.2.3. The formulations of these problems are available in the appendix.
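The penalty function approach mentioned above can be sketched in its simplest static form. The quadratic penalty and the coefficient value are assumptions for illustration; the paper's exact penalty coefficients are not given in this excerpt.

```python
def penalized(obj, constraints, x, rho=1e6):
    """Static quadratic penalty for inequality constraints g(x) <= 0.
    Feasible points keep their objective value; infeasible ones are
    penalized in proportion to the squared violation."""
    violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
    return obj(x) + rho * violation
```

Any of the heuristic algorithms can then minimize `penalized(...)` as an unconstrained objective; infeasible candidates receive such large values that the search is steered back into the feasible region.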

Welded Beam Design Problem.
This is a minimum fabrication cost problem with four parameters and seven constraints. The parameters of the structural system are shown in Figure 11 [7]. Some of the compared results are taken from other literature as follows: GSA [6], MFO [7], MVO [9], CPSO [45], and HS [46]. For this problem, the ECSOA is compared to the PSO, SA_GA, GSA, SCA, MVO, and SOA, and the best obtained values are provided in Table 9.
In Table 9, the ECSOA is better than the GSA, MFO, MVO, GA, CPSO, and HS algorithms from the other literature. The ECSOA is also better than the PSO, SA_GA, GSA, SCA, MVO, and SOA. Therefore, the ECSOA can solve this problem.

Pressure Vessel Design Problem.
This is also a minimum fabrication cost problem, with four parameters and four constraints.
The parameters of the structural system are shown in Figure 12 [7]. Some of the compared results are taken from other literature as follows: MFO [7], ES [47], DE [48], ACO [49], and GA [50]. For this problem, the ECSOA is compared to the PSO, SA_GA, GSA, SCA, MVO, and SOA, and the best obtained values are provided in Table 10.

For this problem, the ECSOA is better than the MFO, ES, DE, ACO, and GA algorithms from the other literature. The ECSOA is also better than the PSO, SA_GA, GSA, SCA, and MVO. There is not much difference between the optimal value of the ECSOA and that of the SOA. Therefore, the ECSOA can solve this problem.

Cantilever Beam Design Problem.
This is a problem determined by five parameters, subject only to range constraints on the variables. The parameters of the structural system are shown in Figure 13 [7]. Some of the compared results are taken from other literature as follows: MFO [7], CS [51], GCA [52], MMA [52], and SOS [53]. For this problem, the ECSOA is compared to the PSO, SA_GA, GSA, SCA, MVO, and SOA, and the best obtained values are provided in Table 11.
In Table 11, the ECSOA proves to be better than the MFO, CS, GCA, MMA, and SOS algorithms from the other literature. The ECSOA is also better than the PSO, SA_GA, GSA, SCA, and MVO. There is not much difference between the optimal value of the ECSOA and that of the SOA. Therefore, the ECSOA can solve this problem.

Gear Train Design Problem.
This is a minimum gear ratio problem, with four variables subject only to range constraints. Figure 14 is the schematic diagram [7]. Some of the compared results are taken from other literature as follows: MFO [7], MVO [9], CS [51], ABC [54], and MBA [54]. For this problem, the ECSOA is compared to the PSO, SA_GA, GSA, SCA, MVO, and SOA, and the best obtained values are provided in Table 12.
In Table 12, the ECSOA proves to be better than the MFO, MVO, CS, ABC, and MBA algorithms from the other literature. The ECSOA is also better than the SCA, the MVO, and the SOA, though not the SA_GA, GSA, or PSO. The result of the ECSOA reaches the theoretical best solution, although its optimum is worse than that of the SA_GA, GSA, and PSO, and the ECSOA finds a new value. Therefore, the ECSOA can solve this problem.

Three-Bar Truss Design Problem.
This is a weight minimization problem under stress constraints, with two variables subject only to range constraints. Figure 15 is the schematic diagram of the components [7]. Some of the compared results are taken from other literature as follows: MFO [7], MVO [9], CS [51], MBA [54], and DEDS [55]. For this problem, the ECSOA is compared to the PSO, SA_GA, GSA, SCA, MVO, and SOA, and the best values are provided in Table 13.
In Table 13, except for the MVO and the PSO, the ECSOA is better than the other algorithms. The best value of the ECSOA reaches the theoretical best solution, although its optimum is worse than that of the MVO and the PSO. Therefore, the ECSOA can solve this problem.
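In the standard three-bar truss formulation (load P = 2, allowable stress σ = 2, and bar length l = 100, as commonly used alongside [7]), the weight and the three stress constraints can be sketched as follows; the design point is a commonly reported near-optimum, quoted for illustration:

```python
import math

SQ2 = math.sqrt(2.0)
P, SIGMA, L = 2.0, 2.0, 100.0  # standard load, stress limit, and bar length

def truss_weight(x1, x2):
    """Structural weight: two diagonal bars of area x1 plus one vertical
    bar of area x2, all of length L."""
    return (2.0 * SQ2 * x1 + x2) * L

def truss_constraints(x1, x2):
    """Three stress constraints, each g(x) <= 0 when the design is feasible."""
    d = SQ2 * x1 * x1 + 2.0 * x1 * x2
    g1 = (SQ2 * x1 + x2) / d * P - SIGMA
    g2 = x2 / d * P - SIGMA
    g3 = 1.0 / (x1 + SQ2 * x2) * P - SIGMA
    return g1, g2, g3

# Commonly reported near-optimal cross-sectional areas:
x1, x2 = 0.78867, 0.40825
print(truss_weight(x1, x2))                 # close to 263.895
print(truss_constraints(x1, x2))            # g1 approximately active, g2/g3 slack
```

The first stress constraint is active at the optimum, which matches the intuition that the best truss carries the load at exactly the allowable stress.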

I-Beam Design Problem.
This is a vertical deflection minimization problem that has four variables and one constraint. Figure 16 shows the design diagram [7]. Some of the compared algorithms are taken from other literature as follows: MFO [7], CS [51], SOS [53], IARSM [56], and ARSM [56]. For this problem, the ECSOA is compared to the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are provided in Table 14.
In Table 14, except for the MFO, GSA, SOA, and SA_GA, the ECSOA is better than the other algorithms. The fitness of the MFO is the best. Although the smallest vertical deflection of the ECSOA is not as good as that of the GSA, the SOA, and the SA_GA, it is very close to these relative optima. Therefore, the ECSOA is an effective and feasible solution to the I-beam design optimization problem.
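The I-beam objective is the beam's vertical deflection, which is inversely proportional to the cross section's moment of inertia; with the load, span, and elastic modulus of the standard formulation [7] the objective collapses to 5000/I. The sketch below encodes this objective only (the area constraint is omitted), and the sample dimensions are illustrative, not claimed optimal:

```python
def ibeam_deflection(b, h, tw, tf):
    """Vertical deflection objective for the I-beam problem. The moment of
    inertia I combines the web term and the two flange terms (parallel-axis
    theorem); the constant 5000 absorbs the load, span, and modulus of the
    standard formulation."""
    inertia = (tw * (h - 2.0 * tf) ** 3) / 12.0 \
        + (b * tf ** 3) / 6.0 \
        + 2.0 * b * tf * ((h - tf) / 2.0) ** 2
    return 5000.0 / inertia

# Illustrative (not claimed optimal) dimensions: flange width b, section
# height h, web thickness tw, flange thickness tf.
print(ibeam_deflection(50.0, 80.0, 1.0, 2.0))
```

As expected from the 5000/I form, deepening the section (larger h) grows the inertia and shrinks the deflection, which is why the optimizers push h toward its upper bound.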
In brief, the ECSOA proves to be better than the other algorithms in most of these practical studies. The ECSOA can resolve these practical problems.

Figure 13: Cantilever beam design problem.
Figure 14: Gear train design problem.
Figure 15: Three-bar truss design problem.

Conclusion
An ECSOA is presented with a completely elastic collision, a completely inelastic collision, and a noncompletely elastic collision method. The ECSOA was tested in four phases from different perspectives: improvements to the SOA, benchmark function optimization, PID control parameter optimization problems, and constrained engineering problems.
In the first phase, the SOA is improved in seven different ways: the SOA based on parameter change (PCSOA), the SOA based on parameter adaptive Gaussian transform (PAGTSOA), the SOA based on Levy variation (LVSOA), the SOA based on the refraction reverse learning mechanism (RRLOOA), the SOA based on the mutual benefit factor strategy (MBFSOA), the SOA based on Cauchy variation (CVSOA), and the SOA based on elastic collision (ECSOA). Each improved algorithm was tested on the fifteen benchmark functions. The result is that the ECSOA is feasible on the benchmark functions. In this phase, over 30 independent runs, we consider the mean values, the standard deviation values, the best fitness values, the best-fitness ranks, the convergence curves, and the variance tests for the global minimum values.
In the second phase, fifteen benchmark function optimization problems are used to test the ECSOA further. The ECSOA is compared to the PSO, SA_GA, GSA, SCA, MVO, and SOA for verification. It was observed that the ECSOA is feasible and competitive on benchmark functions. The second test phase also considers, over 30 independent runs, the mean values, standard deviation values, best fitness values, best-fitness ranks, convergence curves, and variance tests for the global minimum values. For the benchmark function optimization problems, the complexity of the ECSOA is analyzed, and the overall computational complexity of the ECSOA is almost the same as that of the basic SOA. Wilcoxon's rank-sum test was applied, and the ECSOA proves to be better than the other six algorithms. Based on the run-time comparison of the seven algorithms on the benchmark functions, the ECSOA has a relatively long program running time and is not optimal in this respect. From the performance ratios of the average solutions of the seven algorithms, the optimization probability of the ECSOA is the highest.

Figure 16: I-beam design problem.
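Wilcoxon's rank-sum test mentioned above can be computed directly on the per-algorithm sets of run results (e.g., 30 best-fitness values per algorithm); below is a minimal pure-Python sketch of the normal-approximation z statistic, with average ranks for ties and no continuity correction. It is an illustrative implementation, not the paper's exact procedure:

```python
import math

def rank_sum_z(a, b):
    """Normal-approximation z statistic for Wilcoxon's rank-sum test
    comparing two samples a and b. Ties receive average ranks."""
    combined = sorted((v, i) for i, v in enumerate(a + b))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        # Extend j over a group of tied values.
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # average 1-based rank for the tie group
        for k in range(i, j + 1):
            ranks[combined[k][1]] = avg
        i = j + 1
    n1, n2 = len(a), len(b)
    w = sum(ranks[:n1])                         # rank sum of sample a
    mu = n1 * (n1 + n2 + 1) / 2.0               # mean of W under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (w - mu) / sigma
```

A z value far from zero (e.g., |z| > 1.96 at the 5% level) indicates that one algorithm's results rank systematically lower (better, for minimization) than the other's.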
In the third phase, four PID control parameter optimization models were used to further test the ECSOA in practice. The problems were a parameter optimization model of a second-order PID controller without time delay, a parameter optimization model of a PID controller with a first-order micro delay, a parameter optimization model of a first-order PID controller with a significant time delay, and a parameter optimization model of a high-order PID controller without time delay. The third test phase also considered, over 30 independent runs, the mean values, standard deviation values, best fitness values, best-fitness ranks, convergence curves, and ANOVA results. On the PID parameter optimization problems, the ECSOA was compared to the various algorithms. The results show that the ECSOA is effective and feasible on practical problems.
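The fitness in such PID models is typically an error-integral criterion evaluated by simulating the closed loop. As a hedged sketch (the second-order plant constants, the ITAE criterion, and the forward-Euler integration below are illustrative assumptions, not the paper's exact models), a gain triple (kp, ki, kd) can be scored as follows:

```python
def pid_itae(kp, ki, kd, wn=1.0, zeta=0.5, t_end=20.0, dt=0.01):
    """ITAE fitness (integral of time-weighted absolute error) for a PID
    controller driving a generic second-order plant
        y'' + 2*zeta*wn*y' + wn^2 * y = wn^2 * u
    toward a unit step setpoint, integrated with the forward Euler method."""
    y, yd = 0.0, 0.0        # plant output and its derivative
    integ = 0.0             # integral of the error
    prev_e = 1.0            # e(0) = setpoint - y(0)
    itae, t = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        e = 1.0 - y
        integ += e * dt
        deriv = (e - prev_e) / dt
        u = kp * e + ki * integ + kd * deriv      # PID control law
        ydd = wn * wn * u - 2.0 * zeta * wn * yd - wn * wn * y
        yd += ydd * dt
        y += yd * dt
        itae += t * abs(e) * dt                   # time-weighted error
        t += dt
        prev_e = e
    return itae
```

An optimizer such as the ECSOA would minimize `pid_itae` over (kp, ki, kd); well-tuned gains yield a far lower ITAE than a weak proportional-only controller.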
Eventually, in the last phase, six engineering problems further tested the ECSOA. The ECSOA was compared to the various algorithms. The results prove that the ECSOA is a highly competitive algorithm for practical optimization problems.
According to the comparative analysis of the experiments, the conclusions are as follows: (1) The elastic collision strategy includes the completely elastic collision, the completely inelastic collision, and the noncompletely elastic collision. The three different situations of the elastic collision strategy tend to generate random seekers, increase the diversity of the seekers, enlarge the search space, and avoid premature convergence.
(3) Among the seven algorithms (PSO, SA_GA, GSA, SCA, MVO, SOA, and ECSOA), the ECSOA has the highest optimization capability on the benchmark functions. (4) On the benchmark functions, the ECSOA has almost the same computational complexity as the SOA. (5) The running time of the ECSOA on the benchmark functions is relatively long; among the seven compared algorithms, its running time is only better than that of the SA_GA. (6) The ECSOA can solve challenging real problems, such as the PID control parameter optimization problems and the classical constrained engineering optimization problems.
(7) Further improvements and applications can be incorporated into future studies. The improved SOA and the heuristic algorithms based on these improved strategies can be applied not only to engineering optimization problems but also to path planning problems, pattern recognition, intelligent control, and other fields, as well as to many practical application optimization problems that cannot be solved by traditional methods. Besides the methods used in this paper, some representative computational intelligence algorithms can be used to solve these problems, such as the MBO, EHO, MS, SMA, and HHO.
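The three collision situations of conclusion (1) can all be expressed through the restitution coefficient e of one-dimensional collision mechanics; the sketch below (the default unit masses and the component-wise application to seeker positions are illustrative assumptions, not the paper's exact operator) recovers the completely elastic case at e = 1 and the completely inelastic case at e = 0:

```python
def collide(p1, p2, m1=1.0, m2=1.0, e=1.0):
    """Component-wise collision update for two seeker position vectors
    p1, p2 (lists of floats). e is the restitution coefficient: e = 1 gives
    a completely elastic collision, e = 0 a completely inelastic one, and
    0 < e < 1 a noncompletely elastic one. Momentum m1*v1 + m2*v2 is
    conserved in every case."""
    msum = m1 + m2
    new1, new2 = [], []
    for v1, v2 in zip(p1, p2):
        new1.append((m1 * v1 + m2 * v2 + e * m2 * (v2 - v1)) / msum)
        new2.append((m1 * v1 + m2 * v2 + e * m1 * (v1 - v2)) / msum)
    return new1, new2
```

With equal masses, e = 1 simply exchanges the two positions, e = 0 merges both at their midpoint, and intermediate e places the offspring between these extremes, which is what injects the diversity described above.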

Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.