A Modified Salp Swarm Algorithm Based on the Perturbation Weight for Global Optimization Problems

Metaheuristic algorithms are often applied to global function optimization problems. To overcome the poor real-time performance and low precision of the basic salp swarm algorithm, this paper introduces a novel hybrid algorithm inspired by the perturbation weight mechanism. The proposed perturbation weight salp swarm algorithm has the advantages of a broad search scope and a strong balance between exploration and exploitation, and it retains a relatively low computational complexity when dealing with numerous large-scale problems. A new coefficient factor is introduced to the basic salp swarm algorithm, and new update strategies for the leader position and the followers are introduced in the search phase. The new leader position updating strategy has a specific bounded scope and strong search performance, thus accelerating the iteration process. The new follower updating strategy maintains the diversity of feasible solutions while reducing the computational load. This paper describes the application of the proposed algorithm to low-dimension and variable-dimension functions. This paper also presents iteration curves, box-plot charts, and search-path graphics to verify the accuracy of the proposed algorithm. The experimental results demonstrate that the perturbation weight salp swarm algorithm offers a better search speed and search balance than the basic salp swarm algorithm in different environments.


Introduction
In many engineering fields, there are numerous optimization problems that must be solved under complicated constraints, over large search domains, and with high complexity [1][2][3]. Traditional mathematical strategies, such as the steepest descent method and the variable scale method, can only handle simple and continuous functions [4,5]. Thus, complex features such as nonlinearity, multiple variables, multiple constraints, and multiple dimensions require new optimization strategies that have strong calculation abilities and a high degree of precision [6][7][8]. Intelligent metaheuristic algorithms have received considerable attention from researchers, and rapid improvements in such techniques have been made in recent years as a result of their widespread utilization, enhanced computational technologies, high practicability, and fault-tolerant abilities [9][10][11][12][13].
Optimization algorithms have strong prospects in numerous practical industrial fields and theoretical mathematics applications, such as global numerical optimization [14], path planning [15], clustering analysis [16], 0-1 knapsack problems [17], image segmentation [18], PID tuning [19], obstacle avoidance for robotic manipulators [20], and feature selection [21]. All of these areas need algorithms to obtain more precise parameters. In recent years, scholars have proposed many advanced metaheuristic algorithms, such as monarch butterfly optimization (MBO) [22], the beetle antennae search algorithm (BAS) [23], the earthworm optimization algorithm (EWA) [24], elephant herding optimization (EHO) [25], the crow search algorithm (CSA) [26], and the moth search algorithm (MS) [27]. MBO, which is mainly determined by the migration operator and the butterfly adjusting operator, is ideally suited for parallel searching and is well capable of balancing the trade-off between intensification and diversification. BAS not only has individual and environmental recognition abilities but also has a simple implementation. In EWA, the addition of a Cauchy mutation helps certain earthworms escape from local optima, enhances the search ability of the algorithm, and helps the whole earthworm population proceed to a better position. EHO is divided into two operators, the clan updating operator and the separating operator; the worst elephant position is replaced by a randomly generated position, which can significantly accelerate the convergence speed while avoiding premature and local convergence. CSA applies a population of seekers to explore the search space; by using a population, the probability of finding a feasible position and escaping from local optima increases. The MS search process can be seen as a balance of exploitation and exploration, and MS shows good performance and effectiveness.
The salp swarm algorithm (SSA), a nature-inspired metaheuristic algorithm proposed by Mirjalili et al. in 2017 [28], displays promising performance on global optimization functions. SSA imitates the living and predation habits of salps, and the mathematical model of SSA divides the population into two groups: one leader and the followers. The leader is the first salp at the front of the salp chain, whereas the other salps can be seen as followers. The leader indirectly guides the salp swarm to follow each other. SSA has exploration and local optima avoidance abilities, which originate from the fact that salps tend to interact with each other, so salps do not gravitate towards a local feasible solution easily. The salp chain makes SSA explore the search space and gradually move to the global optimum, which demonstrates the superior exploitation of SSA. SSA converges towards the food position in proportion to the iteration number because the connections behind the leader also pull the other salps towards the food position. In addition, it is observed that SSA can balance exploration and exploitation. Owing to its distinguishing characteristics, including simple code and easy implementation, it has become one of the hottest study areas in the algorithm field, with applications such as node localization in wireless sensor networks [29], Takagi-Sugeno fuzzy logic controller design [30], extreme learning machine optimization [31], the design of IIR wideband digital differentiators and integrators [32], parameter identification of photovoltaic cell models [33], parameter extraction for PEM fuel cells [34], passive sonar target classification [35], airfoil-based Savonius wind turbine optimization [36], model predictive controller design [37], and soil water retention curve parameter estimation [38]. There are different salp swarm algorithm variants that are used in many areas. Wan et al.
[39] proposed an MPPT controller obtained by combining the salp swarm algorithm with the grey wolf optimizer. Gao et al. [40] combined SSA with quantum swarm intelligence and proposed the quantum salp swarm algorithm to solve Nakagami-m quantile functions. Xing and Jia [41] introduced the Lévy flight salp swarm algorithm, which can eliminate the problem of getting stuck in local optima, and applied the proposed algorithm to multilevel color image segmentation. The study in [42] designed an advanced Lévy flight salp swarm algorithm for hydraulic systems. Majhi et al. [43] drafted a chaotic salp swarm algorithm based on the fire neural model and quadratic integration. Neggaz et al. [44] improved the leaders of the salp swarm algorithm using the sine cosine algorithm and a disrupt operator, in which the leader position is updated by the sine cosine algorithm and the disrupt operator is then applied. The study in [45] applied variants of the moth-flame optimization (MFO) algorithm to weaken the limitations of basic SSA; the proposed algorithm is called SSAMFO. Ibrahim et al. [46] devised SSAPSO, a hybridization of SSA and PSO, in which the exploration and exploitation steps of SSA are improved. Panda and Majhi [47] introduced an improved version of SSA, which boosts the search performance of SSA by using space transformation search. In [48], a new binary SSA version called BSSA was drafted based on an arctan transformation. In [49], a novel hybrid algorithm based on SSA and chaos theory was proposed, and the capability of the proposed algorithm in finding an optimal feature subset can enhance classification accuracy. Wu et al. [50] proposed an improved salp swarm algorithm based on a weight factor and adaptive mutation, and test results showed good convergence performance and the ability to escape local optima when compared with basic SSA.
Xiang et al. [51] proposed a modified salp swarm algorithm called the polar coordinate salp swarm algorithm (PSSA), which is inspired by the spiral aggregation chain and foraging trajectory of salps. Hegazy et al. [52] added a new control parameter to basic SSA to adjust the present best solution; the new method is called the improved salp swarm algorithm (ISSA). Qais et al. [53] introduced an enhanced salp swarm algorithm (ESSA) to improve the power point tracking and the fault ride-through capability of a grid-tied permanent magnet synchronous generator driven by a variable speed wind turbine. In basic SSA, the leader salp searches for the best solution using the difference between the lower and upper search bounds, which means that the neighborhood of the local optimum cannot be sufficiently exploited during the optimization procedure. The expression factor with a fixed coefficient is an exponential function, which degenerates in the later iteration phase and causes premature convergence. The followers update their next positions using their neighbors' positions; this single updating mechanism is unfavorable in terms of algorithm diversity. To overcome the above problems and enhance the performance of SSA, this paper describes the perturbation weight salp swarm algorithm (PWSSA). A new coefficient factor, a new leader position updating strategy, and a new follower updating strategy are added to the basic salp swarm algorithm. PWSSA uses the perturbation weight mechanism to perturb the distance between the best solution and each candidate solution and applies an asymptotic circular search mechanism to find a better leader position at a faster speed. Follower positions change more and more slightly with increasing iterations in PWSSA. PWSSA balances the leader position and the follower positions and weakens the exponential explosion problem of basic SSA.
As a result, the effectiveness of the search orientation is significantly enhanced. For the experiments and discussion, this paper applies different algorithms to different function experiments, including low-dimension functions and variable-dimension functions, and iteration curves, box-plot charts, and search-path graphics are given to show the strong search performance of PWSSA. All experimental results demonstrate that the proposed algorithm has stronger search accuracy and a better exploration balance than the basic salp swarm algorithm. The rest of this paper is organized as follows: in Section 2, the basic salp swarm algorithm is presented. In Section 3, the perturbation weight salp swarm algorithm is proposed. Experimental parameters, experimental environments, results, and discussion are given in Section 4. In Section 5, the conclusion is drawn.

Salp Swarm Algorithm
The salp is a transparent marine organism similar to a jellyfish. Mirjalili et al. introduced the salp swarm algorithm based on the salp predation strategy, which is a chain-like behavior relying on the chain mechanism of the group. SSA, which uses this chain behavior to find the optimal solution, is one of the evolutionary metaheuristic algorithms. In the salp swarm, all salps are divided into two parts: a leader and followers. The salp at the front of the salp chain is the leader, whereas the other salps can be seen as followers. In the salp predation mechanism, the leader at the front of the chain guides the followers to search for food, and all followers, each following the salp immediately ahead, deliver food signals to maintain the flexible chain shape.
In this paper, each salp position is set to find food in an N × D search space, where N represents the population size and D represents the search dimension. Hence, the i-th salp position x_d^i (i = 1, 2, ..., N; d = 1, 2, ..., D) is defined in the d-th dimension. The leader position x_d^1 (d = 1, 2, ..., D) is assigned in the D-dimensional search space. The food source, which can be seen as the best solution of the function, is also assumed to be present in the search area and is targeted by the salp chain. The leader updates its position according to the food source position:

x_d^1 = F_d + c_1((ub_d − lb_d)c_2 + lb_d), if c_3 ≥ p,
x_d^1 = F_d − c_1((ub_d − lb_d)c_2 + lb_d), if c_3 < p,

where x_d^1 is the leader position, F_d is the food position, ub_d is the upper bound of the d-th dimension of the search space, and lb_d is the lower bound. Parameters p, c_2, and c_3 are random numbers uniformly drawn from the interval [0, 1]. The parameter c_1 indicates the expression coefficient:

c_1 = 2e^{−(4t/T)^2},

where t is the current iteration number, T is the maximum number of iterations, and e is the natural base.
In each search procedure, each follower tracks the leader position by following the other followers. Each follower position is defined as follows:

x_d^i = (x_d^i + x_d^{i−1})/2,

where i ≥ 2, and x_d^i and x_d^{i−1} denote the i-th follower position and its neighbor position in the d-th dimension. The main steps of basic SSA are summarized in the pseudocode of Algorithm 1.
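The update rules above can be sketched in Python as follows. This is a minimal illustrative implementation, not the authors' code: the threshold p = 0.5, the default population settings, and the sphere objective in the usage line are all illustrative assumptions.

```python
import math
import random

def ssa(obj, lb, ub, n=30, dim=2, max_iter=200, p=0.5):
    """Minimal sketch of the basic salp swarm algorithm."""
    # Initialize salp positions uniformly inside [lb, ub]
    X = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(n)]
    food, food_val = None, float("inf")
    for t in range(1, max_iter + 1):
        # Track the best position found so far (the food source F)
        for x in X:
            v = obj(x)
            if v < food_val:
                food_val, food = v, x[:]
        c1 = 2 * math.exp(-(4 * t / max_iter) ** 2)  # expression coefficient
        for i in range(n):
            for d in range(dim):
                if i == 0:  # leader: move around the food source
                    c2, c3 = random.random(), random.random()
                    step = c1 * ((ub - lb) * c2 + lb)
                    X[i][d] = food[d] + step if c3 >= p else food[d] - step
                else:       # follower: average with the salp ahead of it
                    X[i][d] = (X[i][d] + X[i - 1][d]) / 2
                X[i][d] = min(max(X[i][d], lb), ub)  # clamp to bounds
    return food, food_val

best, val = ssa(lambda x: sum(v * v for v in x), lb=-10, ub=10)
```

On a smooth objective such as the sphere function, the shrinking coefficient c_1 makes the leader's steps around the food source contract over the iterations, so the whole chain collapses towards the best position found.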

Perturbation Weight Salp Swarm Algorithm
In SSA, the leader guides the followers to find food according to the difference between the lower and upper search bounds, but if the problem has a large-scale optimization field, the large search scope means that local search is insufficient, and the neighborhood information near the local optimum is insufficiently applied. The expression factor c_1 is an exponential function with a fixed coefficient, and the exponential term degenerates in the later iterations; therefore, the leader search strategy suffers from premature convergence and low search precision. The parameter c_2 is randomly selected in the range [0, 1], which is not suitable for high-precision search in the later iterations. The positions of the other salps are updated by the average position of each follower and its neighbor. The updated follower positions have a single direction and a degree of blindness, which causes SSA to fall into local extrema and wastes iterations. To obtain a better search strategy and avoid the blindness of the search process, this paper adds a variable perturbation weight mechanism to basic SSA, and the proposed algorithm is called the perturbation weight salp swarm algorithm (PWSSA). The perturbation weight mechanism works by perturbing the distance between the optimum solution and the population solutions. The search range is regulated by applying asymptotic circular searching to obtain a leader search strategy with faster speed and better balance. The follower positions improve, and the position adjustment changes more and more slightly with increasing iterations. The perturbation weight mechanism helps SSA obtain the optimum solution more reliably. The new factors c_1 and c_2 are updated by equations (5) and (6), where u_1 and u_2 follow the standard normal distribution, u_1 ∼ N(0, 1) and u_2 ∼ N(0, 1).
The standard normal distribution has the advantages of concentration, symmetry, and uniform variability. The new leader position is updated by equation (7). To increase the diversity of follower positions, a multidirectional crossing search strategy is added to basic SSA in equation (8), where w_1, w_2, and w_3 are random parameters in the range [−1, 1]. The specific steps of PWSSA are described as follows:

Step 1. Set the salp population size N and the search dimension D. Define the maximum number of iterations T. Determine the probability coefficient p. Let t = 0. Each i-th salp position can be written as x_d^i (i = 1, 2, ..., N; d = 1, 2, ..., D).
Step 2. Begin the iteration. Judge whether i is equal to one. If i is equal to one, jump to Step 3. If i is not equal to one, jump to Step 4.
Step 3. Use equation (5) to compute the factor c_{1,new}. Use equation (6) to compute the factor c_{2,new}. Set c_3 in the range [0, 1]. Compute the current optimal solution F_d. Judge whether c_3 is larger than p. If c_3 is larger than p, the leader position is given by the first part of equation (7); otherwise, the leader position is given by the second part of equation (7).
Step 4. Update each follower position according to equation (8).

Step 5. Record the global optimal solution. If there is a better solution, replace F_d.
Step 6. Set t = t + 1. Judge whether the current iteration t is equal to the maximum number of iterations T; if t is equal to T, stop the iteration. If not, jump to Step 2. The main steps of PWSSA are summarized in the pseudocode shown in Algorithm 2, and the PWSSA flow chart is shown in Figure 1.
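The step sequence above can be sketched as a control-flow skeleton. Because equations (5)-(8) appear only in the paper's displayed formulas, the factor and position updates below are hypothetical placeholders that merely respect the stated ingredients (normally distributed u ∼ N(0, 1), weights in [−1, 1]); they are not the paper's actual equations.

```python
import math
import random

def pwssa_skeleton(obj, lb, ub, n=20, dim=2, max_iter=100, p=0.5):
    """Control-flow skeleton of PWSSA Steps 1-6; update rules are placeholders."""
    # Step 1: initialize the population and parameters
    X = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(n)]
    food, food_val = X[0][:], obj(X[0])
    for t in range(max_iter):                     # Step 2: begin the iteration
        for x in X:                               # Step 5: record the global optimum
            v = obj(x)
            if v < food_val:
                food_val, food = v, x[:]
        # Step 3: perturbation-weight factors (placeholder forms, u ~ N(0, 1))
        c1 = 2 * math.exp(-(4 * (t + 1) / max_iter) ** 2)
        c1_new = c1 * abs(random.gauss(0, 1))
        for i in range(n):
            for d in range(dim):
                if i == 0:   # leader update, stand-in for equation (7)
                    c2_new = abs(random.gauss(0, 1)) % 1.0
                    c3 = random.random()
                    step = c1_new * ((ub - lb) * c2_new + lb)
                    X[i][d] = food[d] + step if c3 > p else food[d] - step
                else:        # Step 4: follower crossing, stand-in for equation (8)
                    w = random.uniform(-1, 1)
                    X[i][d] = X[i][d] + w * (X[i - 1][d] - X[i][d])
                X[i][d] = min(max(X[i][d], lb), ub)
    return food, food_val                          # Step 6 ends the loop at t = T
```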

Experimental Parameters and Environments.
Benchmark function testing is a popular and common way to indicate the performance of intelligent algorithms. This paper introduces benchmark functions to exhibit the superior performance of the proposed algorithm, and the proposed algorithm is evaluated on classical benchmark functions in this section. To test the ability of the proposed algorithm to solve complex functions of different dimensions, eight low-dimension functions (f_1-f_8) and four variable-dimension functions (f_9-f_12) were chosen for algorithm testing in Table 1. In Table 1, D, scope, and aim represent the function dimension, the search range, and the ideal value.
Low-dimension functions (f_1-f_8) are applied to measure the global search ability of the algorithm. Variable-dimension functions (f_9-f_12) are very difficult to converge to the global optimal solution because of their unevenly distributed local optima. Basic information on the benchmark functions is given in Table 1. The comparison algorithms are simulated annealing (SA) [54], the Lévy flight trajectory-based whale optimization algorithm (LWOA) [55], and the Lévy flight salp swarm algorithm (LSSA) [41]. All algorithm processes and details can be found in the original literature. SA is inspired by analogy to the physical annealing procedure in metals and is a local search algorithm proposed in the early 1980s. The theory of the annealing procedure is to heat solid-state metal to a high temperature, so that the atoms of the metal are in a stochastic condition, and then cool the metal down slowly according to a particular schedule. Starting from random solutions and a fixed initial temperature, SA controls the process through the Metropolis criterion and a group of parameters called the cooling schedule. SA has two initial parameters: the initial temperature T_0 and the decay factor k. For SA, parameter T_0 is set to 100, and parameter k is set to 0.

Data Discussion.
To demonstrate the optimization effect, four indicators were selected to comprehensively evaluate the competitiveness of the different algorithms: the best search value (best), the worst search value (worst), the median (med), and the standard deviation (std). Test results for the fixed two-dimension functions and for the variable-dimension functions at two dimensions are given in Table 2. Test results for the other variable-dimension functions (100D) are shown separately in Table 3. Tables 2 and 3 show that all search values of the proposed algorithm are much closer to the ideal values in Table 1, which demonstrates that PWSSA not only can reach the target but also has strong search abilities. As the dimension of the test function increases, the accuracy of all algorithms declines, but the results of PWSSA are consistently better than those of the other algorithms. The convergence precision and optimization success ratio of the proposed algorithm are better than those of the other algorithms for all test functions. When a set of data changes significantly, the median can be used to illustrate the central tendency of the data; PWSSA has the smallest median of all the test results, indicating outstanding performance compared with the other algorithms. The standard deviation, the arithmetic square root of the variance, measures the dispersion of a dataset: a large standard deviation indicates a large difference between most values and the average, whereas a small standard deviation shows that the values lie close to their average. PWSSA has the smallest standard deviation of all the algorithms, demonstrating that the proposed algorithm offers good stability and produces relatively few poor results.
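The four indicators can be computed directly with the standard library. A small helper is sketched below; the run values are made-up illustrations, and since the paper does not state whether std is the sample or population form, the sample form is assumed here.

```python
import statistics

def summarize(results):
    """Best, worst, median, and standard deviation of a set of run results."""
    return {
        "best": min(results),
        "worst": max(results),
        "med": statistics.median(results),
        "std": statistics.stdev(results),  # sample std; use pstdev for population
    }

runs = [0.012, 0.009, 0.015, 0.011, 0.010]  # illustrative values, not the paper's data
stats = summarize(runs)
```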
In PWSSA, the good solutions of the current iteration are applied by the followers to find better solutions in the next iteration, and the random factors enhance the diversity of solutions in nonlinear high-dimension problems. Hence, the test results show that on f_7, f_8, and f_12, PWSSA achieves the best optimization results on the best, worst, median, and std values.

The Wilcoxon Rank Sum Test Discussion.
The rank sum test is a nonparametric technique used to determine whether a result is statistically significant. Nonparametric statistical tests can be used in mathematical fields to check algorithm performance [56]. The rank sum test arranges all data in order, from small to large, and has strong practicality because it requires no special form of the dispersed data or known distribution. However, the rank sum test ignores absolute value differences in the data, which not only makes the test result approximate but also causes the loss of some test information. Wilcoxon improved the basic rank sum test by considering the directions and sizes of the data. The Wilcoxon rank sum test can be applied to a distribution of data to check for differences among them and offers more effective performance than the basic rank sum test. The Wilcoxon rank sum test produces p values: if the p value is less than 0.05, there is a significant difference at the 0.05 level. To further compare the performance of PWSSA with that of the other algorithms, the Wilcoxon rank sum test was applied in this paper. All p values are given in Table 4, and the results of the proposed algorithm were tested against those of the other algorithms at the 0.05 significance level. For SSA, the p values of functions 1 and 6 are equal to 0.011. For SA, the p value of function 4 is equal to 0.473 and is larger than 0.05. For LWOA, the p values of function 2 and functions 4-6 are larger than 0.05. For LSSA, the p value of function 6 is equal to 0.011. All other results are less than 0.05. The Wilcoxon rank sum test results display that the proposed algorithm has the strongest search efficiency and the best finding mechanism around the best solution, which further proves that PWSSA has excellent search performance.
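As a sketch of how table entries of this kind can be produced, the rank sum p value can be approximated in pure Python using the normal approximation with average ranks for ties. This is an illustrative helper, not the paper's procedure; in practice a library routine such as SciPy's `scipy.stats.ranksums` would be the standard choice.

```python
import math

def wilcoxon_rank_sum_p(a, b):
    """Two-sided Wilcoxon rank sum p value via the normal approximation.
    Ties receive average ranks; very small samples would need exact tables."""
    pooled = sorted([(v, "a") for v in a] + [(v, "b") for v in b])
    rank_of = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j][0] == pooled[i][0]:
            j += 1
        avg = (i + 1 + j) / 2          # average rank of the tied block i..j-1
        for k in range(i, j):
            rank_of[k] = avg
        i = j
    w = sum(r for r, (_, g) in zip(rank_of, pooled) if g == "a")  # rank sum of a
    n1, n2 = len(a), len(b)
    mu = n1 * (n1 + n2 + 1) / 2                       # mean of W under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)   # std of W under H0
    z = (w - mu) / sigma
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided p
```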

Iteration Curves Discussion.
Iteration is the activity of repeating a feedback procedure with the purpose of reaching a desired goal. Each repetition of the whole procedure in an algorithm is called one iteration, and the result of each iteration provides the initial value for the next. To exhibit the convergence speed and global search ability of all algorithms more intuitively, the average convergence curves of all algorithms are shown in Figures 2 and 3. Single logarithmic coordinates are used for a more detailed analysis. Figure 2 exhibits the two-dimensional convergence curves of PWSSA and its competitors. Figure 3 shows the iteration graphs of the algorithms on the variable-dimension functions (100D). Note that all convergence curves discussed in the following subsections are averages of ten independent executions. As the dimension increases, the optimization performance and iteration speed of all algorithms decrease, although the performance degradation of PWSSA is not severe. The proposed algorithm achieves the target value for most functions with the fastest iteration speed and highest efficiency. LSSA has better iteration rates than SSA on most functions but still cannot outperform the proposed algorithm. It is noticeable that PWSSA gives a superior global iteration rate and accuracy in comparison with original SSA, which is easily trapped in local optima. All figures reveal that SSA becomes much poorer as the dimension increases, while the proposed algorithm still offers distinguished search ability, and its convergence speed and precision rank first on all functions. In other words, PWSSA can solve problems in fewer iterations and is more competent than the other algorithms. PWSSA enormously boosts the iteration speed and search ability of basic SSA, mainly because of the introduction of the many-sided learning and local random perturbation strategies between successive follower positions.

Box-Plot Charts Discussion.

Box-plot charts are used to show dispersion information about a set of data. They have the advantages of detecting abnormal values and data skewness and are widely used to distinguish algorithm capabilities in terms of data symmetry and data dispersion. There are six parameters in a typical box-plot chart, namely, the maximum value, the minimum value, the median, the upper quartile, the lower quartile, and the outliers. A set of data can be evaluated using five of these parameters. Figures 4 and 5 show box-plot charts of all algorithms on the different functions.
There are many local optima in high-dimensional functions, so the aggregation degree of solutions is a crucial index for evaluating algorithm performance. If an algorithm becomes trapped around a local extremum, premature convergence can result. PWSSA produces the narrowest box-plot charts and the fewest outliers for all functions. The median and upper/lower quartiles computed by PWSSA are lower than those of the other algorithms, demonstrating that the collaborative random search strengthens individual diversity and avoids premature convergence. The proposed algorithm tends to obtain the best precision on most functions as the dimension increases, mainly due to the random generation of follower positions, whereas SA and LWOA show the worst performance. SA has the largest variance of all the algorithms. All box-plot charts demonstrate that the proposed algorithm is robust and stable in comparison with the other algorithms, and the figures show that PWSSA can avoid local extrema.
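The five summary parameters behind each box can be reproduced with the standard library. This is a sketch: quartile conventions differ slightly between plotting tools, and the sample data here is illustrative.

```python
import statistics

def box_summary(data):
    """The five numbers a box-plot chart is drawn from (outliers aside):
    minimum, lower quartile, median, upper quartile, and maximum."""
    # default 'exclusive' method; plotting tools may use other quartile rules
    q1, med, q3 = statistics.quantiles(data, n=4)
    return {"min": min(data), "q1": q1, "med": med, "q3": q3, "max": max(data)}

s = box_summary([1, 2, 3, 4, 5, 6, 7, 8, 9])
```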

Searching Paths Discussion.
To further discuss the powerful search capability and optimization performance of PWSSA, Figure 6 gives the optimal PWSSA search path, the optimal SSA search path, and the contour plot of each function in the two-dimensional plane. Search path figures can examine whether the algorithm falls into a local optimal solution on complex functions. Through comparison with the traditional SSA algorithm, all search paths of PWSSA are shorter than those of SSA, which demonstrates the efficiency of PWSSA on function problems. PWSSA also applies a finding mechanism that tightens from the neighborhood to the extreme point, owing to average optimality and constrained average optimality. From Figure 6, we can see that the PWSSA search path is much shorter than the SSA search path; SSA has many repeated, invalid short-distance search paths and occasional long-distance search paths. The two sets of search paths show that, compared with basic SSA, PWSSA can explore a wider range and is less affected by iterations, so PWSSA has better general-purpose optimization abilities. PWSSA is also more flexible and can not only completely avoid collisions with obstacles but also provide numerous feasible solutions. The proposed algorithm can balance search speed and accuracy and provide a satisfactory solution wherever possible while meeting variable-demand requirements. All search path results reveal that PWSSA can quickly obtain the best solution and can be used in scenarios with high real-time requirements.

Time Complexity Discussion.

The algorithm complexity is divided into time complexity and space complexity. The time complexity refers to the computational workload required to execute the algorithm, and the space complexity refers to the computer memory required to execute it. In computer science, the time complexity of an algorithm, which is a function, qualitatively describes the running time and is usually expressed by the symbol O(f(n)), where f(n) is a mathematical function such as n, n^2, or log n. In this way, the time complexity can be called the asymptotic time complexity. The time complexity of the proposed algorithm is O(max_it × N × D), where max_it is the number of running times. To comprehensively compare the time complexity of the different algorithms, this paper calculated the running times of all algorithms for the two-dimension functions and selected two times the worst search value among all algorithms as the target value. To comprehensively show the time complexity, this paper selected three indicators for ten independent runs: the maximum number of running times (MAX), the minimum number of running times (MIN), and the average number of running times (AVE). All test results are shown in Table 5. Table 5 shows that the maximum, minimum, and average numbers of running times of the proposed algorithm are smaller than those of the other algorithms, and PWSSA does not easily fall into local optima. In addition, the results show that PWSSA has superior search ability because PWSSA obtains results with better precision than the other algorithms. In summary, the main reason is that PWSSA has an excellent balance between the global and local search phases.
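The O(max_it × N × D) bound can be made concrete by counting the per-coordinate updates of the main loop. This is an illustrative counter, not the timing procedure used for Table 5; the parameter values in the usage line are arbitrary.

```python
def count_updates(max_it, n, dim):
    """Count position updates in an SSA-style triple loop; the total equals
    max_it * n * dim, matching the stated O(max_it x N x D) time complexity."""
    ops = 0
    for _ in range(max_it):           # iterations
        for _ in range(n):            # every salp...
            for _ in range(dim):      # ...updates each coordinate once
                ops += 1
    return ops

total = count_updates(200, 30, 2)     # e.g., 200 iterations, 30 salps, 2 dimensions
```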

Conclusions
Metaheuristic optimization is a significant area, and the most representative computational intelligence algorithms have permeated almost all areas of science and engineering. SSA is a typical metaheuristic algorithm proposed in 2017. Despite SSA's success and popularity, some issues in basic SSA need to be addressed. To overcome the poor real-time stability and low precision of basic SSA on global function optimization problems, this paper has introduced a modified version based on the perturbation weight. PWSSA mainly relies on an asymptotic circular search strategy, which achieves fast local searching and information orientation. PWSSA efficiently moves towards better function values and offers strong real-time performance. The proposed algorithm can effectively escape premature convergence in the early search phase and avoids missing the global optimal solution in the later search phase, thus achieving better search diversity and a greater possibility of finding a better solution. This paper has described the results of tests using eight low-dimension and four variable-dimension functions. In comparison experiments against other algorithms, PWSSA consistently obtained the best solutions and the smallest function values. The proposed algorithm provides high-quality search ability for functions, reflected in the fact that PWSSA achieves more competitive precision than the comparison algorithms. Iteration curves, box-plot charts, and search-path graphics were used to illustrate the effectiveness of the proposed algorithm, and the results show that PWSSA can generally obtain better solutions than previous algorithms.
Through the analysis and comparison of the results using different mathematical methods, the superiority of the proposed algorithm was demonstrated. It has been proved by the no-free-lunch (NFL) theorem [57] that no metaheuristic optimization algorithm is able to solve all optimization problems best; in other words, all metaheuristics perform similarly when averaged over all optimization problems. Besides the methods used in this paper, some of the most representative computational intelligence algorithms could also be applied to these problems, such as monarch butterfly optimization (MBO) [22], the earthworm optimization algorithm (EWA) [24], elephant herding optimization (EHO) [25], and the moth search (MS) algorithm [27]. In future work, we will focus on applying the proposed algorithm to industrial application problems.

Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest
The authors declare that there are no conflicts of interest.