A Novel Memetic Algorithm Based on Multiparent Evolution and Adaptive Local Search for Large-Scale Global Optimization

In many fields, including management, computing, and communication, Large-Scale Global Optimization (LSGO) plays a critical role and has been applied to a variety of applications and domains. At the same time, it is one of the most challenging classes of optimization problems. This paper proposes a novel memetic algorithm (called MPCE & SSALS) based on multiparent evolution and adaptive local search to address LSGO problems. In MPCE & SSALS, a multiparent crossover operation is used for global exploration, while a step-size adaptive local search is utilized for local exploitation. A new offspring is generated by recombining four parents. In the early stage of the algorithm execution, global search and local search are performed alternately, and the population size gradually decreases to 1. In the later stage, only local search is performed for the last remaining individual. Experiments were conducted on the 15 benchmark functions of the CEC′2013 benchmark suite for LSGO. The results were compared with four state-of-the-art algorithms, demonstrating that the proposed MPCE & SSALS algorithm is more effective.


Introduction
Optimization problems exist widely in many fields such as engineering design, economic management, production scheduling [1][2][3][4], wireless communication, and computer science. Some of these problems have so many decision variables that they become Large-Scale Global Optimization (LSGO) problems. Without loss of generality, a large-scale global optimization problem can be formulated as a minimization problem, minimize f(x), x = (x1, x2, . . ., xD), where D ≥ 1000 is the dimension size.
Large-scale global optimization is one of the most challenging classes of optimization problems because the search space grows exponentially with the problem dimensionality. Such a massive increase in problem dimensions usually changes the search properties. Consequently, a function that is unimodal at small scale may become multimodal when the number of dimensions increases [5]. Therefore, researchers have proposed many improved algorithms based on existing classic algorithms. For example, in [6,7], the Particle Swarm Optimization (PSO) algorithm was improved using subswarms to maintain diversity. Also, in [8], an improved PSO algorithm containing two types of learning strategies was provided. Multiple Offspring Sampling (MOS) [9] is a hybrid algorithm that combines a Genetic Algorithm (GA) and two local searches. MLSHADE-SPA [5] is also a hybrid algorithm, based on three Differential Evolution (DE) strategies and a modified Multiple Trajectory Search (MTS) [10] algorithm. IMLSHADE-SPA [11] is an improved MLSHADE-SPA with a novel local search method. SHADE-ILS [12] is an enhanced version of the SHADE algorithm that combines two different local search methods and uses a restart mechanism. The algorithms in [13][14][15][16] are modifications based on Cooperative Coevolution (CC) and Differential Evolution. CBCC-RDG3 [17] is also a modified version of the CC algorithm that adapts the recursive differential grouping method to reduce overlapping problems. TPHA [18] and DECC-RAG1.1 [19] are two-phase hybrid algorithms that use the CC framework.
To encourage research on LSGO, the IEEE Congress on Evolutionary Computation (IEEE CEC) organizes LSGO algorithm competitions annually or biennially. Since 2013, these competitions have been run on the CEC′2013 LSGO benchmark suite [20]. MOS was the winner in the years 2013-2018. Moreover, MOS and 11 other excellent algorithms that did not join the competitions were compared in [21]; again, the MOS algorithm outperformed all the others. In the 2018 competition, however, SHADE-ILS and MLSHADE-SPA were reported to perform better than MOS. Although CBCC-RDG3 [17] was announced as the winner of the 2019 competition, it did not reach the level of the previous winner, SHADE-ILS. It seems that LSGO is still quite a hard nut to crack [21]. The algorithms for LSGO problems can be roughly classified into three categories: standard evolutionary algorithms, CC-based evolutionary algorithms, and memetic algorithms [22]. The memetic algorithm (MA) [23] is a combination of global search and local search. Due to the exploration ability of global search and the exploitation ability of local search, MAs perform well on LSGO problems. As mentioned above, the CEC competition award algorithms (e.g., SHADE-ILS and MLSHADE-SPA) and the improved algorithm IMLSHADE-SPA based on MLSHADE-SPA are all MAs. In MLSHADE-SPA, IMLSHADE-SPA, and SHADE-ILS, global search and local search are given equal status, and the two search methods generate the same number of candidate solutions. However, since the dimension exceeds 1000 for LSGO problems, it is worth examining the case where the numbers of local searches and global searches differ. Furthermore, at each iteration of these algorithms, the local search is only used to improve the current best solution; other members are never improved by it, which may miss potentially excellent individuals.
Moreover, these three algorithms all use Differential Evolution and adopt a variety of improvement strategies, which makes them rather complicated. In this context, it is worthwhile to examine new algorithms and new ideas. This paper proposes a novel memetic algorithm (called MPCE & SSALS) based on multiparent evolution and local search. A multiparent crossover operator is used for global exploration. Furthermore, a step-size adaptive local search algorithm, improved from the MTS algorithm, is proposed for local exploitation. The proposed algorithm is inspired by the Simplified Group Search Optimizer (SGSO) [24] algorithm in how it generates parent vectors, and it adopts a population size reduction strategy. There are three main differences between the proposed algorithm and the above memetic algorithms: (1) the proposed algorithm performs far more local searches than global searches; (2) the proposed algorithm performs the local search for every individual; (3) the proposed algorithm uses a new and simpler global search method.
In the following sections, the details of the MPCE & SSALS algorithm are explained. The algorithm is also compared with four state-of-the-art algorithms, namely, SHADE-ILS, MLSHADE-SPA, CBCC-RDG3, and IMLSHADE-SPA. The main contributions and novelty of this paper can be summarized as follows: (i) proposing a novel memetic algorithm for the LSGO problem; (ii) using multiparent crossover and SGSO to solve the LSGO problem; (iii) proposing an improved local search algorithm that can be an effective option for LSGO; (iv) demonstrating that a local search-dominated hybrid algorithm can effectively solve LSGO problems. The rest of this paper is organized as follows. In Section 2, the work most related to the proposed approach is discussed. Section 3 presents the details of the proposed algorithm. Section 4 reports the numerical experiments of MPCE & SSALS carried out on the CEC′2013 benchmark suite and compares its performance with the four algorithms. Finally, conclusions are drawn, and further research is discussed in Section 5.

Related Works
This section presents the related work needed for understanding the MPCE & SSALS algorithm. Memetic algorithms, multiparent crossover, the simplified group search optimizer, and MTS are described.

Memetic Algorithms.
Moscato [23] first proposed the concept of the memetic algorithm in 1989. The memetic algorithm is a combination of population-based global search and individual-based heuristic local search. It defines an algorithm framework in which different search strategies can be combined to construct different memetic algorithms. For example, Genetic Algorithms, Differential Evolution, Particle Swarm Optimization, and many others can serve as the global search strategy, while Hill Climbing, Simulated Annealing, Tabu Search, and others can serve as the local search strategy. Memetic algorithms have taken many forms, employing a wide variety of combinations of population-based heuristics and individual improvement heuristics [25], such as [26][27][28][29][30][31][32][33]. Algorithms based on GA and Tabu Search were studied in [26,27]. The memetic model of PSO and local search was introduced in [28][29][30]. A Memetic Artificial Bee Colony Algorithm is also reported in [31]. The combination of a backbone-based crossover operator and a multineighborhood simulated annealing procedure was discussed in [32]. In [33], adaptive memetic computing with a GA, DE, and Estimation of Distribution Algorithm synergy was elaborated; it can automatically activate one of the three algorithms to generate offspring.
SHADE with an Iterative Local Search (SHADE-ILS) is a hybrid algorithm that combines a modern DE algorithm, Success-History-based Adaptive DE (SHADE [40]), with two local search methods. In each iteration, SHADE is applied to evolve the population of candidate solutions, and one of the two local search methods is chosen to improve the current best solution found by SHADE. The local search method is selected according to the improvement obtained by each of them in the previous phase. A restart mechanism has been incorporated into the algorithm to explore new regions of the search space when the search stagnates.
MLSHADE-SPA is a memetic framework that includes three DE algorithms for global exploration and a modified version of MTS (MMTS) for local exploitation. The three DE algorithms are success-history-based differential evolution with linear population size reduction and semiparameter adaptation (LSHADE-SPA), enhanced adaptive differential evolution (EADE) [41], and differential evolution with novel mutation and adaptive crossover strategies (ANDE) [42]. The framework also uses a divide-and-conquer method, which randomly divides the dimensions into groups and solves each group separately.
An improved MLSHADE-SPA (IMLSHADE-SPA) framework was proposed in [11], which replaced the local search method (MMTS) with a new local search method and achieved higher performance.
Multiple Offspring Sampling (MOS) [43] is a framework for combining different metaheuristic algorithms. The participation ratio of each algorithm is adjusted dynamically according to a given strategy. Because different algorithms and strategies can be selected, different MOS versions were proposed in [9,44,45]. Our paper focuses on the MOS of [9], the winner of the CEC′2013 competition. In this version, three algorithms are combined: a GA, the Solis and Wets algorithm [46], and the MTS-LS1-Reduced algorithm. These algorithms are executed in sequence, one after the other. The number of candidate solutions to be generated by each algorithm is adjusted dynamically according to the average fitness increment of the newly created individuals [9].

Multiparent Crossover.
Evolutionary algorithms (EAs) have been successfully applied to solve many optimization problems. EAs simulate the natural process of evolution.
There are three basic operators in EAs: crossover (or recombination), mutation, and selection. The classic crossover operator recombines two parents and generates new offspring. The recombination mechanism determines which parts of each parent are inherited by the child and how this is done. Various crossover operators have been proposed for different problems, fitting the multiple possible representations of a chromosome [47]. These crossover operators can be divided into two categories: exchange-based and calculation-based. The first type of operator was generally proposed for binary coding, but it is also suitable for real coding; examples include One-point Crossover, Two-point Crossover, Uniform Crossover, and so on. For instance, Uniform Crossover randomly determines whether the child's ith gene is selected from parent 1 or parent 2. With these crossover mechanisms, each gene in the offspring is copied from one of the parents, so the offspring's chromosome characteristics are directly inherited from the parents without any changes. The second category of crossover operators is generally used for real coding, such as Average Crossover, Parent Centric Crossover, Heuristic Crossover, Simulated Binary Crossover, and so on. In these operators, the value of each offspring gene is calculated numerically from the parents' genes. For example, Average Crossover generates the ith gene of the child by averaging the corresponding alleles of both parents. The first category is more in line with the original concept of gene recombination. In some algorithms, such as DE, the second category of crossover operator is used as a mutation rather than a crossover operator, since it can produce new genes that are different from those of the parents. This paper regards the second type of operator as a hybrid of crossover and mutation operations.
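The distinction between the two categories can be illustrated with a minimal sketch (Python/NumPy is used here purely for illustration; it is not the paper's implementation):

```python
import numpy as np

def uniform_crossover(p1, p2, rng):
    # Exchange-based: every child gene is copied unchanged from one parent.
    mask = rng.random(p1.shape) < 0.5
    return np.where(mask, p1, p2)

def average_crossover(p1, p2):
    # Calculation-based: child genes are computed from the parents' genes,
    # so values that appear in neither parent can be produced.
    return 0.5 * (p1 + p2)

rng = np.random.default_rng(0)
p1 = np.array([1.0, 2.0, 3.0, 4.0])
p2 = np.array([5.0, 6.0, 7.0, 8.0])
child_u = uniform_crossover(p1, p2, rng)  # each gene equals p1[j] or p2[j]
child_a = average_crossover(p1, p2)       # array([3., 4., 5., 6.])
```

Note how every gene of the uniform-crossover child exists in one of the parents, whereas the average-crossover child's genes are new values.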
Multiparent crossover extends the two-parent crossover operators to recombine more than two parents for generating new offspring. Many multiparent crossovers have been successfully applied to solve various optimization problems and found to be better than traditional crossovers, such as scanning crossover and diagonal crossover [48,49], multiparent simplex crossover [50], multiparent sequential constructive crossover [47], and a novel multiparent order crossover [51].

Simplified Group Search Optimizer.
The Group Search Optimizer (GSO) is a swarm intelligence algorithm with superior performance on multimodal problems [52].
GSO is inspired by animal searching behaviors and group living theory [52]. It includes three types of members: producer, scrounger, and ranger. During each iteration, the individual with the best fitness value in the group, acting as the producer, stops and scans the environment to find resources. The scroungers take a random walk towards the producer to join it at the resources. A small number of rangers make random moves to avoid entrapment in local minima.
The Simplified Group Search Optimizer (SGSO) [24] is an improved GSO version. It is more efficient and simpler than the original version, and it shows excellent search performance on large-scale optimization problems. In SGSO, the producer abandons environmental scanning, the scroungers adopt an improved joining strategy that moves towards the best member and other excellent members, and the rangers use a simple search method while the rangers' percentage decreases. The SGSO is described as follows: (1) In a D-dimensional search space, the ith member at the kth iteration has a current position x i,k. (2) Group members are sorted by fitness value in ascending order. The best member x best,k, as the producer, does not move in this iteration. (3) Randomly select 87% of the group members, except the producer, to perform scrounging. The scroungers move to a new position according to

x i,k+1 = x i,k + r 1 ∘ (x best,k − x i,k) + r 2 ∘ (x m−best,k − x i,k), (2)

where r 1 and r 2 are uniform random D-dimensional vectors in the range (0, 1) and x m−best,k is a member randomly chosen from the top 4 in the group (except x best,k). (4) The remaining members are rangers, who take a random step according to

x i,k+1 = x i,k + step · r 3 ∘ f, (3)

where r 3 is a standard normal D-dimensional random vector, step is a constant representing the basic step size, and f is a D-dimensional Boolean random vector indicating which dimensions will change. The probability of change is set to 1.2/D as given in [24]:

f j = 1, if rand(1) < 1.2/D or j = j rand; f j = 0, otherwise,

where j ∈ {1, 2, . . ., D}, rand(1) is a function that produces a uniform random number in the range (0, 1), and j rand is a randomly chosen index in {1, 2, . . ., D}, which ensures that at least one component of f is set to 1.
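The steps above can be sketched as follows (a minimal Python/NumPy sketch under our reading of [24]; bound handling and fitness re-evaluation are omitted):

```python
import numpy as np

def sgso_step(X, fitness, step, rng):
    """One SGSO iteration: the best member (producer) stays put, 87% of
    the rest scrounge towards the best and a random top-4 member, and the
    remaining rangers take a sparse random step. Minimization assumed."""
    NP, D = X.shape
    order = np.argsort(fitness)              # ascending fitness: best first
    X = X[order].copy()
    others = rng.permutation(np.arange(1, NP))
    n_scr = int(round(0.87 * (NP - 1)))
    for i in others[:n_scr]:                 # scroungers
        m_best = X[rng.integers(1, 5)]       # random top-4 member (not best)
        r1, r2 = rng.random(D), rng.random(D)
        X[i] += r1 * (X[0] - X[i]) + r2 * (m_best - X[i])
    for i in others[n_scr:]:                 # rangers
        f = rng.random(D) < 1.2 / D          # Boolean change mask
        f[rng.integers(D)] = True            # at least one dimension moves
        X[i] += step * rng.standard_normal(D) * f
    return X
```

The returned population is sorted, with the unmoved producer in row 0; a full optimizer would re-evaluate fitness and iterate.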

Multiple Trajectory Search.
Multiple Trajectory Search (MTS) was presented for large-scale global optimization problems in [10]. It provides three local search methods, of which MTS-LS1 is the first and most important. MTS-LS1 searches from the first to the last dimension successively. The search range (SR) value is first subtracted from each dimension to see whether the objective function value improves. If it improves, MTS-LS1 proceeds to the next dimension. If it does not, the solution is restored, and 0.5 * SR is added to this dimension to see, again, whether the value improves. If it still does not improve, the solution is restored, and MTS-LS1 continues with the next dimension. SR is initialized to 0.5 * (Upper_Bound − Lower_Bound). If no dimension improves, SR is cut in half. When SR reaches 1E − 15, it is reset to 0.4 * (Upper_Bound − Lower_Bound). MTS-LS1 and its improved versions are used in many algorithms [5,9,12], including the algorithm proposed in this paper.
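One MTS-LS1 pass over a single solution can be sketched as follows (Python used for illustration; evaluation bookkeeping follows our reading of [10]):

```python
import numpy as np

def mts_ls1_pass(x, f, SR, lower, upper):
    """One pass of MTS-LS1 over all dimensions of a single solution.
    Returns the solution, its value, the updated search range SR,
    and whether any dimension yielded an improvement (minimization)."""
    x = x.copy()
    fx = f(x)
    improved = False
    for j in range(len(x)):
        backup = x[j]
        x[j] = backup - SR               # first trial: subtract SR
        fnew = f(x)
        if fnew < fx:
            fx, improved = fnew, True
            continue
        x[j] = backup + 0.5 * SR         # second trial: add 0.5 * SR
        fnew = f(x)
        if fnew < fx:
            fx, improved = fnew, True
        else:
            x[j] = backup                # neither trial helped: restore
    if not improved:
        SR *= 0.5                        # halve SR when the whole pass fails
        if SR < 1e-15:                   # reset when SR becomes too small
            SR = 0.4 * (upper - lower)
    return x, fx, SR, improved
```

SR would be initialized to 0.5 * (upper − lower) and the pass repeated until the evaluation budget is spent.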

Proposed Algorithm
In this section, the Multiparent Crossover Evolution and Step-Size Adaptive Local Search algorithms are described. In addition, the proposed hybrid algorithm that combines both of them is introduced.

Multiparent Crossover Evolution (MPCE).
In MPCE, the population is composed of D-dimensional vectors. The number of vectors is called the population size, denoted NP.
The initial population is generated with uniformly distributed random numbers. Each member of the population produces the next generation through mutation and a multiparent crossover operation.
The ith member of the Gth generation is denoted as x i,G. The main characteristics of MPCE are as follows: (1) The mutation formula is modified from (2) of the SGSO algorithm. The mutant vector is generated according to

v i,G = x i,G + r 1 ∘ (x best,G − x i,G) + r 2 ∘ (x p-best,G − x i,G),

where x best,G is the best vector in the Gth generation, p-best is the index of a vector randomly chosen from the ranked top 10% of vectors in the Gth generation (except x best,G), and r 1 and r 2 are uniform random vectors in the range (0, 1).
(2) MPCE uses a four-parent crossover operation to produce the next generation. The four parents are x i, v i, and two excellent individuals randomly selected from the population. The jth gene of the offspring u i is computed as

u i,j = v i,j, if rand j (i) < CP1; x a(i),j, if CP1 ≤ rand j (i) < CP1 + CP2; x b(i),j, if CP1 + CP2 ≤ rand j (i) < CP1 + CP2 + CP3; x i,j, otherwise,

where j ∈ {1, 2, . . ., D}; a(i) and b(i) ∈ {1, 2, . . ., NP} are the indices of vectors randomly chosen from the ranked top 50% of vectors in the Gth generation; CP1, CP2, CP3 ∈ (0, 1) are the crossover probability constants of v i, x a(i), and x b(i), respectively; and rand j (i) is a uniform random number in the range (0, 1). The parameters CP1, CP2, and CP3 are determined through experiments and set to 0.3, 0.29, and 0.29, respectively.
(3) The population size decreases during the optimization process. As the iterations proceed, the vectors in the population tend to become gradually assimilated, so a larger NP is less helpful for improving the search performance. Many algorithms apply a linear population size decrease strategy, such as the LSHADE algorithm [53]. Moreover, for the MPCE & SSALS algorithm, reducing the population size is conducive to deeper local search. In the beginning, the global search and the local search are performed alternately. Every several iterations, NP is reduced by 1 by dismissing the worst individual in the population. When NP is reduced to 4, the MPCE global search ends, and only the local search is executed to improve the current best solution.
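Generating one offspring under the description above can be sketched as follows (Python for illustration; the rounding of the top-10% and top-50% selection pools is our assumption, not taken from the paper):

```python
import numpy as np

CP1, CP2, CP3 = 0.3, 0.29, 0.29      # crossover probabilities from the paper

def mpce_offspring(X, fitness, i, rng):
    """MPCE mutation plus four-parent crossover for the ith member (sketch).
    Parents: x_i, the mutant v_i, and two members of the ranked top 50%."""
    NP, D = X.shape
    order = np.argsort(fitness)                       # ascending: best first
    best = X[order[0]]
    top10 = order[1:max(2, int(np.ceil(0.1 * NP)))]   # top 10%, excl. best
    p_best = X[rng.choice(top10)]
    r1, r2 = rng.random(D), rng.random(D)
    v = X[i] + r1 * (best - X[i]) + r2 * (p_best - X[i])   # mutant vector
    top50 = order[: max(1, NP // 2)]                  # ranked top 50%
    xa, xb = X[rng.choice(top50)], X[rng.choice(top50)]
    r = rng.random(D)                                 # one draw per gene
    u = np.where(r < CP1, v,
        np.where(r < CP1 + CP2, xa,
        np.where(r < CP1 + CP2 + CP3, xb, X[i])))
    return u
```

With CP1 + CP2 + CP3 = 0.88, each gene has a 12% chance of being copied directly from x i.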

Step-Size Adaptive Local Search (SSALS).
The basic idea of SSALS derives from MTS-LS1, the first local search strategy in the Multiple Trajectory Search (MTS) algorithm.
These algorithms are designed for a single individual but can also be applied to multiple individuals when combined with other algorithms.
Each dimension in the SSALS algorithm has its own basic step size, stored in the vector s. In each iteration, SSALS randomly selects one or more dimensions, multiplies the step size of each selected dimension by a random number, and adds the product to that dimension. If the new solution is better than the original one, the step sizes of the selected dimensions are multiplied by 2; otherwise, the solution is restored, and each step size is multiplied by −0.5. The step size is initialized to 0.5 * (Upper_Bound − Lower_Bound). The variable minbs represents the minimum step size, an adaptive value that is recalculated in each iteration. If the absolute value of a step size reaches minbs, it is restored to the initial value.
For the case of multiple individuals, the key steps of the SSALS algorithm are described as follows.
(1) Choose the dimensions to be searched according to

f j,i = 1, if rand(1) < p(iteration) or j = j rand (i); f j,i = 0, otherwise, (7)

where f j,i ∈ {0, 1} indicates whether the jth dimension of the ith vector is to be changed, rand(1) is a function that produces a uniform random number in the range (0, 1), p(iteration) is a selection probability that decreases with the number of iterations, and j rand (i) ∈ {1, 2, . . ., D} is a randomly chosen index that ensures x i has at least one dimension participating in the search. According to (7), the number of dimensions to be searched decreases rapidly during the iterative process and finally settles at 1.5 per vector on average. This gives the algorithm a bit of global search ability in the early stage of the optimization process. (2) Generate the new solution according to

x′ j,i,G = x j,i,G + rand(1) · s j,i,G, for each dimension j with f j,i = 1,

where s i,G is a vector representing the basic step size of the ith individual in the Gth generation. (3) Calculate the variable minbs. SSALS defines a D×5 matrix H, which stores the last five effective step sizes of each dimension. The effective step size is the change actually applied to a dimension in a move that improved the solution. If more than one vector is improved in an iteration and the same dimension was changed in several of them, the average effective step size of that dimension is saved into H. The formula for calculating minbs is

minbs G = min(0.1, min(mean(H, 2))),

where mean(H, 2) denotes the vector of row means of H.
(4) Update the basic step size: if the new solution is better, the step sizes of the searched dimensions are doubled, s j,i,G+1 = 2 s j,i,G; otherwise, they are reversed and halved, s j,i,G+1 = −0.5 s j,i,G. Whenever |s j,i,G| reaches minbs, it is reset to the initial value 0.5 * (Upper_Bound − Lower_Bound).
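One SSALS iteration for a single individual can be sketched as follows (Python for illustration; p_dim stands for the dimension-selection probability of (7), whose decay schedule is managed by the caller):

```python
import numpy as np

def ssals_step(x, fx, s, f_obj, minbs, init_bs, p_dim, rng):
    """One SSALS iteration for one individual: perturb selected dimensions
    by random fractions of their step sizes, then adapt the step sizes."""
    D = len(x)
    mask = rng.random(D) < p_dim
    mask[rng.integers(D)] = True          # at least one dimension searched
    trial = x.copy()
    trial[mask] += rng.random(int(mask.sum())) * s[mask]
    ft = f_obj(trial)
    if ft < fx:
        x, fx = trial, ft                 # improved: keep, double step sizes
        s[mask] *= 2.0
    else:
        s[mask] *= -0.5                   # failed: restore, reverse and halve
    s[np.abs(s) < minbs] = init_bs        # too small: reset to initial value
    return x, fx, s
```

Because failed trials are discarded, the returned objective value never worsens; the sign flip on failure makes the next trial probe the opposite direction with a smaller step.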

Experimentation
A set of 15 benchmark functions proposed in the CEC 2013 special session on large-scale global optimization was used to study the performance of MPCE & SSALS. These functions are divided into four categories according to their degree of separability: f1-f3 are fully separable functions, f4-f11 are partially separable functions, f12-f14 are overlapping functions, and f15 is a fully nonseparable function. A detailed description of each of these benchmark functions is given in [20]. MPCE & SSALS was run 25 times for each benchmark function. All tests were completed using MATLAB R2019a. The dimension D of all functions is 1000, except for f13 and f14, whose dimension is 905. The stopping criterion was a fixed number of fitness evaluations (FEs).
The Max_NFE was set to 3.0E + 6, and the program terminates when Max_NFE is reached. The initial value of NP was set to 100, with CP1 = 0.3, CP2 = 0.29, CP3 = 0.29, I_NPD = 100, and I_GS = 40. The statistical results, including the best, the worst, the median, the mean, and the standard deviation computed over 25 runs, are shown in Table 1.

Influence of the Different Components.
In this section, experiments were conducted to observe the influence of the different components. For each test, Table 2 lists the average results of 25 independent runs. The Wilcoxon signed-rank test with a significance level of 5% was used for statistical analysis. The symbols ">", "<", and "=" mean "significantly better," "significantly worse," and "no significant difference," respectively. The last row of Table 2 shows the win/tie/loss (w/t/l) counts in the pairwise comparisons.
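The pairwise comparison procedure can be reproduced with a short script (the per-run error values below are hypothetical; SciPy's wilcoxon implements the signed-rank test used here):

```python
import numpy as np
from scipy.stats import wilcoxon

def compare(errors_a, errors_b, alpha=0.05):
    """Return '>', '<', or '=' for algorithm A versus algorithm B on one
    function, based on the Wilcoxon signed-rank test over paired runs
    (minimization: smaller errors are better)."""
    _, p = wilcoxon(errors_a, errors_b)
    if p >= alpha:
        return "="
    return ">" if np.median(errors_a) < np.median(errors_b) else "<"

rng = np.random.default_rng(1)
a = rng.normal(1.0, 0.1, 25)   # hypothetical errors of algorithm A, 25 runs
b = rng.normal(2.0, 0.1, 25)   # hypothetical errors of algorithm B
print(compare(a, b))           # A's errors are clearly smaller: prints ">"
```

Counting ">", "=", and "<" over the 15 functions yields the w/t/l row of Table 2.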
To observe the individual effects of Multiparent Crossover Evolution and Step-Size Adaptive Local Search, experiments were executed on the two algorithms separately. As shown in Table 2, the optimization performance of SSALS is significantly better than that of MPCE on most of the functions, indicating that local searches contribute more to the hybrid algorithm. The influence of the number of parents was also studied. In the proposed algorithm, CP2 and CP3 are the crossover probability constants of parent 2 and parent 3, respectively; CP2 = 0 indicates that parent 2 does not participate in the evolutionary operation, and likewise for CP3. In one test, CP2 was set to 0, meaning that a three-parent crossover operation was used. In another test, CP2 and CP3 were both set to 0, meaning that a two-parent crossover operation was applied. The other settings are the same as in Table 1. According to the results, increasing the number of parents affects f5, f6, f9, and f10, but makes no significant difference on the other functions. In general, it is beneficial and harmless.
In addition, to verify the improvement of SSALS over MTS-LS1, the same experiments were performed under identical conditions with SSALS replaced by MTS-LS1. In this comparison, SSALS is significantly better than MTS-LS1 on all 15 functions, demonstrating that its optimization performance is significantly higher.

Parameter Analysis.
The major parameters of MPCE & SSALS are I_NPD and I_GS. Every I_NPD iterations, the population size (NP) is reduced by 1. When NP is reduced to 4, the population-based search ends, and only the individual-based search is performed to improve the best solution. Therefore, a smaller I_NPD means fewer population searches and more single-individual searches. I_GS indicates the number of iterations between two global searches, so a smaller I_GS represents more global searches and fewer local searches. Different values of I_NPD and I_GS are studied in this section. MPCE & SSALS was run 25 times for each combination, and the Wilcoxon signed-rank test was used for statistical analysis.
To find appropriate I_NPD and I_GS values, three CEC′2013 benchmark functions, f3, f7, and f15, were studied in tests with the I_NPD value varying from 50 to 500 and the I_GS value varying from 20 to 200, respectively. The other parameters use the same settings as in Table 1. Algorithm performances obtained with different I_NPD and I_GS values on f3, f7, and f15 are shown in Figures 1(a), 1(b), and 1(c) and Figures 2(a), 2(b), and 2(c), respectively. The horizontal axis represents the respective parameter settings, while the vertical axis shows the logarithm of the obtained mean values. The statistical comparisons are summarized in Table 3, where results shown in bold indicate the finally selected parameters. When fixing the I_GS value at 40, I_NPD = 100 significantly outperforms I_NPD = 50 in 5 functions and is outperformed by it in 1 function, while the other 9 functions show no significant difference. I_NPD = 100 significantly outperforms I_NPD = 200 in 7 functions and is outperformed by it in 2 functions, which indicates that the best overall optimization performance is obtained with I_NPD = 100. When I_NPD is fixed at 100, I_GS = 40 significantly outperforms I_GS = 20 and I_GS = 80 in 2 and 3 functions, respectively, with no significant difference in the other functions, which means that the best optimization performance is obtained with I_GS = 40. As shown in Table 3, MPCE & SSALS with I_NPD = 100 and I_GS = 40 significantly outperforms the other parameter settings. It is also observed that the best parameter values may differ among test functions; for example, the best parameters for f3 and f6 are I_NPD = 200 and I_GS = 40. This suggests that setting the parameter values to 100 and 40 is a good general choice, but for a specific problem, better parameter values can be determined by experiments.
In addition, when I_NPD = 100 and I_GS = 40, the FEs consumed by the global search and the local search are 12,524 and 2,987,476, respectively, which indicates that the proposed algorithm is mainly based on local search.
To compare the algorithms' performances on the CEC′2013 function suite, the average ranking of each algorithm was calculated. For a fair comparison, the experimental data of the other four algorithms are taken directly from their original papers and supplementary material. The Wilcoxon signed-rank test (significance level = 0.05) is utilized for pairwise comparisons of the five algorithms, and the place of each algorithm on each function is calculated according to these tests. The comparison results and rankings are listed in Table 4, where the best results for each benchmark function are distinguished by bold font. As shown in Table 4, MPCE & SSALS has the best performance among these algorithms, with SHADE-ILS and IMLSHADE-SPA ranking second and third, respectively. From the convergence curves, the results can be summarized as follows:
The convergence rate of MPCE & SSALS is, in general, faster than that of MLSHADE-SPA and IMLSHADE-SPA, and similar to that of CBCC-RDG3 and SHADE-ILS. However, MPCE & SSALS is the simplest of these algorithms.

Results Discussion.
The excellent results of MPCE & SSALS mainly benefit from the following factors: (1) The multiparent strategy used in this paper enables each offspring to inherit genes from multiple excellent individuals. It not only increases offspring diversity but also moves the algorithm more quickly towards better solutions. (2) SSALS effectively improves MTS-LS1, which enhances the local search performance significantly. Each dimension has its own basic step size that can be adjusted to accommodate the different effect of each dimension on the function. In addition, the minimum step size affects the search accuracy. If a large number of high-precision searches (with very small step sizes) are carried out in the early stage of the algorithm, computation is wasted and the search easily falls into local minima. Gradually improving the search accuracy according to the current search results avoids excessive searching in the early stage; similarly, in the late stage, the most promising positions can be searched with high precision. (3) More local searches are performed in the algorithm: the proposed algorithm performs far more local searches than global searches.
This enhances the exploitation capability of the algorithm in the search space.
(4) A population size reduction strategy is used in MPCE & SSALS. At the beginning of the algorithm, a large population size is conducive to improving the exploration ability. As the optimization proceeds, individual differences diminish, and the advantages of a large population are also reduced.
Gradually reducing the population size is helpful to enhance the exploitation ability.
(5) The memetic algorithm framework is used to combine the multiparent strategy, SGSO, and SSALS so that they work together. The memetic algorithm framework balances the exploration ability of the global search and the exploitation ability of the local search; thus, it has been widely used for LSGO problems. The SGSO algorithm also performs well on LSGO problems.

Conclusions
In this paper, a memetic algorithm, MPCE & SSALS, based on multiparent crossover evolution and step-size adaptive local search is proposed for the LSGO problem. The MPCE strategy is used for global exploration, and the SSALS method is applied for local exploitation. In the early stage of the algorithm execution, the global search and the local search are performed alternately, and the population size is gradually reduced to 1. In the later stage, only the local search is executed to improve the final solution. Local search is performed during the whole process, and it is executed far more times than the global search. A set of 15 benchmark functions was used to evaluate the performance of the MPCE & SSALS algorithm. According to the experimental data, the overall performance of the MPCE & SSALS algorithm is better than that of the other four state-of-the-art algorithms.
The experimental results also indicate that the performance of SSALS is significantly higher than that of MTS-LS1 and that a local search-dominated hybrid algorithm can effectively solve the LSGO problem.
On the other hand, the experimental analysis reveals that the multiparent crossover strategy only improves the optimization results on certain test functions while having no discernible impact on others. Among the four parents in the crossover operation, three individuals are selected from the previous generation of the population, so the sources of the parents are relatively limited, and the advantages of multiparenting are not fully exploited. In the future, it is possible to add new parent generation methods, such as using PSO to generate one of the parents. This paper demonstrated that multiparent crossover evolution combined with local search is an effective algorithmic framework for addressing the LSGO problem. A possible extension of this paper is to examine new parent generation techniques or local search strategies that improve the algorithm's performance.
Data Availability
The source code and experimental data of MPCE & SSALS can be requested from yydzhwf@xnu.edu.cn.

Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.