A Hybrid Cellular Genetic Algorithm for the Traveling Salesman Problem

The traveling salesman problem (TSP), a typical NP-hard problem, arises in many engineering applications. Genetic algorithms are useful for NP-hard problems, especially the TSP. However, the genetic algorithm has some drawbacks when solving the TSP, including premature convergence to local optima and insufficient optimization precision. To address the TSP effectively, this paper proposes a hybrid Cellular Genetic Algorithm with a Simulated Annealing (SA) algorithm (SCGA). Firstly, SCGA is an improved Genetic Algorithm (GA) based on Cellular Automata (CA); the selection operation in SCGA is performed according to the state of the cell. Secondly, SCGA, combined with SA, introduces an elitist strategy to improve the speed of convergence. Finally, the proposed algorithm is tested on 13 standard benchmark instances from TSPLIB to evaluate the performance of three cellular automata rules. The experimental results show that, in most instances, the results obtained by SCGA using rule 2 are better and more stable than those obtained using rule 1 and rule 3. We also compared the experimental results with GA, SA, and the Cellular Genetic Algorithm (CGA) to verify the performance of SCGA. The comparison shows that the distance obtained by the proposed algorithm is shorter by a mean of 7% than that of the other three algorithms, is closer to the theoretical optimal value, and the algorithm is robust.


Introduction
The traveling salesman problem (TSP) is one of the most common combinatorial optimization problems and has been studied widely. At the same time, many problems in the field of combinatorial optimization can be formulated as special TSP instances [1]. For example, shortest-path resource optimization and shop scheduling problems are among the most frequent TSP applications [2][3][4]. In Word Sense Disambiguation (WSD), problems can also be solved by describing WSD as a TSP variant [5]. In the Global Positioning System (GPS), TSP can serve as the basis for route-finding problems [6]. The applications of TSP are thus extensive; however, TSP is an NP-hard problem, and the number of feasible solutions grows rapidly with the number of nodes, which makes it very difficult to solve [7]. Solving TSP with an exact algorithm takes a long time, so the feasibility of exact algorithms for TSP is very low. On the other hand, although an approximate method cannot guarantee an optimal solution, it can reach a satisfactory solution in a very short time [8]. Thus, with the development of heuristic algorithms [9][10][11][12][13][14], many experts and scholars have begun to apply heuristic algorithms to TSP, which provides new ideas for its solution [15][16][17][18][19][20][21].
Genetic algorithm (GA) is an evolutionary algorithm proposed by Professor John H. Holland of the University of Michigan and his students in the late 1960s and early 1970s [22]. Drawing on biology, GA uses the principles of biological evolution and genetics to imitate the generation and evolution of life and intelligence. It is a bionic algorithm in the macroscopic sense and has been applied in many optimization fields [23][24][25][26].
The coding method commonly used in genetic algorithms is binary coding, but with binary coding most sequences in the search space do not correspond to feasible tours. Using ordinal coding together with partially mapped crossover (PMX) as the crossover operator ensures that all solutions obtained are feasible [27]. However, traditional GA has some disadvantages for TSP, such as poor search ability and low convergence accuracy. On this basis, improved GAs have also developed vigorously. Liu et al. [28] proposed an improved GA with decision cut-off algebra to solve TSP, which improves the convergence of the GA. Besides, Yu et al. [29] proposed an improved GA for TSP based on a greedy algorithm, which uses the greedy algorithm to generate the initial population and reduce the number of iterations. At the same time, hybrid algorithms have received the attention of many experts and scholars. For instance, Wang et al. [30] combined GA and SA and proposed an improved simulated annealing genetic algorithm for TSP, improving its local search capabilities. Also, Zhang et al. [31] combined GA and the Firefly algorithm, proposing a Firefly GA to solve TSP and prevent the algorithm from falling into local optima. Moreover, Tao et al. [32] combined GA and the Ant Colony (AC) algorithm, proposing a dynamic Ant Colony genetic algorithm for TSP, which improves the reinsertion of offspring to improve stability.
Cellular genetic algorithm (CGA) is a hybrid of cellular automata and genetic algorithms. It has been applied to various decision-making problems [33]. However, the application of the CGA to TSP is very limited at present. To study the application of the CGA to TSP and improve GA's optimization performance, this paper proposes a hybrid Cellular Genetic Algorithm with SA (SCGA), which improves TSP optimization performance by combining the cellular genetic algorithm and the simulated annealing algorithm. The contributions of this paper are as follows:
(i) The paper successfully applies the cellular genetic algorithm to solve TSP and proposes a hybrid Cellular Genetic Algorithm combined with a simulated annealing algorithm. The experimental results show that the hybrid cellular genetic algorithm's optimization performance is better than that of GA, SA, and CGA.
(ii) The optimization performance of three cellular automata rules in SCGA is examined through experiments, and it is concluded that cellular automata rules with a moderate total number of living cells are beneficial to optimization.
The rest of this article is arranged as follows. Section 2 briefly introduces the preliminary knowledge, including the traveling salesman problem model and the basics of the genetic algorithm (GA), the simulated annealing (SA) algorithm, and cellular automata (CA). Section 3 explains the proposed algorithm and its steps in detail. Section 4 presents the simulation results and discussion. Section 5 summarizes the paper and outlines future work.

Traveling Salesman Problem (TSP)
TSP is a commonly used test benchmark for optimization methods and can be described as follows.
Given the coordinates of n cities, starting from any city, passing through each of the remaining cities exactly once, and finally returning to the starting city, there are n! possible routes, and we must choose the shortest one. In the symmetric Euclidean TSP, the distance from city A to city B equals the distance from city B to city A. We use the city number as the city mark. The solution (1, 2, 3, . . . , n) represents a trip starting from city 1, passing through city 2, city 3, . . . , city n, and finally returning to city 1. (N_1, N_2, . . . , N_n) is a permutation of (1, 2, 3, . . . , n), and d(N_i, N_{i+1}) denotes the Euclidean distance between city N_i and city N_{i+1}. According to this description, the following mathematical model can be established [29]:

d(N_i, N_j) = sqrt((x_{N_i} - x_{N_j})^2 + (y_{N_i} - y_{N_j})^2),  (1)

min f(R) = sum_{i=1}^{n-1} d(N_i, N_{i+1}) + d(N_n, N_1),  (2)

where R = (N_1, N_2, . . . , N_n) is a route and f(R) is its total distance.
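As an illustration of how the total route distance is evaluated from the city coordinates, the following Python sketch computes the length of a closed tour (the paper's own implementation is in MATLAB; the function and variable names here are our own):

```python
import math

def tour_length(route, xs, ys):
    """Total Euclidean length of a closed tour.

    route: a permutation of 0-based city indices, e.g. [0, 2, 1, 3];
    xs, ys: city coordinates. The final leg back to the start city
    (the d(N_n, N_1) term) is handled by the index wrap-around.
    """
    total = 0.0
    n = len(route)
    for i in range(n):
        a, b = route[i], route[(i + 1) % n]  # wrap back to the start city
        total += math.hypot(xs[a] - xs[b], ys[a] - ys[b])
    return total

# Four cities at the corners of a unit square, visited in order:
print(tour_length([0, 1, 2, 3], [0, 1, 1, 0], [0, 0, 1, 1]))  # 4.0
```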

Genetic Algorithm (GA).
Genetic algorithm is one of the universal algorithms in the optimization field. The essence of GA is an efficient, parallel, global search method. It can automatically acquire and accumulate knowledge about the search space during the search and adaptively control the search process to find the optimal solution [27,34]. Traditional GA is mainly composed of a population and population size, a coding method, a selection strategy, genetic operators, and a stopping criterion [35][36][37]. The population is composed of chromosomes, or individuals, and each individual corresponds to a feasible solution of the problem. The population size is the number of individuals in the population, which is a subset of the problem's feasible region. Generally speaking, the more individuals the population contains, the more feasible solutions it covers and the better the optimal solution will be. However, as the number of individuals increases, the algorithm's computation time also increases, which affects its performance. Therefore, an appropriate population size must be selected. The coding method is also called the gene expression method. The chromosomes in the population are composed of genes, and correctly coding the chromosomes is basic and important work in GA. According to the mathematical model of TSP, we choose ordinal encoding, so each chromosome is a nonrepeating ordinal sequence (1, 2, 3, . . . , n). The selection strategy is the process of selecting parents from the current population; we use the commonly used roulette-wheel operator. The pseudocode of the roulette operator is shown in Algorithm 1, and its steps are as follows: (i) Step 1. Calculate the fitness f_k of each individual. (ii) Step 2. Calculate each individual's selection probability p_k from its fitness. (iii) Step 3. Calculate the cumulative probability of each individual, P_k = p_1 + p_2 + . . . + p_k. (iv) Step 4. Randomly generate a number rand in [0, 1]; if P_{i-1} < rand < P_i, select the i-th individual.
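The roulette steps above can be sketched in Python as follows (a minimal illustration with our own names; since TSP is a minimization problem, fitness is taken here as the reciprocal of the tour distance):

```python
import random
import bisect

def roulette_select(distances, rng=random):
    """Pick one individual index by roulette-wheel selection.

    Shorter tours get a larger slice of the wheel (Steps 1-2), the
    slices are accumulated into cumulative probabilities (Step 3), and
    a uniform random number picks the slot (Step 4).
    """
    fitness = [1.0 / d for d in distances]
    total = sum(fitness)
    cum, acc = [], 0.0
    for f in fitness:
        acc += f / total
        cum.append(acc)           # P_1 <= P_2 <= ... <= P_n = 1
    r = rng.random()
    # first index i with P_{i-1} < r <= P_i; min() guards fp round-off
    return min(bisect.bisect_left(cum, r), len(cum) - 1)

rng = random.Random(0)
picks = [roulette_select([10.0, 20.0, 40.0], rng) for _ in range(10000)]
# The shortest tour (index 0) should be picked most often.
print(picks.count(0) > picks.count(1) > picks.count(2))  # True
```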
This selection is repeated NP times to obtain NP selected individuals. The genetic operators mainly include crossover and mutation, which are the most important parts of GA. We choose partially mapped crossover (PMX) and reverse order mutation to avoid producing infeasible solutions during execution. Reverse order mutation reverses the order of the numbers between two cut points; therefore, the numbers themselves do not change, and no duplicate numbers appear. Figure 1 is a schematic diagram of PMX, and the steps of PMX are as follows: Step 1. Randomly generate two cut points.
Step 2. Exchange the part between two individual tangent points.
Step 3. Determine the mapping relationship.
Step 4. Restore the legitimacy of the individual according to the mapping relationship. The stopping criterion is generally the maximum number of iterations: the algorithm stops when the maximum number of iterations is reached. Algorithm 2 shows the pseudocode of the GA for solving TSP.
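Steps 1-4 of PMX and the reverse order mutation can be sketched as follows (an illustrative Python version; the cut points in the demo are chosen by hand rather than randomly):

```python
def pmx(p1, p2, cut1, cut2):
    """Partially mapped crossover (Steps 1-4 above).

    The segment p2[cut1:cut2] is copied into the child (Step 2), the
    exchanged genes define a mapping (Step 3), and genes outside the
    segment are repaired through that mapping so the result stays a
    valid permutation (Step 4).
    """
    n = len(p1)
    child = [None] * n
    child[cut1:cut2] = p2[cut1:cut2]
    mapping = {p2[i]: p1[i] for i in range(cut1, cut2)}
    for i in list(range(cut1)) + list(range(cut2, n)):
        gene = p1[i]
        while gene in child[cut1:cut2]:  # follow the mapping chain
            gene = mapping[gene]
        child[i] = gene
    return child

def reverse_mutation(route, cut1, cut2):
    """Reverse order mutation: reverse the genes between two cut points."""
    return route[:cut1] + route[cut1:cut2][::-1] + route[cut2:]

p1 = [1, 2, 3, 4, 5, 6, 7, 8]
p2 = [3, 7, 5, 1, 6, 8, 2, 4]
print(pmx(p1, p2, 3, 6))                        # [4, 2, 3, 1, 6, 8, 7, 5]
print(reverse_mutation([1, 2, 3, 4, 5], 1, 4))  # [1, 4, 3, 2, 5]
```

Note that the repaired child is always a permutation of the parent genes, which is exactly why PMX keeps every solution feasible for ordinal-coded TSP.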

Simulated Annealing (SA) Algorithm.
In 1953, Metropolis et al. [38] first proposed the simulated annealing (SA) algorithm. In 1983, Kirkpatrick et al. [39] successfully applied SA to combinatorial optimization problems, especially large-scale ones. SA is a general random search algorithm and an extension of local search, derived from the simulation of the annealing process in thermodynamics [40]. Because modern SA can effectively solve NP-hard problems, avoid falling into local optima, and overcome dependence on the initial value, it has been widely used in engineering, production scheduling, control engineering, machine learning, and other fields [41][42][43]. SA mainly includes state expression, neighborhood definition, thermal equilibrium, and cooling control [39]. State expression describes the system's energy state in mathematical form, and one state corresponds to one solution of the problem. State expression is the main work of SA and directly determines the neighborhood structure and size; a reasonable state expression can reduce computational complexity and improve performance. Consistent with the genetic algorithm, nonrepeating ordinal sequences are used to represent solutions. The starting point of the neighborhood definition is to ensure that the neighborhood's solutions are distributed as widely as possible over the entire solution space; the nature of the problem usually determines the definition. Consistent with the mutation operator in GA, we use reverse order to form neighbor solutions. Whether a neighbor solution replaces the current solution is determined by the Metropolis criterion; Algorithm 3 shows its pseudocode. Thermal equilibrium is equivalent to the isothermal process in physical annealing and is the inner cycle of SA. To ensure the equilibrium state, the number of inner cycles should be large enough, which increases computation time.
The number of inner cycles is the length of the Markov chain, L. Cooling control is the outer cycle of SA. Unlike GA, SA uses the temperature drop to control the iteration of the algorithm.
The algorithm stops when the temperature drops to the termination temperature. The pseudocode of SA for solving TSP is shown in Algorithm 4, and the steps of SA for solving TSP are as follows: Step 1. Set the initial parameters (initial temperature, Markov chain length, cooling rate, and termination temperature), and randomly generate an initial solution.
Step 2. Use the reverse order method to generate a neighbor solution.
Step 3. Accept or reject the neighbor solution according to the Metropolis criterion.
Step 4. Judge whether the thermal equilibrium is reached; if the thermal equilibrium is not reached, go to Step 2.
Step 5. Lower the temperature. If the temperature reaches the end temperature, the algorithm will stop; otherwise, go to Step 2.
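Steps 1-5 above can be put together in a short, self-contained sketch (Python, with our own names; to keep the example compact, the objective is a toy permutation cost rather than a TSP tour, but the structure is the same):

```python
import math
import random

def metropolis(f_old, f_new, T, rng=random):
    """Metropolis criterion: always accept an improvement; accept a
    worse neighbor with probability exp(-(f_new - f_old) / T)."""
    return f_new < f_old or rng.random() < math.exp(-(f_new - f_old) / T)

def reverse_neighbor(route, rng):
    """Step 2: reverse the segment between two random cut points."""
    i, j = sorted(rng.sample(range(len(route)), 2))
    return route[:i] + route[i:j + 1][::-1] + route[j + 1:]

def simulated_annealing(f, neighbor, s0, T0=100.0, Tt=1e-3, R=0.9, L=500,
                        rng=random):
    """Steps 1-5: an inner Markov chain of length L at each temperature
    (thermal equilibrium, Step 4), then geometric cooling T <- T * R
    until the termination temperature is reached (Step 5)."""
    s = best = s0
    T = T0
    while T > Tt:
        for _ in range(L):
            s1 = neighbor(s, rng)                # Step 2
            if metropolis(f(s), f(s1), T, rng):  # Step 3
                s = s1
                if f(s) < f(best):
                    best = s
        T *= R
    return best

# Toy objective: how far each value sits from its sorted position.
rng = random.Random(1)
start = list(range(8))
rng.shuffle(start)
cost = lambda r: sum(abs(v - i) for i, v in enumerate(r))
best = simulated_annealing(cost, reverse_neighbor, start, rng=rng)
print(sorted(best) == list(range(8)))  # True: still a permutation
print(cost(best) <= cost(start))       # True: never worse than the start
```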

Cellular Automata (CA).
In 1948, Stanislaw M. Ulam and John von Neumann proposed cellular automata (CA). CA is a simplified mathematical model for describing complex phenomena in nature [44]. CA reflects the law of self-replication of life through relatively simple rules, and realizing self-replication of life on the computer brought a new revolution to the fields of computing and simulation. As a discrete model of complex systems, CA is an important experimental method for studying dynamic interaction and spatiotemporal evolution [45][46][47]. CA is composed of four parts: the cells, the cellular space, the cellular neighbor type, and the cellular automata rule [44]. The choice of neighbor type and cellular automata rule is crucial to CA's performance. Here, the Moore neighborhood is used. The Moore neighborhood consists of a central cell (the one to be updated) and its eight geographical neighbors (north, west, south, east, north-east, north-west, south-west, and south-east), a total of nine cells [48]. Each central cell can be alive (s_t = 1) or dead (s_t = 0). Enumerating all permutations and combinations of the neighbors' states would generate many unnecessary duplicates; therefore, there is no need to enumerate the states and positions of specific neighbors, and it suffices to calculate the total number of living cells among the neighbors [49]. The sum of the neighbors' states is denoted K, so K can take any value from 0 to 8. The cellular automata rules [50] update the central cell's state from its current state and K.
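Computing the neighbor-state sum K for a Moore neighborhood can be sketched as follows (Python; the paper does not spell out its boundary handling, so a toroidal wrap-around is assumed here):

```python
def neighbor_sum(grid, r, c):
    """Sum K of the states of the 8 Moore neighbors of cell (r, c).
    K ranges from 0 to 8. Edges wrap around (a toroidal boundary,
    which is our assumption, not stated in the paper)."""
    n, m = len(grid), len(grid[0])
    K = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue  # the central cell itself is not counted
            K += grid[(r + dr) % n][(c + dc) % m]
    return K

grid = [[0, 1, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(neighbor_sum(grid, 1, 1))  # 2: the centre has two live neighbors
```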

Roulette operator
Begin
  Input: population, selection probability P_c
  Calculate NP, f_k, P_k (k = 1, 2, . . . , n)
  for i = 1 to NP
    if rand < P_i and rand > P_{i-1}
      select the i-th individual
    End if
  End for
  Output: the selected individuals
End
Algorithm 1: The pseudocode of the roulette operator.
Genetic algorithm (GA) for solving TSP
Begin
  Input: city coordinates X = (x_1, x_2, . . . , x_n), Y = (y_1, y_2, . . . , y_n); selection probability P_s, crossover probability P_c, mutation probability P_m, maximum number of iterations Maxiter
  Initialization: use ordinal coding to generate the initial population; number of iterations k = 1
  while k ≤ Maxiter
    Selection
    Crossover
    Mutation
    Form the new population
    Save the optimal result of the iteration
    k = k + 1
  End while
  Output: the optimal result R = (N_1, N_2, . . . , N_n), f(R)
End
Algorithm 2: The pseudocode of the genetic algorithm (GA) for solving TSP.

Metropolis criterion
Begin
  Input: current solution S, neighbor solution S_1, current temperature T, city coordinates X = (x_1, x_2, . . . , x_n), Y = (y_1, y_2, . . . , y_n)
  Calculate f(S) and f(S_1) by Equation (2)
  if f(S_1) < f(S)
    Accept S_1
  else if rand < exp(-(f(S_1) - f(S))/T)
    Accept S_1
  else
    S_1 = S
  End if
  Output: S_1
End
Algorithm 3: The pseudocode of the Metropolis criterion.

Using different cellular automata rules, the total number of living cells in the cellular space differs, and the optimization performance is affected accordingly.

A Hybrid Cellular Genetic Algorithm (SCGA)
This section describes the main contents of SCGA in detail, including the dynamic evolution of the cellular space, the genetic operations, the elitist strategy, and the steps of SCGA.

Dynamic Evolution of Cellular Space.
The state matrix of the cellular space is composed of the states of all cells in the cellular space. In each iteration, every cell is taken in turn as the central cell, and the state of the central cell is updated using the cellular automata rules to form a new state matrix. The pseudocode of the dynamic evolution of the cellular space is shown in Algorithm 5. Given an arbitrary 10 × 10 cellular space, each cell state is randomly generated to form an initial state matrix (an n-th order square matrix N composed of 0 and 1), as shown in Figure 2. The cellular automata rule determines each cell's updated state. For example, suppose the cell in the third row and third column is the central cell; its state is 1, and four of its neighboring cells have state 1. According to rule 1, the updated state of the central cell will be 0, while according to rule 2 or rule 3 it will be 1. The updated cellular state thus changes with the chosen cellular automata rule. Because dead cells do not perform genetic operations, we only consider the number of living cells. Figure 3 shows that the total number of living cells differs across rules: if rule 1 is chosen, the total number of living cells eventually remains at about 10%; if rule 2 is chosen, it eventually remains at about 38%; and if rule 3 is chosen, it eventually remains at about 50%. The total number of living cells in the cellular space affects the optimization performance of the algorithm. For the TSP, we compare through experiments the optimal solutions obtained by applying these three rules in the hybrid cellular genetic algorithm.
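One synchronous update of the state matrix can be sketched as follows. The paper's concrete rules 1-3 are not restated here, so a Conway-style rule is used purely as an illustrative stand-in, and the toroidal boundary is our assumption:

```python
def evolve(grid, rule):
    """One synchronous update of the whole state matrix: every cell is
    taken as the central cell, the Moore-neighbor sum K is computed
    (toroidal wrap assumed), and rule(state, K) gives the new state."""
    n, m = len(grid), len(grid[0])
    def K(r, c):
        return sum(grid[(r + dr) % n][(c + dc) % m]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))
    return [[rule(grid[r][c], K(r, c)) for c in range(m)] for r in range(n)]

# Illustrative rule only (NOT one of the paper's rules 1-3): a live cell
# survives with K in {2, 3}; a dead cell becomes alive with K == 3.
conway = lambda s, k: 1 if (s == 1 and k in (2, 3)) or (s == 0 and k == 3) else 0

# A horizontal line of three live cells in a 5 x 5 space...
grid = [[0] * 5 for _ in range(5)]
grid[2][1] = grid[2][2] = grid[2][3] = 1
g2 = evolve(grid, conway)
print(sum(map(sum, g2)))              # 3: still three live cells
print([g2[r][2] for r in (1, 2, 3)])  # [1, 1, 1]: the line turned vertical
```

Iterating `evolve` and counting live cells after each step is exactly how curves like those in Figure 3 (fraction of living cells per rule) are produced.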

Genetic Operations.
SCGA puts each individual, as a cell, into the cellular space, selects a living cell, and forms the parents from it and the best cell among its neighbors. Then, the parents perform the crossover and mutation operations. If the cell's state is dead, no crossover or mutation is performed. Algorithm 6 shows the pseudocode of the genetic operations.
Assuming that the number of cities is 10 and the population size is 100, the corresponding cellular space is a 2D grid of 10 × 10 cells. The algorithm works as follows: (i) Randomly generate 100 individuals to form the initial population and associate the individuals with the cells one-to-one. (ii) If Figure 2 is used as the state matrix of the cellular space at the initial moment, each individual is represented by a letter, and the number after the letter represents the state of the cell, as shown in Figure 4.

Simulated annealing (SA) algorithm for solving TSP
Begin
  Input: city coordinates X = (x_1, x_2, . . . , x_n), Y = (y_1, y_2, . . . , y_n); initial temperature T_0, Markov chain length L, cooling rate R, termination temperature T_t
  Initialization: use ordinal coding to generate the initial solution S; current temperature T = T_0
  while T > T_t
    for i = 1 to L
      Generate a neighbor solution S_1
      Metropolis criterion
    End for
    Save the optimal solution of the iteration
    T = T * R
  End while
  Output: R = (N_1, N_2, . . . , N_n), f(R)
End
Algorithm 4: The pseudocode of the simulated annealing (SA) algorithm for solving TSP.

Firstly, the parents produce the offspring individuals O1 and O2 after PMX. Performing the reverse order mutation on O1 and O2 gives the individuals O3 and O4, as shown in Table 3. The new individuals obtained after the genetic operations are O3 and O4; the best progeny is O3, and O3 is better than V, so O3 replaces V to complete the genetic operation.

Elitist Strategy.
The living central cell and its best neighbor produce offspring through the crossover and mutation operations. If the distance of the optimal progeny is not compared with that of the central cell and the optimal progeny directly replaces the central cell, an inferior solution may enter the population. Conversely, if the central cell's distance is better than the optimal progeny's distance and the central cell is never replaced, the algorithm may fall into a local optimum. To solve these problems, SCGA applies an elitist retention strategy after the crossover and mutation operations: when the distance of the optimal progeny obtained after crossover and mutation is less than the distance of the central cell, the central cell is directly replaced by the optimal progeny; otherwise, the simulated annealing operation is performed on the optimal progeny to obtain a new offspring, and the central cell is replaced by this new offspring. Algorithm 7 shows the pseudocode of the elitist strategy. Through the elitist strategy, the living central cell can be improved as much as possible while avoiding local optima.

Dynamic evolution of the cellular space
Begin
  Input: an n-th order square matrix N composed of 0 and 1
  for i = 1 to n^2
    Take the i-th cell as the central cell
    Calculate the sum K of the states of its Moore neighbors
    Update the central cell's state according to the cellular automata rule
  End for
  Output: new square matrix N
End
Algorithm 5: The pseudocode of the dynamic evolution of the cellular space.

Genetic operations
Begin
  Input: an n-th order square matrix N composed of 0 and 1, cellular space, city coordinates X = (x_1, x_2, . . . , x_n), Y = (y_1, y_2, . . . , y_n)
  Calculate the total distance f_i(R) of each individual in the cellular space by Equation (2)
  for i = 1 to n^2
    if N(i) = 1
      Select the i-th individual and the individual with the smallest total distance among its 8 neighbors
      Crossover
      Mutation
      Replace the i-th individual with the new individual
    End if
  End for
  Output: new cellular space
End
Algorithm 6: The pseudocode of the genetic operations.
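The elitist retention rule reduces to a short function (a Python sketch with our own names; `f` and `sa_optimize` stand in for the tour-length function and the SA refinement described above):

```python
def elitist_replace(central, best_child, f, sa_optimize):
    """Elitist retention: if the best offspring already beats the
    central cell, it replaces the cell directly; otherwise the offspring
    is first refined by simulated annealing before replacing the cell."""
    if f(best_child) < f(central):
        return best_child
    return sa_optimize(best_child)

# Toy demo with plain numbers as "solutions" and identity as "distance";
# the stand-in SA refiner simply improves its input by 1.
print(elitist_replace(10, 7, f=lambda x: x, sa_optimize=lambda x: x - 1))  # 7
print(elitist_replace(5, 9, f=lambda x: x, sa_optimize=lambda x: x - 1))   # 8
```

Note that the central cell is always replaced; the SA step is what prevents the replacement from simply re-inserting a poor offspring.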

Process of SCGA.
SCGA, as explained before, has three main components: the dynamic evolution of the cellular space, the genetic operations, and the elitist strategy. This section introduces the main steps of applying SCGA to solve TSP. Algorithm 8 shows the pseudocode of the proposed hybrid Cellular Genetic Algorithm with SA (SCGA) and the steps required for the TSP solution. (i) Step 1. Set the initial parameters; use ordinal coding to randomly generate the initial population as the cellular space, and randomly generate the state matrix. (ii) Step 2. Update the state matrix according to the chosen cellular automata rule. (iii) Step 3. Calculate the total distance of each individual in the cellular space. (iv) Step 4. For each living central cell, select the best individual among its neighbors to form the parents. (v) Step 5. Perform crossover and mutation operations on the parents to get the offspring. (vi) Step 6. If the distance of the optimal progeny is longer than the distance of the central cell, the simulated annealing algorithm is used to optimize the optimal progeny, which replaces the central cell after optimization; otherwise, the optimal progeny directly replaces the central cell. (vii) Step 7. If the maximum number of iterations is reached, output the optimal result; otherwise, go to Step 2.

Experimental Results and Discussion
Here, instances from the TSPLIB test library are selected, and GA, SA, CGA, and SCGA (with cellular automata rules 1-3) are used to solve them. All algorithms are implemented in MATLAB R2018b on a laptop with a Windows 10 Ultimate 64-bit operating system, an Intel Core i7-10510U CPU at 1.80 GHz, and 12 GB of RAM.

Experiment Settings.
13 instances from TSPLIB are selected for the simulation experiments. To avoid the influence of randomness, each algorithm was run independently 10 times on each instance. The result is expressed as the route distance obtained in the experiment. The basic parameters of SCGA are crossover probability P_c = 0.9, mutation probability P_m = 0.05, cooling rate r = 0.9, and initial temperature T_0 = 100. The other parameters are shown in Table 4, where Num, Num.C, Maxiter, L, and T_t represent the population size, the size of the cellular space, the maximum number of iterations, the Markov chain length, and the termination temperature, respectively. The basic parameters of GA are population size Num = 100, generation gap GGAP = 0.9, and maximum number of iterations Maxiter = 10000. The basic parameters of SA are Markov chain length L = 500 and termination temperature T_t = 0.001. The basic parameters of CGA are mutation probability P_m = 0.5 and maximum number of iterations Maxiter = 10000. The parameters not mentioned are the same as those of SCGA.

Comparison of Cellular Automata Rules.
In order to test the optimization performance of the three cellular automata rules, experiments are performed using rule 1, rule 2, and rule 3 in SCGA with the same parameters as in the previous experiments. Figure 5 shows the optimal paths of eil51, pr107, bier127, and kroa200.

Figure 5: The optimal paths: (a) eil51; (b) pr107; (c) bier127; (d) kroa200.

Table 5 shows the best solutions, the worst solutions, the mean values, and the standard deviations of the 10 repeated experiments; boldface marks the optimal value. As can be seen, in the instances att48, berlin52, chn31, oliver30, pr107, and pr152, the best solutions obtained using rules 1, 2, and 3 are equal. In the instances bier127, ch130, and kroa100, the best solutions obtained using rule 2 and rule 3 are also equal, and both are better than the best solution obtained using rule 1. In the instances eil51 and eil76, the best solutions obtained using rule 2 are better than those obtained using rule 1 and rule 3. Only in the instance pr136 are the best solutions obtained using rule 1 and rule 3 better than the best solution obtained using rule 2. In the instance kroa200, the best solution obtained using rule 3 is better than that obtained using rule 2. In most instances, the worst solutions obtained using rule 2 are better than those obtained using rule 1 and rule 3; only in the instances kroa100 and pr136 are the worst solutions obtained using rule 1 and rule 3 better. At the same time, over the 10 repeated experiments, the mean value and the standard deviation obtained using rule 2 are better than those obtained using rule 1 and rule 3. It can be seen that, in most instances, a better and more stable solution can be obtained by using rule 2 in SCGA. In other words, cellular automata rules that keep a moderate total number of living cells give better optimization performance than the other rules.

Elitist strategy
Begin
  Input: central cell S_1, new individual S_2, city coordinates X = (x_1, x_2, . . . , x_n), Y = (y_1, y_2, . . . , y_n)
  Calculate f(S_1) and f(S_2) by Equation (2)
  if f(S_2) < f(S_1)
    S_1 = S_2
  else
    Optimize S_2 using the simulated annealing algorithm
    S_1 = S_2
  End if
  Output: S_1
End
Algorithm 7: The pseudocode of the elitist strategy.

The hybrid cellular genetic algorithm (SCGA) for solving TSP
Begin
  Input: city coordinates X = (x_1, x_2, . . . , x_n), Y = (y_1, y_2, . . . , y_n); P_c, P_m, r, T_0, Num, Num.C, Maxiter, L, T_t
  Initialization: use ordinal coding to randomly generate the initial population as the cellular space; randomly generate an n-th order square matrix N composed of 0 and 1 as the state matrix; number of iterations k = 1
  while k ≤ Maxiter
    for i = 1 to n^2
      if N(i) = 1
        Genetic operations
        Elitist strategy
      End if
      Save the optimal result of the iteration
    End for
    k = k + 1
  End while
  Output: the optimal result R = (N_1, N_2, . . . , N_n), f(R)
End
Algorithm 8: The pseudocode of the hybrid cellular genetic algorithm (SCGA) for solving TSP.

Comparison of Other Algorithms.
In order to verify the optimization performance of SCGA, SCGA, CGA, GA, and SA are applied to the instances. The experimental results are shown in Tables 6-8. As shown in Table 6, on the instances kroa200, bier127, pr136, and ch130, the distance of the SCGA best solutions is shorter than that of the GA best solutions by 9.73%, 8.64%, 8.25%, and 5.80%, respectively. Besides, on the instances kroa200, pr136, bier127, eil76, and ch130, the distance of the SCGA best solutions is shorter than that of the SA best solutions by 15.79%, 8.86%, 6.69%, 5.14%, and 5.02%, respectively. On the instances kroa200 and pr136, the distance of the best solutions obtained by SCGA is shorter than that obtained by CGA by 7.42% and 5.43%, respectively. The SCGA best solutions are better than the GA best solutions on all 13 instances. Only on the instances oliver30 and chn31 is the distance of the SCGA best solutions equal to that of the CGA best solutions. Besides, on the instance oliver30, the SCGA best solution's distance is equal to that of the SA best solution. Moreover, the SCGA worst solutions are better than the GA, SA, and CGA worst solutions on all 13 instances.
This paper uses SCGA to solve these instances and retains four decimal places, so there will be a slight error when comparing with the known optimal solutions. In Table 8, the best solution error and the mean value error are calculated as follows:

best solution error = (best solution - optimal solution) / optimal solution × 100%,  (3)

mean value error = (mean value - optimal solution) / optimal solution × 100%,  (4)

where the optimal solution is the known optimum of the instance. As shown in Table 8, on the instances chn31, pr107, and pr152, the SCGA best solution errors are about 0%. Only on the instances ch130 and eil76 are the SCGA best solution errors greater than 1%.
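The error formulas can be checked with a two-line helper (Python; the example uses eil51, whose known TSPLIB optimum is 426, together with a hypothetical obtained tour length):

```python
def pct_error(value, optimum):
    """Relative error (%) of an obtained tour length vs. the known optimum,
    matching Equations (3)-(4)."""
    return (value - optimum) / optimum * 100.0

# eil51's known TSPLIB optimum is 426; 428.87 is a hypothetical result.
print(round(pct_error(428.87, 426), 2))  # 0.67
print(pct_error(426, 426))               # 0.0
```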
From Tables 6-8, we conclude the following. The best solution, the worst solution, the mean value, and the standard deviation obtained by solving the TSP instances with our proposed algorithm (SCGA) are better than those of GA, SA, and CGA, and SCGA gives the values closest to the theoretical optimum. Figure 6 shows the box plots of 10 independent runs on the instances kroa100, pr152, att48, and eil76. The box plots of SCGA always occupy the lowest position in the four instances. Furthermore, the box plots of SCGA are close to straight lines, which indicates that SCGA has the smallest differences between the best value, the mean value, the worst value, and the median value. As mentioned before, SCGA significantly outperforms GA, SA, and CGA.

Conclusions
In order to solve the TSP effectively, a hybrid Cellular Genetic Algorithm with SA (SCGA) is proposed. The algorithm combines the principles of GA, CA, and SA to improve GA's optimization performance. Experimental results show that SCGA is superior to GA, SA, and CGA in solving the TSP, which proves that SCGA has high optimization performance and robustness. We studied the optimization performance of three cellular automata rules for the TSP solution; beyond these three, other cellular automata rules may offer even higher optimization performance. We combined the simulated annealing algorithm with the cellular genetic algorithm to enhance the TSP solution's optimization performance; the cellular genetic algorithm could also be combined with other algorithms to further enhance it. Common traveling salesman problems include the symmetric, asymmetric, multiperson, and multitarget traveling salesman problems. This paper studied only the symmetric traveling salesman problem based on Euclidean distance; the optimization performance on the asymmetric and other traveling salesman problems has not been verified. In addition, it is not yet clear how to apply the hybrid cellular genetic algorithm to other combinatorial optimization problems, such as knapsack, production scheduling, and clustering problems. These issues are worthy of further discussion and will be the focus of our future work.

Data Availability
Datasets used to support the findings of this study are available from http://comopt.ifi.uni-heidelberg.de/software/TSPLIB95/XML-TSPLIB/instances/.