Wild Geese Migration Optimization Algorithm: A New Meta-Heuristic Algorithm for Solving Inverse Kinematics of Robot

This paper proposes a new meta-heuristic algorithm, named the wild geese migration optimization (GMO) algorithm. It is inspired by the social behavior of wild geese swarming in nature: they maintain a special formation during long-distance migration in small groups for survival and reproduction. A mathematical model is established based on these social behaviors to solve optimization problems. The performance of the GMO algorithm is tested on the stable benchmark functions of CEC2017, and its potential for dealing with practical problems is studied on five engineering design problems and the inverse kinematics solution of a robot. The test results show that the GMO algorithm has excellent computational performance compared to other algorithms. The practical application results show that the GMO algorithm has strong applicability, more accurate optimization results, and more competitiveness in challenging problems with unknown search spaces, compared with well-known algorithms in the literature. The proposed GMO algorithm enriches the family of swarm intelligence optimization algorithms and also provides a new approach for solving engineering design problems and the inverse kinematics of robots.


Introduction
The rapid development of information and intelligent technology has spawned many new intelligent application requirements. It has also led to many new optimization problems with nonlinearity, complexity, and constraints in engineering, science, economics, management, and other fields. Traditional optimization methods have been unable to meet these computing needs, and seeking efficient optimization algorithms has become a research hotspot in related disciplines [1][2][3]. Meta-heuristic algorithms are widely used to solve optimization problems due to their simplicity, flexibility, and derivation-free mechanism [4][5][6]. These algorithms are based on mathematics and find the best possible solution from all candidate solutions through an iterative calculation mechanism [7,8].
Most meta-heuristic algorithms are inspired by the social nature of biological swarms, the laws of natural phenomena, and human intelligence. In general, the algorithms are divided into three main categories. The algorithms based on the laws of natural phenomena can be divided into evolutionary laws and physical laws. The evolution-based algorithms mainly include the genetic algorithm (GA) [9], differential evolution algorithm (DE) [10], black hole algorithm (BH) [11], natural aggregation algorithm (NAA) [12], barnacles mating optimizer (BMO) [13], biogeography-based optimization (BBO) [14], bird mating optimizer (BMO) [15], and so on. Among them, the GA algorithm is inspired by Darwin's theory of evolution. Each individual in the algorithm is assigned a specific gene, and the iterative optimization process is achieved by the genetic evolution of individual genes. The NAA algorithm is inspired by the collective decision-making intelligence of group-living animals. Individuals decide to enter or leave a subpopulation based on the quality and crowding of the subpopulation, to achieve localized and generalized search of the problem space. The physics-based algorithms mainly include the simulated annealing algorithm (SA) [16], central force optimization algorithm (CFO) [17], electromagnetic field optimization algorithm (EFO) [18], water evaporation optimization algorithm (WEO) [19], gravitational search algorithm (GSA) [20], and so on. The algorithms based on human social behavior mainly include the teaching-learning-based optimization algorithm (TLBO) [21], student psychology-based optimization algorithm (SPBO) [22], social-based algorithm (SBA) [23], and so on. The kho-kho optimization (KKO) algorithm [24] and battle royale optimization algorithm (BRO) [25] are inspired by the rules players follow in games.
At present, the most studied class of algorithms is based on biological swarm behavior; these are also called swarm intelligence optimization algorithms. They mainly include the particle swarm optimization algorithm (PSO) [26], bat-inspired algorithm (BA) [27], artificial bee colony algorithm (ABC) [28], fruit fly optimization algorithm (FOA) [29], migrating birds optimization (MBO) [30], cuckoo search algorithm (CS) [31], cuttlefish algorithm (CFA) [32], ant colony optimization algorithm (ACO) [33], moth-flame optimization algorithm (MFO) [34], mayfly optimization algorithm (MA) [35], chicken swarm optimization algorithm (CSO) [36], naked mole-rat algorithm (NMR) [37], and so on. Among them, the PSO algorithm is inspired by the social behavior of bird swarms. Each particle continuously explores the solution space to find the global optimum, and the position update strategy is based on each particle's historical optimal position and the global optimal position. The inspiration for the MBO algorithm comes from the V-shaped flight formation during bird migration. The position update is implemented sequentially starting from the optimal value, and the position of the current individual is compared with its neighbors; if a neighbor's fitness is better, the current individual is replaced. The CS algorithm is a meta-heuristic algorithm based on the cuckoo's brood parasitic behavior and birds' Lévy flight behavior; it searches for the global optimal solution through a strategy of Lévy flights and random walks. The CFA algorithm is inspired by the colour-changing behavior of cuttlefish. The population is divided into four independent groups, and an independent search strategy is designed for each group by simulating the two processes of reflection and visibility.
A meta-heuristic algorithm is proposed not only for theoretical research in the laboratory; more importantly, it is expected to achieve satisfactory results in different practical application fields. The research on many algorithms is based on specific practical applications and explores their computational performance. For instance, Taymaz proposed the BRO algorithm [25] and applied it to solve the inverse kinematics problem of the PUMA560 robot; the research shows that the BRO algorithm achieves excellent results in the position solution. Amir et al. proposed the CS algorithm [31] and verified its excellent performance through 13 engineering design problems. Seyedali proposed the ant lion optimizer (ALO) [38] and applied it to the design of ship propellers, where a smooth blade shape found by the ALO algorithm improves propeller efficiency. Mirjalili et al. proposed the grey wolf optimizer (GWO) [39] and applied it to optimize the BSPCW structure in the optical buffer design problem; the optimized structure has a good bandwidth and does not require any frequency mixing. Seyedali proposed the sine cosine algorithm (SCA) [40] and applied it to the two-dimensional design of aircraft wings, with minimal drag as the goal of structural optimization. The optimization results show that the drag is reduced from 0.009 to 0.0061, a pronounced effect. Li et al. proposed the slime mold algorithm (SMA) [41] and verified the algorithm's performance on multiple benchmark functions and five practical engineering design problems; the SMA algorithm exhibits satisfactory computational performance in solving engineering problems. Kaur et al. proposed the tunicate swarm algorithm (TSA) [42] and applied it to the solution of constrained and unconstrained engineering problems, verifying the applicability of the TSA algorithm.
In order to mimic nature more effectively and improve the search performance of algorithms [43], the fitness-distance balance (FDB) method proposed by Kahraman et al. [44] has made significant contributions; it combines FDB with the symbiotic organisms search algorithm (FDB-SOS). Compared with 13 meta-heuristic search (MHS) techniques, the excellent performance of the FDB-SOS algorithm is verified on 90 benchmark functions. Aras et al. [45] proposed the FDBSFS algorithm, which uses the FDB mechanism to optimize the stochastic fractal search algorithm. Compared with 39 MHS algorithms on 89 unconstrained benchmark functions and 5 constrained engineering problems, the powerful search performance and competitiveness of the FDBSFS algorithm are verified. Ozkaya et al. [46] redesigned the mutation operator of the improved adaptive differential evolution (LSHADE) algorithm using the FDB mechanism, defining the FDB-LSHADE algorithm. Compared with 8 other MHS algorithms, the FDB-LSHADE algorithm shows excellent performance on CEC14, CEC17, and energy hub economic dispatch problems. To achieve higher performance goals and a wider application range, researchers have also considered combining swarm intelligence algorithms with deep learning methods. For instance, Ghasemi-Darehnaei et al. [47] proposed a swarm intelligence ensemble deep transfer learning method (SI-EDTL) and used the whale optimization algorithm (WOA) to select the optimal hyperparameters of SI-EDTL; SI-EDTL is applied to multiple-vehicle detection in unmanned aerial vehicle (UAV) images. Basha et al. [48] proposed an improved Harris hawks optimization algorithm to optimize the convolutional neural network (CNN) architecture; compared with other similar methods, the network achieves superior performance in classifying various grades of brain tumors. Singh et al.
[49] proposed a multistage particle swarm optimization (MPSO) algorithm to explore the CNN architecture and its hyperparameters (MPSO-CNN), which achieved better performance on 5 benchmark datasets. Hilal et al. [50] studied a remote sensing image classification model (FCMBS-RSIC) based on fuzzy logic and the bird swarm algorithm and performed performance verification on benchmark open-access datasets; the FCMBS-RSIC model shows enhanced results compared to other state-of-the-art methods. Zivkovic et al. [51] proposed a framework to improve the prediction accuracy of COVID-19 cases, namely an adaptive neuro-fuzzy inference system trained by an improved beetle antennae search algorithm. Kumar and Jaiswal [52] proposed a cognitive-driven analytics model (CNN-WSADT) for real-time data classification, which combines three methods: CNN, the wolf search algorithm, and a decision tree.
With the efforts of researchers, new meta-heuristic algorithms are proposed every year and applied to solve complex optimization problems in different fields. Each algorithm balances its exploitation and exploration processes by setting up a unique search mechanism, which may be intrinsic to the success of a new algorithm [53][54][55]. However, no single meta-heuristic algorithm satisfies all optimization problems, as explained by the no-free-lunch theorem [56]. In other words, the same algorithm may achieve satisfactory results on one optimization problem but exhibit poor computational performance on another. Therefore, as the complexity and challenge of optimization problems continue to increase with the continuous innovation of science and technology, researchers need to propose new algorithms and theories while improving traditional algorithms.
This motivates us to propose a new meta-heuristic algorithm inspired by wild geese migration. To the authors' knowledge, there is no prior study on this topic in the optimization algorithm literature.
This paper describes a new meta-heuristic optimization algorithm (GMO). The algorithm simulates the social behavior of wild geese migration and designs multiple migration groups. The iterative process of the GMO algorithm mainly comprises the behaviors of randomly establishing migration groups, synchronous migration, and free foraging. A migration group is established randomly by generating its members around the head goose (the best individual in the migration group) as the center. Synchronous migration means that individuals in each migration group update their positions with equal steps. Free foraging refers to individuals moving within a small random range. To evaluate the performance of the GMO algorithm, simulation experiments are carried out on 29 stable benchmark functions from CEC2017. The algorithm is also applied to solve five engineering design problems and the inverse kinematics problem of a 7R 6-DOF robot and is compared with other algorithms reported in the literature. The results show that the computational performance of the GMO algorithm is highly competitive, and it effectively solves practical engineering problems. The main contributions of this paper are as follows: (1) The development and latest research results of meta-heuristic algorithms are analyzed through the literature, which provides a theoretical basis and reference value for the new algorithm proposed in this paper. (2) This paper proposes a new swarm intelligence algorithm, named the GMO algorithm, which is inspired by the social behavior of long-distance migration of wild geese swarms. In the algorithm, the search mechanism of randomly establishing migration groups, synchronous migration, and free foraging is designed, which effectively balances the exploitation and exploration processes in the search space.
(3) Simulation experiments are carried out on the 29 stable benchmark functions of CEC2017, and each function is tested in 10, 30, 50, and 100 dimensions. The experimental results of the GMO algorithm and 5 other algorithms are compared in detail. It is shown that the GMO algorithm has good convergence accuracy and speed, strong stability, and a short running time.
(4) The GMO algorithm is applied to five engineering design problems in this paper. Compared with the results reported in other studies, the GMO algorithm shows good results when facing practical problems in different search spaces. The applicability and feasibility of the algorithm for solving engineering optimization problems are verified. (5) The GMO algorithm is used to solve the inverse kinematics problem of the 7R 6-DOF robot. The results show that the GMO algorithm outperforms the other comparative algorithms in solving the inverse kinematic pose problem and has higher solution accuracy. The algorithm provides a new method for solving the inverse kinematics problem of robots. The rest of this paper is organized as follows. Section 2 presents the GMO algorithm and introduces its primary sources of inspiration and design principles. Section 3 gives the simulation experiments of the GMO algorithm on benchmark functions, comparing it with other algorithms to verify its computational performance. Section 4 is devoted to solving five engineering optimization problems using the GMO algorithm and proving the algorithm's applicability. Section 5 solves the inverse kinematics problem of the 7R 6-DOF robot using the GMO algorithm. Finally, the conclusions of this paper and directions for possible future research are given in Section 6.

GMO Algorithm
In this section, the inspiration for the GMO algorithm is first introduced to better understand the proposed methodology.
Then, the mathematical model of the algorithm is provided, and its implementation flow and pseudocode are described. Finally, the time complexity analysis of the GMO algorithm is carried out.

Inspiration.
The wild goose is a general term for birds of the goose genus, and it is an excellent air traveller. Every autumn, wild geese fly in droves from Siberia to the south for the winter. The following spring, they return to Siberia after a long journey to lay eggs and breed. During migration, each migration group consists of many geese, and the experienced head geese lead them to fly in a line-shaped or V-shaped arrangement, as shown in Figure 1.
This is a miraculous natural phenomenon.
During flight, the wild geese generate vortices and updrafts by constantly flapping their wings. The geese that follow closely fly in these air currents, saving a great deal of energy. However, the head geese have no updraft available to them, and their physical energy is consumed the fastest. Therefore, to sustain the flight, each wild geese migration group needs to change its formation and head geese frequently on long-distance flights. Meanwhile, group migration is also conducive to exchanging information and avoiding natural enemies [57].

Algorithm Principles and Mathematical Models.
The GMO algorithm's initial population is randomly generated in the solution space, and a certain number of wild geese are selected as the initial head geese. The wild geese swarm migrates under the leadership of the head geese. The population size of the wild geese in the GMO algorithm is N, and the number of head geese is M. The initial radius of each migration group is set to L = (u_d − l_d)/N, where u_d and l_d are the upper and lower bounds of the d-th dimension of the search space.
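The initialization described above can be sketched as follows. This is an illustrative Python sketch, not the authors' implementation; the sphere objective and the bounds are placeholders, and selecting the M best individuals as the initial head geese is an assumption consistent with the algorithm's later head-goose selection rule.

```python
import numpy as np

def initialize_gmo(N, M, lower, upper, objective, rng):
    # Random initial population in the solution space; the M best
    # individuals serve as the initial head geese, and the initial
    # group radius is L = (u_d - l_d) / N per dimension.
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    pop = lower + rng.random((N, lower.size)) * (upper - lower)
    fitness = np.array([objective(x) for x in pop])
    heads = pop[np.argsort(fitness)[:M]].copy()  # M best as head geese
    L = (upper - lower) / N                      # initial migration-group radius
    return pop, fitness, heads, L

rng = np.random.default_rng(0)
sphere = lambda x: float(np.sum(x ** 2))         # placeholder objective
pop, fitness, heads, L = initialize_gmo(20, 4, [-5.0] * 3, [5.0] * 3, sphere, rng)
```

With N = 20 and bounds [−5, 5], the initial radius is 0.5 in every dimension.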

Formation of Migration Groups.
In each iteration, the migration groups are re-established according to the positions of the head geese. The members of each group are randomly distributed within the radius L with the head goose as the center. The purpose is to realize the replacement of head geese and the transformation of the formation. The mathematical model is given in equation (1), where x_i^t represents the position of the i-th individual at the t-th iteration (i = 1, 2, ..., N), T is the maximum number of iterations (t = 1, 2, ..., T), x_j^t represents the position of the j-th head goose at the t-th iteration (j = 1, 2, ..., M), and b represents the number of members in each migration group (b = N/M).
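The group-formation step can be sketched as below. Sampling members uniformly in a box of half-width L around each head goose is an assumption, since equation (1) is not reproduced here; the sketch only illustrates "members randomly distributed within radius L with the head goose as the center."

```python
import numpy as np

def form_migration_groups(heads, N, L, rng):
    # Each of the M head geese receives b = N // M members, scattered
    # uniformly within +/- L of its position in every dimension
    # (uniform box sampling is an assumption).
    M, D = heads.shape
    b = N // M                                   # members per migration group
    pop = np.empty((M * b, D))
    for j in range(M):
        pop[j * b:(j + 1) * b] = heads[j] + (2 * rng.random((b, D)) - 1) * L
    return pop

rng = np.random.default_rng(1)
heads = np.zeros((4, 3))                         # 4 head geese at the origin
pop = form_migration_groups(heads, 20, 0.5, rng)
```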

Synchronized Flight.
During the migration of wild geese in nature, the head geese mainly rely on environmental information, historical memory, and flight experience to guide the migration. Meanwhile, each migration group member maintains a relatively fixed position while flying with the head goose. A synchronous flight strategy is used in the GMO algorithm to simulate these flight characteristics, and the flight steps of the migration group members are set to be equal. The position update information of individuals in a migration group is derived from their head goose, which is mainly based on the optimal position and also refers to the position information of the other head geese.
The schematic diagram of the flight process of a migration group is shown in Figure 2, and the mathematical model is given in equation (2), where x_best^t represents the global optimal individual and x_k^t is a randomly selected head goose. x_i^t and x_j^t represent a member and the head goose of a migration group, respectively. The flight step size is c_1 ∈ [0, 1], and c_2 is calculated by equation (3), where fit(j) is the fitness value of the head goose, and fit_worse, fit_ave, and fit_best represent the worst, average, and best fitness values of the head geese, respectively. c_2 is mainly used to control the proportion of the other head geese's experience information. If fit(j) ≤ fit_ave, the value of fit(j) is small, which means that x_j^t is an excellent head goose and does not need to learn much information from the other head geese. The exact opposite holds when fit(j) > fit_ave.
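The synchronized-flight behavior can be sketched as follows. Both the linear form of the adaptive weight c_2 and the exact combination of terms in the step are assumptions that only mirror the stated behavior (equations (2) and (3) are not reproduced here): all members of a group take the same step, and a head goose with fit(j) ≤ fit_ave learns little from the other head geese.

```python
import numpy as np

def c2_weight(fit_j, fit_best, fit_ave, fit_worse, eps=1e-12):
    # Sketch of the adaptive weight c2 (equation (3)): small for an
    # excellent head goose, larger for a poor one. The linear
    # normalisation is an assumption, not the paper's formula.
    if fit_j <= fit_ave:
        return (fit_j - fit_best) / (fit_ave - fit_best + eps)
    return (fit_j - fit_ave) / (fit_worse - fit_ave + eps)

def synchronized_flight(members, head, best, other_head, c1, c2):
    # All members of a migration group take the SAME step as their head
    # goose (equal steps), guided by the global best and by a randomly
    # chosen other head goose. The step form is an assumption.
    step = c1 * (best - head) + c2 * (other_head - head)
    return members + step

members = np.array([[0.0, 0.0], [1.0, 1.0]])
head = np.array([0.5, 0.5])
best = np.array([2.0, 2.0])
other = np.array([1.0, 0.0])
moved = synchronized_flight(members, head, best, other, 0.5, 0.2)
```

Because the step is identical for every member, the relative formation of the group is preserved during the flight.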

Free Foraging.
Resting and foraging are inevitable for migration groups during long-distance flights. In nature, wild geese often choose lakes or larger bodies of water as foraging areas. During the free foraging process, the migration group members randomly explore according to the information of the head goose and maintain a certain connection within a small area. At the same time, the migration group maintains its movement trend using the optimal location information. After finishing foraging, the wild geese regroup and continue migrating. A schematic diagram depicting the free foraging process is shown in Figure 3, and the mathematical model is given in equation (4), where c_3 and c_4 are random numbers in [0, 1] used to control the movement step size of individuals during the foraging process, and L is the radius of the group range, which controls the distance between the migration group members and the head goose.
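The free-foraging move can be sketched as below. The combination of terms is an assumption based on the description (equation (4) is not reproduced here): a small random exploration around the head goose within radius L, plus a drift toward the global best position that preserves the movement trend.

```python
import numpy as np

def free_foraging(member, head, best, L, rng):
    # c3, c4 ~ U(0, 1) scale the two steps; the combination of terms is
    # an assumption, not the paper's equation (4).
    c3, c4 = rng.random(), rng.random()
    local = head + c3 * L * (2 * rng.random(member.shape) - 1)  # stay near the head goose
    return local + c4 * (best - member)                          # keep the movement trend

rng = np.random.default_rng(2)
head = np.zeros(3)
best = np.ones(3)
member = np.full(3, 0.5)
new_pos = free_foraging(member, head, best, 0.1, rng)
```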

Selection of the Head Geese.
During the long-distance migration of wild geese, the head geese are the most crucial individuals; they are the leaders of the entire wild geese swarm. The head geese must be replaced frequently to achieve high flight endurance. Therefore, after each position update of the GMO algorithm, the optimal individual in each migration group is selected as the head goose of the new generation. This selection strategy not only allows the head geese to carry excellent location information but also ensures the dispersion of the head geese's positions, so that the algorithm has an excellent ability to balance exploitation and exploration.
After the head geese are all replaced, the migration group radius (L) is reduced by equation (5). The purpose is to increase the density of members in each group and improve the exploration accuracy of the algorithm.
where T is the maximum number of iterations and t is the current number of iterations.
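Since equation (5) is not reproduced here, the following is only an illustrative stand-in for the radius schedule: a linear decay from the initial radius L_0 to zero over the T iterations, which matches the stated intent (members pack more densely around the head geese as iterations proceed) but may differ from the paper's actual formula.

```python
def shrink_radius(L0, t, T):
    # Illustrative stand-in for equation (5): the group radius decays
    # from its initial value L0 toward zero as t approaches T.
    # A linear schedule is assumed; the paper's formula may differ.
    return L0 * (1.0 - t / T)

radii = [shrink_radius(1.0, t, 10) for t in range(11)]
```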

Implementation of GMO Algorithm.
The GMO algorithm is a new stochastic optimization algorithm. Multiple random positions within the solution space are chosen as initial solutions, and all solutions are then iterated and optimized continuously to find the optimal solution. The flowchart and pseudocode of the GMO algorithm are presented in Figure 4 and Algorithm 1, respectively.
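The overall flow (form groups, move members, re-select head geese, shrink the radius) can be summarized in a compressed sketch. It is an illustration of the flow described above, not the authors' code: a single random pull toward the current best stands in for the synchronized-flight/free-foraging updates, and a linear radius decay stands in for equation (5).

```python
import numpy as np

def gmo_sketch(objective, lower, upper, N=50, M=5, T=200, seed=0):
    rng = np.random.default_rng(seed)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    D, b = lower.size, N // M
    heads = lower + rng.random((M, D)) * (upper - lower)
    L = (upper - lower) / N
    best_x, best_f = None, np.inf
    for t in range(T):
        # 1) form migration groups: b members within radius L of each head goose
        pop = np.repeat(heads, b, axis=0) + (2 * rng.random((M * b, D)) - 1) * L
        np.clip(pop, lower, upper, out=pop)
        fit = np.array([objective(x) for x in pop])
        i = int(np.argmin(fit))
        if fit[i] < best_f:
            best_f, best_x = float(fit[i]), pop[i].copy()
        # 2) move every member toward the current best (stand-in for the
        #    synchronized-flight / free-foraging updates)
        pop += rng.random((M * b, 1)) * (best_x - pop)
        np.clip(pop, lower, upper, out=pop)
        fit = np.array([objective(x) for x in pop])
        i = int(np.argmin(fit))
        if fit[i] < best_f:
            best_f, best_x = float(fit[i]), pop[i].copy()
        # 3) the best individual of each group becomes the new head goose
        for j in range(M):
            g = slice(j * b, (j + 1) * b)
            heads[j] = pop[g][int(np.argmin(fit[g]))]
        # 4) shrink the group radius (linear schedule assumed for eq. (5))
        L = (upper - lower) / N * (1.0 - (t + 1) / T)
    return best_x, best_f

x, f = gmo_sketch(lambda v: float(np.sum(v ** 2)), [-5.0] * 5, [5.0] * 5)
```

On the 5-dimensional sphere function, even this simplified sketch converges close to the optimum, which illustrates how the shrinking radius shifts the search from exploration to exploitation.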

Time Complexity.
In practical engineering applications, the computational efficiency and computational performance of an algorithm are equally important. Time complexity analysis is one of the essential means of evaluating an algorithm's efficiency. This method analyzes the algorithm's complexity under the condition that the population size N and the number of iterations T remain unchanged, so that the computational efficiency of the algorithm can be accurately assessed. The calculation process of the GMO algorithm mainly includes three parts: population initialization O(N), the establishment of migration groups O(N × T), and synchronized flight or free foraging O(N × T). Therefore, the time complexity of the GMO algorithm is O(N) + 2 × O(N × T) = O(N × T). The complexity expression contains no exponential terms and is mainly determined by the basic parameters N × T. From the above analysis, it can be seen that the GMO algorithm has a low time complexity.

Benchmark Functions and Parameter Setting.
For a new meta-heuristic algorithm, it is necessary to test its exploitation and exploration abilities through a large amount of quantitative data. In this work, the performance of the GMO algorithm is tested on 29 stable benchmark functions from the CEC2017 technical report (the F2 function is excluded in this paper because of its instability) [58]. The specific function names, feasible regions of the variables, and minimum values are recorded in Table 1, and the detailed function models can be obtained from [58]. The table covers 4 different types of benchmark functions: unimodal, multimodal, hybrid, and composition functions. The test results on these benchmark functions can indicate the potential ability of the GMO algorithm to solve practical problems. In order to clearly illustrate the computing performance of the GMO algorithm, five optimization algorithms are selected as comparison targets: the PSO, BRO, CSO, ABC, and WOA algorithms. The common parameters of all algorithms are set as follows: population size N = 100, maximum number of iterations T = 500, and dimension D = 10, 30, 50, and 100; other related parameters are shown in Table 2. The experiments are run on the Windows 10 operating system with an Intel(R) Core(TM) i5-3470M CPU @ 3.20 GHz.

Experimental Results.
In the calculation process of a meta-heuristic algorithm, random numbers in the solution space are generally used as the initial values, so the calculation results of the algorithm may differ depending on the initial values. Therefore, to avoid the influence of special data on the overall results, 50 independent experiments are performed for each benchmark function, and the same initial values are used for each independent experiment. This section gives the test result data of the 6 algorithms on the 29 benchmark functions in different dimensions, where the experiment dimensions include D = 10, D = 30, D = 50, and D = 100. The specific experimental results are shown in Tables 3-14. Among them, the experimental results of the unimodal and multimodal benchmark functions in the 4 dimensions are recorded in Tables 3, 6, 9, and 12, respectively. Similarly, the experimental results of the hybrid functions are recorded in Tables 4, 7, 10, and 13, and the experimental results of the composition benchmark functions are recorded in Tables 5, 8, 11, and 14. In order to verify the performance of the GMO algorithm, the mean, standard deviation, and running time of each benchmark function over the 50 independent experiments are selected as evaluation indicators. The mean evaluates the computing power and accuracy of the algorithm, the standard deviation evaluates the computational stability of the algorithm, and the running time reflects the complexity of the algorithm. In addition, in order to display the experimental results more clearly and intuitively, each table also records the ranking of the average value and the results of the significance test. The rank of the average value is determined by the numerical value of the test results: the algorithm with the smallest average value is ranked 1st and the algorithm with the largest average value is ranked 6th; algorithms with the same average value receive the same rank, which then occupies two positions. The final overall ranking of an algorithm is determined by the average of its rankings over all functions. The significance test uses statistical methods to explore whether there are significant differences in data distributions. In this paper, the significance test is performed between the 50 calculation results of the GMO algorithm and those of each comparison algorithm. The Wilcoxon rank-sum test or the independent-sample t-test (T-test) is used depending on the type of data: according to the results of the data normality test and the variance homogeneity test, the T-test is performed for normally distributed data, and the Wilcoxon rank-sum test is performed otherwise. The level of statistical significance is set at p = 0.05. p < 0.05 means that the calculation results of the GMO algorithm are significantly different from those of the comparison algorithm, which is recorded as "1" in the tables; p > 0.05 means no significant difference, which is recorded as "0".

ALGORITHM 1: The pseudocode of GMO (excerpt: update the positions of all individuals; update the migration group range radius by equation (5); recalculate the fitness values and select the optimal individual in each migration group as the new head goose; record the optimal fitness value and its individual location information).

TABLE 2 (excerpt): the number of food sources is 50, and the limit is 20.
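The rank-sum comparison of two algorithms' 50 run results can be sketched as follows. This is a plain-NumPy Wilcoxon rank-sum test using the large-sample normal approximation (ties are not corrected for); the run results below are synthetic, and a library routine such as scipy.stats.ranksums would be preferable in practice.

```python
import numpy as np
from math import erf, sqrt

def rank_sum_test(a, b):
    # Wilcoxon rank-sum test, normal approximation: rank the pooled
    # samples, sum the ranks of sample a, and compare with the mean and
    # standard deviation of that sum under the null hypothesis.
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n1, n2 = a.size, b.size
    combined = np.concatenate([a, b])
    ranks = np.empty(n1 + n2)
    ranks[np.argsort(combined)] = np.arange(1, n1 + n2 + 1)
    W = ranks[:n1].sum()                              # rank sum of sample a
    mu = n1 * (n1 + n2 + 1) / 2.0                     # mean of W under H0
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)      # std of W under H0
    z = (W - mu) / sigma
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))  # two-sided p

rng = np.random.default_rng(0)
runs_a = rng.normal(0.0, 1.0, 50)      # 50 synthetic run results of algorithm A
runs_b = rng.normal(2.0, 1.0, 50)      # 50 synthetic run results of algorithm B
significant = int(rank_sum_test(runs_a, runs_b) < 0.05)   # "1": p < 0.05
```

With the two samples shifted by two standard deviations, the test reports a significant difference, which would be recorded as "1" in the tables.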

Evaluation of Exploitation and Exploration Capabilities.
The unimodal functions (F1, F3) are often used to verify the exploitation ability of an algorithm because they have only one global optimum. The multimodal functions (F4-F10) are well suited to testing the exploration ability of an algorithm because of their multiple local optima. The corresponding experimental results are recorded in Tables 3, 6, 9, and 12. In the case of D = 10, the calculation results of the GMO algorithm are better than those of the comparison algorithms. In the case of D = 30, the test results of the GMO algorithm on 6 functions are the optimal values; the results on the F3, F6, and F9 functions are not optimal but remain competitive. In the cases of D = 50 and D = 100, only the test result of the GMO algorithm on the F3 function is not the optimal value. In addition, the comprehensive ranking of the averages in Tables 3, 6, 9, and 12 is shown in Figure 5; it can be seen that the GMO algorithm has the best computation results. Meanwhile, the box plot of the convergence results obtained by the 50 experiments on the F1-F10 functions (taking D = 50 as an example) is shown in Figure 6. The figure shows that the GMO algorithm maintains a lead in convergence accuracy and stability.
Based on the analysis of the above data, the GMO algorithm proposed in this paper has good exploitation ability, exploration ability, and computational stability. This may be attributed to two points. One is that the migration group members move randomly in a small area near the head geese during the free foraging process. The other is that the individuals in each migration group keep moving synchronously during the migration process, which effectively expands the scope of exploration.

Ability to Avoid Local Minima.
F11-F20 are hybrid functions, and F21-F30 are composition functions. These complex functions are obtained by combining, rotating, and offsetting the basic functions. Their common feature is that there are a large number of local extrema in the solution space, which makes the solution space highly challenging. Based on the above data analysis, the GMO algorithm has the comprehensive ability to solve complex problems of different dimensions. It balances the contradiction between exploitation and exploration well in complex solution spaces, and the algorithm shows good stability. This may be attributed to the alternation between the synchronous migration and free foraging processes in the GMO algorithm.

Convergence Analysis.
The convergence information during the solving process is fully displayed in the average convergence curves, which are very important for analyzing the computational power of the algorithm. Taking D = 50 as an example, this paper gives the average convergence curves of the GMO algorithm and the 5 comparison algorithms on the 29 functions, as shown in Figure 10. From the overall results, the convergence results of the GMO algorithm are the best on 27 functions and rank second on two functions (F3, F22), which powerfully illustrates the advantage of the GMO algorithm in terms of convergence ability. From the convergence behavior on a single function, the convergence speed of the GMO algorithm is slow in the early stage; however, the GMO algorithm converges fast in the middle stage and quickly converges to the global optimum.
This may be attributed to the large radius of the migration groups in the early stage of the GMO algorithm. The wild geese fully explore the solution space during the synchronous migration process and store the exploration results. As the algorithm iterates, the range radius of the migration groups is reduced and the positions of the head geese are continuously optimized, so that the algorithm converges quickly until the best convergence effect is achieved.

Analysis of Significance Test and Running Time.
In this section, the experimental results are further analyzed by statistical methods. The significance test (Wilcoxon rank-sum test or T-test) results for all data tables in Section 3.2 are counted, as shown in Table 15. In the table, "1" indicates a significant difference between the two samples, and "0" means no significant difference. "+" indicates that the performance of the GMO algorithm is better than that of the other algorithm, and "−" indicates that the performance of the GMO algorithm is worse. Therefore, the number of "1+" results is counted, which can strongly demonstrate the advantages of the GMO algorithm.
From the statistical results in Table 15, it can be seen that, compared with the WOA, PSO, BRO, and CSO algorithms, the GMO algorithm obtains at least 26 results of "1+," and compared with the ABC algorithm, it obtains at least 20 results of "1+." Overall, the significance test results of the GMO algorithm against the other 5 algorithms reach "1+" more than 96% of the time, which further illustrates the advantages of the GMO algorithm.
According to the data tables in Section 3.2, the running times of all algorithms are further counted, as shown in Table 16. The statistical results show that the average running time of the GMO algorithm is similar to that of the PSO algorithm and lower than those of the WOA, BRO, and CSO algorithms. In addition, the benchmark functions corresponding to the

Comparative Analysis.
In this paper, D = 30 is taken as an example, and the experimental results of GMO are compared with the data in the literature [44,45,59,60], as shown in Table 17. It can be seen from the table that the calculation results of the GMO algorithm are significantly better than those of the FSA and KABC algorithms. The performance of the GMO algorithm is similar to that of the FDB-SOS algorithm on unimodal and combinatorial functions, but the GMO algorithm performs better on multimodal functions. Compared with the FDB-SFS algorithm, the calculation results of the GMO algorithm are of the same order of magnitude on most functions. This shows that the GMO algorithm is competitive even with these improved algorithms.

GMO Algorithm for Engineering Design Problems
In order to verify the applicability of the GMO algorithm to engineering design problems, this section selects five classical structure design problems and uses the GMO algorithm to solve them. In the experimental process, the design variables are used as the individual's location information in the optimization algorithm, and the calculation model of each problem is used as the objective function. First, the structure design problems are introduced in detail. The problems include the three-bar truss design problem, pressure vessel design problem, tension/compression spring design problem, gear train design problem, and cantilever beam design problem. Then, to prove the superiority of the GMO algorithm in solving engineering design problems, its experimental results are compared with the corresponding results of several other algorithms. The results of the other algorithms come from literature reports, including the KABC [60], DMMFO [61], GOA [62], LSA [63], ALO [38], CS [31], GSA [20], IAPSO [64], CPSO [65], MABGA [66], MBA [67], SOS [68], and CBO [69] algorithms. Finally, all experimental results are analyzed and discussed.

Three-Bar Truss Design Problem.
Three-bar truss design is a classical optimization problem in mechanics [3,38], and its mechanism schematic is shown in Figure 11. The problem aims to minimize the volume of a three-bar truss structure while satisfying the constraints on stress and loading force. The cross-sectional areas (x1, x2) of the connecting rods are used as the optimization variables, and the optimization objective function is as follows.
In the process of optimizing the variables, the design variables need to meet the constraints on structural stress, material deflection, and buckling. The three constraint formulas are as follows.

Figure 11: Schematic of three-bar truss mechanism.
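The objective and constraint equations are not reproduced in this excerpt; the sketch below uses the standard formulation of this benchmark from the literature (l = 100 cm, P = σ = 2 kN/cm²), with a simple penalty scheme that is an illustrative choice, not necessarily the constraint handling used in the paper.

```python
import math

# standard parameter values for this benchmark: l = 100 cm, P = sigma = 2 kN/cm^2
L_BAR, P, SIGMA = 100.0, 2.0, 2.0

def truss_fitness(x1, x2, penalty=1e6):
    """Truss volume plus a penalty on the stress, deflection, and
    buckling constraints (standard formulation of the benchmark)."""
    volume = (2.0 * math.sqrt(2.0) * x1 + x2) * L_BAR
    denom = math.sqrt(2.0) * x1**2 + 2.0 * x1 * x2
    g1 = (math.sqrt(2.0) * x1 + x2) / denom * P - SIGMA  # stress in bar 1
    g2 = x2 / denom * P - SIGMA                          # stress in bar 2
    g3 = P / (math.sqrt(2.0) * x2 + x1) - SIGMA          # stress in bar 3
    return volume + penalty * sum(max(0.0, g) for g in (g1, g2, g3))
```

At the best-known design x ≈ (0.7887, 0.4082) the volume is about 263.9 and the first stress constraint is active.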
Computational Intelligence and Neuroscience

where P = 2 kN/cm² and σ = 2 kN/cm². According to equations (6) and (7), the GMO algorithm is used to solve the three-bar truss problem, and the results are shown in Table 18. Compared with the results of the other algorithms, the fitness values of the GMO, ALO, and GSA algorithms are optimal, and the solution results satisfy the constraints. This shows that the GMO algorithm is feasible for solving the three-bar truss design problem.

Pressure Vessel Design Problem.
Kannan and Kramer [70] proposed the pressure vessel design problem, which is to minimize the manufacturing cost under the given constraints. The structure schematic is shown in Figure 12. This problem consists mainly of 4 design variables: x1 is the shell thickness of the pressure vessel, x2 is the thickness of the head, x3 is the inner ring radius of the pressure vessel, and x4 is the length of the cylindrical section. The calculation model is as follows.

where x1, x2 ∈ [0, 100] and x3, x4 ∈ [10, 200], in which x1 and x2 are integer multiples of 0.0625. According to the design specification, the constraint formulas are as follows.
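The cost model and the four constraint formulas are not reproduced in this excerpt; the sketch below uses the standard Kannan and Kramer formulation from the literature, with an illustrative penalty weight.

```python
import math

def vessel_fitness(x, penalty=1e6):
    """Manufacturing cost of the pressure vessel plus a penalty on the
    four standard constraints (Kannan and Kramer formulation)."""
    x1, x2, x3, x4 = x  # shell thickness, head thickness, inner radius, length
    cost = (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
            + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)
    g = (
        -x1 + 0.0193 * x3,            # minimum shell thickness
        -x2 + 0.00954 * x3,           # minimum head thickness
        -math.pi * x3**2 * x4 - (4.0 / 3.0) * math.pi * x3**3 + 1296000.0,  # volume requirement
        x4 - 240.0,                   # maximum cylinder length
    )
    return cost + penalty * sum(max(0.0, gi) for gi in g)
```

The 0.0625 discretization of x1 and x2 would be enforced by rounding candidate solutions before evaluation.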
The calculation results of the GMO algorithm and the other 9 algorithms for the pressure vessel design problem are shown in Table 19. The table shows that the results of the KABC, DMMFO, MABGA, and MBA algorithms do not meet the constraints on the variables, which is not desirable. However, the proposed GMO algorithm finds a design with the optimal value identical to the LSA, CS, GSA, IAPSO, and CPSO algorithms and satisfies the variable constraints. Therefore, the algorithm is also applicable to the pressure vessel design problem.

Tension/Compression Spring Design Problem.
Achieving tension/compression spring weight minimization, while satisfying specification and theoretical constraints, is an interesting problem.
This problem was described by Belegundu and Arora [71], and the structure is shown in Figure 13. The calculation model of the tension/compression spring weight is as follows.
where x1, x2, x3 are the design variables, which are the wire diameter, coil diameter, and number of coils, respectively. The value ranges of the design variables are 0.05 ≤ x1 ≤ 2, 0.25 ≤ x2 ≤ 1.3, and 2 ≤ x3 ≤ 15.

Figure 12: Schematic of pressure vessel structure.

At the same time, the problem also needs to meet design theories such as minimum deflection and shear stress. The specific constraint formulas are as follows.
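The constraint formulas are not reproduced in this excerpt; the sketch below uses the standard four-constraint formulation of this benchmark from the literature, again with an illustrative penalty scheme.

```python
def spring_fitness(x, penalty=1e6):
    """Spring weight plus a penalty on the four standard constraints
    (deflection, shear stress, surge frequency, outer diameter)."""
    d, D, N = x  # wire diameter x1, coil diameter x2, number of coils x3
    weight = (N + 2.0) * D * d**2
    g = (
        1.0 - (D**3 * N) / (71785.0 * d**4),                 # minimum deflection
        (4.0 * D**2 - d * D) / (12566.0 * (D * d**3 - d**4))
            + 1.0 / (5108.0 * d**2) - 1.0,                   # shear stress
        1.0 - 140.45 * d / (D**2 * N),                       # surge frequency
        (d + D) / 1.5 - 1.0,                                 # outer diameter limit
    )
    return weight + penalty * sum(max(0.0, gi) for gi in g)
```

Best-known designs for this problem have a weight of about 0.0127, with the deflection and shear-stress constraints nearly active.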
The calculated results of the GMO algorithm for the tension/compression spring design problem are shown in Table 20 and compared with the results of 7 other algorithms. It can be seen that the calculated values of all variables meet the constraints and that the results of the GMO algorithm are very competitive.

Gear Train Design Problem.
The gear train design is a significant engineering design problem in mechanical transmission [72,73]. The process determines the number of teeth on each gear in the transmission system according to a required transmission ratio. The gear train is shown in Figure 14. The design variables for this problem are the numbers of teeth of the 4 gears (x1, x2, x3, x4). The mathematical model is as follows.

Figure 13: Schematic of the tension/compression spring.
Figure 15: Schematic of the cantilever beam structure.
The gear train design problem has a unique solution, and the elements of the solution vector must be integers. The optimization results of the GMO algorithm for the gear train design problem are the same as those of the ALO, IAPSO, and MBA algorithms, as shown in Table 21. It can be seen that the result of the GMO algorithm is optimal and feasible for this problem.
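The mathematical model is not reproduced in this excerpt; the standard objective of this benchmark in the literature is the squared error between the achieved gear ratio and the target ratio 1/6.931, which the sketch below assumes.

```python
def gear_ratio_error(x):
    """Squared error between the achieved gear ratio x2*x3/(x1*x4) and
    the target ratio 1/6.931; teeth counts are integers in [12, 60]."""
    x1, x2, x3, x4 = x
    return (1.0 / 6.931 - (x2 * x3) / (x1 * x4)) ** 2
```

The best-known integer solution, e.g. (43, 16, 19, 49), reaches an error of roughly 2.7E − 12, which matches the optimal values reported for this problem.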

Cantilever Beam Design Problem.
The cantilever beam design problem is a common engineering problem [74], and its structural diagram is shown in Figure 15. The cantilever beam is composed of 5 sections of square steel with equal wall thickness, and the design variables are the section side lengths of the 5 sections (x1, x2, x3, x4, x5). The design objective is the minimum weight of the cantilever beam. The calculation model is established in equation (13), and equation (14) is the constraint formula.
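Equations (13) and (14) are not reproduced in this excerpt; the sketch below uses the standard formulation of this benchmark from the literature (a linear weight model and a single deflection constraint), with an illustrative penalty weight.

```python
def cantilever_fitness(x, penalty=1e6):
    """Cantilever beam weight plus a penalty on the single deflection
    constraint (standard formulation of the benchmark)."""
    x1, x2, x3, x4, x5 = x
    weight = 0.0624 * (x1 + x2 + x3 + x4 + x5)
    g = 61.0/x1**3 + 37.0/x2**3 + 19.0/x3**3 + 7.0/x4**3 + 1.0/x5**3 - 1.0
    return weight + penalty * max(0.0, g)
```

Best-known designs for this problem have a weight of about 1.34, with the deflection constraint active.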
The experimental results of the GMO algorithm on the cantilever beam design problem are shown in Table 22. The table shows that the calculation results of all algorithms satisfy the constraints and that the optimal fitness values are very close, proving that the GMO algorithm obtains satisfactory results. The comparison results of the above five engineering design problems show that the GMO algorithm has good applicability to practical engineering problems in complex unknown spaces and achieves satisfactory calculation results. This proves that the GMO algorithm is a promising meta-heuristic optimization algorithm.

GMO Algorithm for Inverse Kinematics Solution
This paper takes the 7R 6DOF robot as an example to study the inverse kinematics solution of robots by the GMO algorithm. The 7R 6DOF robot is composed of 7 rotary joints, which are driven by 6 motors. The robot structure is shown in Figure 16. It has the characteristics of a hollow wrist and flexible movement, which make it suitable for work in narrow spaces and along complex paths. However, the lack of an analytical solution for its inverse kinematics limits field applications. Therefore, numerical methods may be the only feasible way to solve the inverse kinematics of this robot.

Kinematic Modeling of the 7R 6DOF Robot.
In this paper, the D-H parameter method is used to establish the kinematic model of the 7R 6DOF robot. The forward kinematics model is as follows.
where ^0_7T is the pose matrix of the end effector and ^(i−1)_iT is the coordinate transformation matrix between adjacent links of the robot. The specific transformation matrix is as follows.
where a_i, d_i, α_i, and θ_i represent the link length, link offset, link twist angle, and joint angle, respectively. Among them, a_i, d_i, and α_i are the fixed parameters of the rotary-joint robot, and θ_i is the control parameter. This paper takes the IRB5400 robot with a 7R 6DOF structure as an example, and its D-H parameters are shown in Table 23 [75]. According to the input robot joint angles, the pose matrix of the robot end position is solved through the forward kinematics model, where n_x, n_y, n_z, o_x, o_y, o_z, α_x, α_y, α_z represent the rotational elements of the pose matrix and p_x, p_y, p_z represent the elements of the position vector. In order to realize the step-by-step optimization of the GMO algorithm in the inverse kinematics solution process, the objective function is designed in equation (17) as the difference between the expected value and the actual value of the pose matrix.
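The D-H link transform and its chaining into the forward kinematics model can be sketched as follows; this uses the standard D-H convention T = Rz(θ)·Tz(d)·Tx(a)·Rx(α) with plain list-of-lists matrices for illustration.

```python
import math

def dh_transform(a, d, alpha, theta):
    """Single-link D-H transform T = Rz(theta) * Tz(d) * Tx(a) * Rx(alpha)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def matmul4(A, B):
    """4x4 matrix product for chaining link transforms."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(links, thetas):
    """Chain the link transforms into the end-effector pose matrix.
    links: list of fixed (a_i, d_i, alpha_i) parameters (e.g. Table 23);
    thetas: joint angles (the control parameters)."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]
    for (a, d, alpha), theta in zip(links, thetas):
        T = matmul4(T, dh_transform(a, d, alpha, theta))
    return T
```

Given the fixed D-H parameters, each candidate joint-angle vector produced by the optimizer maps to a pose matrix through this chain.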
where n̂, ô, α̂, and p̂ represent the rotational and position vectors of the expected pose matrix and c is the adjustment factor.
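The exact form of equation (17) is not reproduced in this excerpt; a plausible sketch of such a pose-difference objective, with c as the adjustment factor weighting the position part against the rotation part, is:

```python
def pose_error(T_actual, T_expected, c=1.0):
    """Sum of absolute differences between the actual and expected pose
    matrices: rotation part plus c times the position part. The exact
    weighting in equation (17) may differ; c is the adjustment factor."""
    rot = sum(abs(T_actual[i][j] - T_expected[i][j])
              for i in range(3) for j in range(3))
    pos = sum(abs(T_actual[i][3] - T_expected[i][3]) for i in range(3))
    return rot + c * pos
```

Minimizing this value over the joint angles drives the forward-kinematics pose toward the desired pose.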

Experiment and Result Analysis.
According to the forward kinematics model and objective function of the 7R 6DOF robot, the inverse kinematics experiment of the GMO algorithm takes the joint angles of the robot as the optimization variables and the desired end pose as the optimization goal. Then, to prove the GMO algorithm's computational performance in solving the inverse kinematics of the robot, its experimental results are compared with those of the WOA, PSO, BRO, CSO, and ABC algorithms. In the experiment, two pose matrices of the robot end position are randomly selected as the test points, as shown in Table 24. The population size is N = 100, the maximum number of iterations is T = 500, and the adjustment factor is c = 1.
During the experiment, in order to avoid the influence of accidental results, 50 independent experiments are conducted at each test point, and the best, worst, mean, and standard deviation of each algorithm's convergence results are recorded. The results are shown in Table 25. It can be seen from the table that the average value of the GMO algorithm reaches 1.0E − 11 on the two test points, which is at least 5 orders of magnitude better than the other algorithms. The best, worst, and standard deviation values are also better than those of the other 5 algorithms. The average convergence curves are shown in Figure 17, which shows that the GMO algorithm has fast convergence speed and high accuracy. However, the effectiveness of solving the inverse kinematics problem can be verified more directly by the independent errors of the elements of the pose matrix. As shown in Table 26, the independent errors of each element in the pose matrix are calculated. It can be seen that the error of each element in the solution obtained by the GMO algorithm is less than 1.0E − 15, which is lower than even the minimum error achieved by the other algorithms. The experimental results verify the feasibility of the GMO algorithm for solving the inverse kinematics problem.
In recent years, scholars have made many valuable explorations in solving the inverse kinematics of robots through intelligent methods.
This paper collects the experimental results from the literature and compares them with the solution results of the GMO algorithm, as shown in Table 27. It can be seen that scholars have achieved more

Conclusion
In this paper, the wild geese migration optimization (GMO) algorithm is proposed, inspired by the behavior of wild geese migration. The mathematical optimization model of the GMO algorithm is designed by simulating the special migration process of the wild geese, and it has the advantages of a simple structure and few parameters. In order to verify the optimization ability of the GMO algorithm, 50 independent experiments are conducted on each of the 29 stable benchmark functions from CEC2017. The primary performance evaluation indicators are the mean, standard deviation, significance test results, and the algorithm's running time. The test results of the GMO algorithm and the WOA, PSO, BRO, CSO, and ABC algorithms are statistically analyzed. It can be seen that the GMO algorithm has apparent advantages in computing performance and can better balance exploitation and exploration. It is a sufficiently competitive optimization algorithm.
In addition, the GMO algorithm is used to solve five engineering optimization problems, and the solution results are compared with the results provided in other studies. The comparison shows that the GMO algorithm obtains excellent solutions, and the experimental results meet the constraints of the engineering optimization problems. This shows that the GMO algorithm has satisfactory computing performance and universality in the face of unknown spaces and complex practical problems. Finally, the GMO algorithm is applied to the inverse kinematics pose problem of the 7R 6DOF robot.
The experimental results show that the average solution accuracy of the end pose obtained by the GMO algorithm reaches 1.0E − 11, which is at least 5 orders of magnitude higher than that of the comparison algorithms.
The GMO algorithm provides a new solution for the inverse kinematics of the complex 7R 6DOF robot, showing that the algorithm has strong practicability and good development prospects.
In future work, we will study an independent optimization mechanism for the migration groups in the GMO algorithm, extend the GMO algorithm to multiobjective optimization problems, and explore more valuable practical application cases.