Fruit Fly Optimization Algorithm Based on Single-Gene Mutation for High-Dimensional Unconstrained Optimization Problems

The fruit fly optimization (FFO) algorithm is a recent swarm intelligence optimization algorithm. In this study, an adaptive FFO algorithm based on single-gene mutation, named AFFOSM, is designed to address the inefficiency of the all-gene mutation mode when solving high-dimensional optimization problems. A few adaptive strategies are core to the AFFOSM algorithm: an arbitrary population size, mutation modes chosen by a predefined probability, and variation extents that change with the optimization progress. First, an offspring individual is reproduced from the historical best fruit fly individual, an elite reproduction mechanism. Then either uniform mutation or Gauss mutation occurs, with a predefined probability, in a randomly selected gene. The variation extent is dynamically changed with the optimization progress. The simulation results show that the AFFOSM algorithm has better convergence accuracy and global search capability than the ESSMER algorithm and several improved versions of the FFO algorithm.

Fruit flies are insects that eat decaying plants, especially fruits. Fruit flies acquire chemical information from their environment through smell and taste receptors on the surface of their bodies and then regulate behaviors such as foraging, aggregation, mating, and spawning. In these processes, olfaction plays an important role over both long and short distances.
Fruit flies have olfactory and visual abilities superior to those of many other species. When foraging, fruit flies first use their olfactory organs to smell the odor from a food source and exchange odor information with the surrounding fruit flies, a process known as the olfactory foraging phase. Then, the flies use their visual organs to find and fly to the locations of the flies that gathered the best odor information, a process named the vision foraging phase.
A series of studies shows that some unreasonable algorithmic design choices leave the FFO algorithm ill-equipped to escape local extrema and to handle complex, high-dimensional, nonlinear problems. Taking this as the starting point of the paper, a small-population, adaptive, improved version of the FFO algorithm, named AFFOSM, is developed based on the single-gene mutation mode, in which only one gene of an offspring differs from the elite individual.
In contrast to the single-gene mutation mode, the all-gene mutation mode is adopted by most optimization algorithms, such as the FFO, PSO, and ABC algorithms, in which each gene of an offspring differs from the elite or parent individual. The efficiency of the AFFOSM algorithm presented in this research is evaluated by solving 6 test problems. Optimization results demonstrate that AFFOSM is very competitive compared with state-of-the-art single-gene optimization methods. The rest of the paper is structured as follows. Section 2 describes related work on the analysis and modification of the FFO algorithm. Section 3 describes the development of the FFO algorithm from all-gene mutation to single-gene mutation. The AFFOSM algorithm developed in this study is covered in detail in Section 4. Test problems and optimization results are presented and discussed in Section 5. Section 6 summarizes the main findings of this study and suggests directions for future work.

Related Work
It is worth noting that, in the FFO algorithm, a fruit fly individual is represented by its coordinates in a 2D plane, and the corresponding variable value is calculated as the reciprocal of the Euclidean distance between the individual and the ordinate origin, as illustrated in Figure 1.
Another noteworthy feature of the FFO algorithm is its elitist reproduction strategy. Once a historical best solution is found, fruit fly individuals fly to it and look for food resources around it until a new best solution is found.
There are also some disadvantages, as follows: (1) The algorithm cannot solve problems whose theoretical optimal solution is negative. (2) Population initialization is difficult and time-consuming when the definition domain is far from the ordinate origin. (3) A fixed search range is obviously a poorer choice than a dynamic one. (4) Most searching behavior happens around the ordinate origin because the search over the definition domain is nonuniform. The FFO algorithm works for a class of problems, such as the many test functions whose theoretical optimal solution is very close to zero, but performs poorly on most optimization problems in practical projects. (5) The elitist reproduction strategy can make the fruit fly swarm fall into local extrema easily and leaves it unable to solve complex, high-dimensional, nonlinear problems.
Given the abovementioned facts, many improvements have been made in recent years.
Fu-qiang Xu and Tao [53] presented the G-FFO algorithm with sign processing in a random manner. Inspired by probability estimation for code words in adaptive arithmetic coding, an FFO algorithm with adaptive sign processing (FFOASP) was proposed [54].
Wu Lei et al. propose the SEDI-FFO algorithm, in which more fruit flies fly in the search direction that is best for finding the optimal solution, or at least in a direction close to it [34].
Based on a hybrid location information exchange mechanism, the HFFO algorithm is proposed, which enables flies to communicate with each other and conduct local search in a swarm-based approach [36].
Fan et al. propose the WFFO algorithm, in which an effective whale-inspired hunting strategy is introduced to replace the random search plan of the original FFO [38].
Niu et al. propose an improved FFO algorithm based on differential evolution (DFFO) by modifying the expression of the smell concentration judgment value and by introducing a differential vector to replace the stochastic search [52].
The CEFFO algorithm is proposed, in which trend search is applied to enhance the local searching capability of the fruit fly swarm, and a coevolution mechanism is employed to avoid premature convergence and improve the ability of global searching [40].
The SCA_FFO algorithm [41] is developed by introducing the logic of the sine-cosine algorithm. The fruit fly individual flies outward or inward to find the global optimum.

FFO: From All-Gene to Single-Gene Mutation
Figure 1: Foraging process of fruit fly.

In general, the unconstrained optimization problem can be formulated as an n-dimensional minimization problem as follows:

minimize f(x), x = (x_1, x_2, ..., x_n), subject to l_j ≤ x_j ≤ u_j, j = 1, 2, ..., n,

where f is the objective function, x = (x_1, x_2, ..., x_n) is the set of decision variables, n is the dimensionality, and l_j and u_j are the lower and upper bounds of the decision variable x_j, respectively.

3.1. FFO Algorithm.
In FFO, a fruit fly individual is represented by its coordinates in a plane and generated by uniform mutation around the historical best solution, also called the current population location:

X_i = X_axis + σ · rand, Y_i = Y_axis + σ · rand,

where (X_i, Y_i) is the coordinate pair of the current individual x_i, (X_axis, Y_axis) is the location of the current population δ, σ is the amplitude of the uniform mutation, and rand is a uniformly distributed random number between 0 and 1. The value of x_i is the reciprocal of the Euclidean distance between the fruit fly individual and the ordinate origin:

x_i = 1 / sqrt(X_i^2 + Y_i^2).

It can be seen that this individual representation mechanism limits the performance of FFO.
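The candidate-generation step above can be sketched as follows. This is a minimal Python sketch (the paper's experiments use Matlab); the function name `ffo_candidates` and the symmetric perturbation `2·rand − 1` are assumptions for illustration, and a tiny floor on the distance guards against division by zero at the origin.

```python
import math
import random

def ffo_candidates(x_axis, y_axis, sigma, pop_size):
    """Generate FFO candidates around the swarm location (X_axis, Y_axis).

    Each individual is offset within radius sigma; its decision value is
    the reciprocal of its Euclidean distance to the ordinate origin.
    """
    candidates = []
    for _ in range(pop_size):
        xi = x_axis + sigma * (2 * random.random() - 1)
        yi = y_axis + sigma * (2 * random.random() - 1)
        dist = math.sqrt(xi ** 2 + yi ** 2)
        # Reciprocal distance: the "smell concentration judgment value"
        candidates.append(1.0 / max(dist, 1e-12))
    return candidates
```

Note how the reciprocal mapping forces candidate values toward zero when the swarm is far from the origin, which is exactly the representation weakness discussed above.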

3.2. LGMS-FFO Algorithm. Shan et al. [55] use a one-dimensional coordinate of the fruit fly to denote the individual location and then let it be equal to the value of x_i, which can be formulated as X_i = x_i. Based on a new linear generation mechanism of candidate solutions, the LGMS-FFO algorithm is proposed:

X_i = X_axis + ω · rand, ω = ω_0 · α^Iter,

where ω is a weight factor to tune the variation extent, ω_0 is the initial weight, α is the weight coefficient, and Iter is the current generation.
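The linear generation mechanism can be sketched as below. This is a hedged Python sketch, not the authors' Matlab code: the exponential form of the weight decay and the symmetric perturbation `2·rand − 1` are assumptions inferred from the where-clause above.

```python
import random

def lgms_weight(w0, alpha, iteration):
    # Assumed decay law: w = w0 * alpha ** Iter, so the variation
    # extent shrinks geometrically as generations advance.
    return w0 * alpha ** iteration

def lgms_candidate(x_best, w0, alpha, iteration):
    # Perturb the one-dimensional best location by the decaying weight.
    w = lgms_weight(w0, alpha, iteration)
    return x_best + w * (2 * random.random() - 1)
```

Because the location is the decision value itself (X_i = x_i), the reciprocal-distance mapping of the original FFO is avoided entirely.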
Obviously, the all-gene mutation mode is used to generate offspring individuals in the FFO and LGMS-FFO algorithms. However, the higher the dimensionality of the functions to be optimized, the lower the probability of producing excellent individuals. This in turn causes a low convergence rate when solving high-dimensional functions.

3.3. IFFO Algorithm. Different from FFO and LGMS-FFO, a single-gene mutation mode is introduced in the IFFO algorithm, in which only one randomly selected gene is mutated. It has been demonstrated that the single-gene mutation mode performs better than the all-gene mutation mode for solving high-dimensional functions.
The IFFO algorithm uses a control parameter σ to tune the search scope self-adaptively in a random direction around the current swarm location, and offspring individuals are generated in the single-gene uniform mutation mode [28]:

x_{i,d} = δ_d + σ · (2 · rand − 1), x_{i,j} = δ_j for j ≠ d,

where σ is decreased from a maximum radius σ_max to a minimum radius σ_min as iterations proceed, and d is a random integer between 1 and n.
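The single-gene uniform mutation step can be sketched as follows. A minimal Python sketch, assuming the offspring copies the elite individual in every gene except the randomly chosen one; the helper name `iffo_offspring` is an assumption.

```python
import random

def iffo_offspring(best, sigma):
    """Single-gene uniform mutation: copy the elite individual and
    perturb one randomly chosen gene within radius sigma."""
    child = list(best)
    d = random.randrange(len(best))  # random gene index (0-based here)
    child[d] = best[d] + sigma * (2 * random.random() - 1)
    return child, d
```

Touching a single gene keeps most of a good solution intact, which is why this mode scales better to high dimensions than all-gene mutation.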

AFFOSM Algorithm
The ESSMER algorithm, an improved evolutionary strategy with single-gene mutation and elite reproduction, was presented for solving high-dimensional functions [56]. In the ESSMER algorithm, the best father individual is chosen to generate λ + k offspring, λ by Gauss mutation and k by uniform mutation, with each offspring copying the father in every gene except one:

x_{i,j} = δ_j, i = 1, 2, ..., k, j = 1, 2, ..., n, j ≠ d,

where d is a random integer between 1 and n. σ_t is a variation extent related to the optimization process, and its initial value σ_0 is equal to 2. σ_t is reduced by a quarter once the number of stagnant iterations (recorded as flag) reaches a default value. Inspired by ESSMER, the AFFOSM algorithm, an adaptive FFO algorithm based on single-gene mutation, is designed in this article. First, an offspring individual is reproduced from the historical best father individual. Then a randomly selected gene is modified by either uniform mutation or Gauss mutation, occurring with a predefined probability. Compared with ESSMER, the initial variation extents depend entirely on the problem to be optimized. The initial amplitude of uniform mutation is equal to the width of the definition domain, a very broad range. The initial amplitude of Gauss mutation is equal to a tenth of the definition domain, a relatively small range.
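The ESSMER reproduction scheme can be sketched as below. A hedged Python sketch under the assumptions that mutation is additive on the chosen gene and that the λ Gauss children are generated before the k uniform children; the function name and argument order are illustrative.

```python
import random

def essmer_offspring(delta, lam, k, sigma_t):
    """Generate lam children by single-gene Gauss mutation and k children
    by single-gene uniform mutation from the elite individual delta."""
    children = []
    for i in range(lam + k):
        child = list(delta)
        d = random.randrange(len(delta))  # single mutated gene
        if i < lam:
            child[d] = delta[d] + sigma_t * random.gauss(0.0, 1.0)
        else:
            child[d] = delta[d] + sigma_t * (2 * random.random() - 1)
        children.append(child)
    return children
```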
In the process of optimization, the amplitude σ_f of uniform mutation is cyclically adjusted. At the beginning of each iteration, σ_f is reduced by a quarter. σ_f is reverted to its initial value once a better solution is achieved or a predefined accuracy is reached. The amplitude σ_s of Gauss mutation is dynamically changed with the optimization progress. Consecutive stagnant iterations are counted by a variable named flag. The amplitude σ_s remains unchanged as long as flag is less than a predefined value, such as 30. Once flag reaches 30, the amplitude σ_s is successively reduced by a quarter. Meanwhile, a variable, written as σ_temp, is used to save and restore the new amplitude σ_s once a better solution is found. The AFFOSM algorithm mainly includes 3 steps, as follows:

Step 1. Initialization.
Step 2. Update the mutation amplitudes and generate new solutions. Before a new iteration, the mutation amplitudes are updated as follows:

σ_f = (3/4) · σ_f; if flag ≥ 30, then σ_s = (3/4) · σ_s and σ_temp = σ_s,

where the variable σ_temp is used to guarantee the continuity of the Gauss mutation amplitude.

Generate new solutions
For each individual, the uniform mutation is executed with probability 0.2:

x_d = δ_d + σ_f · (2 · rand − 1),

and the Gauss mutation is executed with probability 0.8:

x_d = δ_d + σ_s · N(0, 1),

where d is a random integer between 1 and n and N(0, 1) is a standard normally distributed random number; all other genes satisfy x_j = δ_j, j ≠ d.
Step 3. Evaluate each new solution. If a better solution is discovered, then update δ and set flag, σ_f, and σ_s to 0, σ_max, and σ_temp, respectively. Otherwise, increment flag by 1. Return to Step 2.
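The three steps above can be assembled into one loop, sketched below in Python (the paper's experiments use Matlab). This is a minimal sketch under stated assumptions: bounds are the same for every dimension, offspring are clamped to the domain, and the σ_temp bookkeeping is simplified so that σ_s simply stops shrinking once improvements resume; names such as `affosm` and `flag_limit` are illustrative.

```python
import random

def affosm(f, lower, upper, n, pop_size=3, iters=3000,
           p_uniform=0.2, flag_limit=30, seed=1):
    """Sketch of an AFFOSM-style loop: elite reproduction plus
    single-gene uniform/Gauss mutation with adaptive amplitudes."""
    rng = random.Random(seed)
    delta = [lower + (upper - lower) * rng.random() for _ in range(n)]
    best = f(delta)
    sigma_max = upper - lower       # initial uniform-mutation amplitude
    sigma_f = sigma_max
    sigma_s = sigma_max / 10.0      # initial Gauss-mutation amplitude
    flag = 0                        # stagnant-iteration counter
    for _ in range(iters):
        sigma_f *= 0.75             # shrink uniform amplitude each iteration
        if flag >= flag_limit:
            sigma_s *= 0.75         # shrink Gauss amplitude on stagnation
        improved = False
        for _ in range(pop_size):
            child = list(delta)
            d = rng.randrange(n)    # single-gene mutation
            if rng.random() < p_uniform:
                child[d] += sigma_f * (2 * rng.random() - 1)
            else:
                child[d] += sigma_s * rng.gauss(0.0, 1.0)
            child[d] = min(max(child[d], lower), upper)
            val = f(child)
            if val < best:
                delta, best = child, val
                improved = True
        if improved:
            flag, sigma_f = 0, sigma_max   # reset on improvement
        else:
            flag += 1
    return delta, best
```

Run on a simple sphere function, the loop steadily reduces the objective: large uniform moves explore early, while the shrinking Gauss amplitude refines the elite solution late.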
The computational complexity of AFFOSM is determined by the population size N and the number of iterations iter_max. The computational cost of one fruit fly at each iteration is constant, so the computational complexity of AFFOSM can be summarized as O(N × iter_max), the same as the original FFO, IFFO, and ESSMER.

Test Functions and Results Analysis
To verify the proposed AFFOSM algorithm, a total of 6 benchmark problems with different characteristics are listed in Table 1, where n denotes the dimensionality of the functions and f(x*) is the global optimum.
As another improved version of the FFO algorithm, SFFO [57] adaptively adjusts its search along an appropriate decision variable based on its previous experience in generating promising solutions.

Functions f1 and f2 are unimodal. Function f1 is the Rosenbrock function, also referred to as the valley or banana function; its global minimum lies in a narrow, parabolic valley. Function f2 is the sphere function, also referred to as the harmonic function, with a single global minimum. Functions f3–f6 are, respectively, the Ackley, Alpine, Griewank, and Rastrigin functions. They are multimodal; each has a large number of local minima and is difficult to optimize. In the ESSMER algorithm, the population size N = 10, λ = 8, k = 2, and σ_0 = 2. For the IFFO and SFFO algorithms, N = 3 and σ_max = (u − l)/2. For the AFFOSM algorithm, N = 3 and σ_max = u − l. σ_min = 10^−7 for all algorithms. These algorithms are coded in Matlab 7.1 and run on the Windows 10 operating system on an Intel(R) Core(TM) i7-6600U CPU @ 2.60 GHz with 8 GB RAM.
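Since Table 1's expressions did not survive extraction, the six benchmarks can be stated in code. These are the standard textbook definitions of the named functions, sketched in Python; the original table may use slightly different domains or scalings.

```python
import math

def rosenbrock(x):  # f1, unimodal; global minimum 0 at (1, ..., 1)
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))

def sphere(x):      # f2, unimodal; global minimum 0 at the origin
    return sum(v ** 2 for v in x)

def ackley(x):      # f3, multimodal; global minimum 0 at the origin
    n = len(x)
    s1 = sum(v ** 2 for v in x) / n
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e

def alpine(x):      # f4, multimodal; global minimum 0 at the origin
    return sum(abs(v * math.sin(v) + 0.1 * v) for v in x)

def griewank(x):    # f5, multimodal; global minimum 0 at the origin
    s = sum(v ** 2 for v in x) / 4000
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x))
    return s - p + 1

def rastrigin(x):   # f6, multimodal; global minimum 0 at the origin
    return sum(v ** 2 - 10 * math.cos(2 * math.pi * v) + 10 for v in x)
```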
Each problem is run 25 times independently. The medians/standard deviations over these 25 runs and the average ranks of the four algorithms under Friedman's test [58] are reported in Table 2. Comparisons of convergence curves for n = 100 are also presented in Figure 2.
Compared with the ESSMER algorithm, the AFFOSM algorithm performs better for functions f1–f5 with both n = 100 and n = 200. The AFFOSM algorithm produces a smaller median, reduced by 0, 29, 5, 7, and 2 orders of magnitude with n = 100 and by 0, 15, 7, 8, and 2 orders with n = 200, respectively.
Compared with the IFFO algorithm at n = 100, the AFFOSM algorithm performs better for all test functions; the smaller median is reduced by 0, 27, 4, 7, 0, and 6 orders of magnitude. Compared with the IFFO algorithm at n = 200, the AFFOSM algorithm performs better for functions f1, f2, f3, and f4; the smaller median is reduced by 0, 7, 4, and 5 orders of magnitude. Meanwhile, the same median value is obtained for function f5.
Compared with the SFFO algorithm at n = 100, the AFFOSM algorithm performs better for all test functions; the smaller median is reduced by 0, 27, 4, 7, 1, and 6 orders of magnitude.
For test functions f1–f5, AFFOSM performs best at both n = 100 and n = 200. The ESSMER algorithm, meanwhile, performs best only for function f6 at both n = 100 and n = 200. The IFFO algorithm performs best only for function f5 at n = 200. The SFFO algorithm is the best in no case.
Similarly, the average rank R under Friedman's test shows that the proposed AFFOSM scheme is superior to the other single-gene-mutation-based schemes.

Conclusion and Future Work
For high-dimensional problems, algorithms based on single-gene mutation have better convergence and accuracy than those based on all-gene mutation. To overcome the shortcomings of FFO in solving high-dimensional optimization problems, single-gene mutation and an adaptive mutation range control technique are introduced into the AFFOSM algorithm proposed in this article. In the AFFOSM algorithm, the main function of Gauss mutation is to search locally around the historical best solution. Uniform mutation not only searches globally in the whole space but also improves the efficiency of local search when its range becomes significantly smaller than the Gauss mutation range. Rather than being tied to the iteration count, the mutation range is adjusted according to the optimization progress. Simulations show that the AFFOSM algorithm outperforms other algorithms based on single-gene mutation, such as ESSMER, IFFO, and SFFO.
An interesting topic for future research would be investigating the real impact of some parameter values in AFFOSM to analyze their contributions to the algorithm's performance.
Another interesting topic for future research would be applying the AFFOSM algorithm to constrained or multiobjective optimization problems and to practical engineering problems, such as benchmark structural optimization problems.
Data Availability

The data and code used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.