Biogeography-Based Optimization with Orthogonal Crossover

Biogeography-based optimization (BBO) is a new biogeography-inspired, population-based algorithm that mainly uses a migration operator to share information among solutions. Like the crossover operator in genetic algorithms, the migration operator is probabilistic and only generates vertexes of the hyperrectangle defined by the emigration and immigration vectors. The exploration ability of BBO may therefore be limited. The orthogonal crossover operator with quantization technique (QOX) is based on orthogonal design and can generate representative solutions in the solution space. In this paper, a BBO variant is presented that embeds the QOX operator in the BBO algorithm. Additionally, a modified migration equation is used to improve population diversity. Several experiments are conducted on 23 benchmark functions. The experimental results show that the proposed algorithm is capable of locating the optimal or close-to-optimal solution. Comparisons with other BBO variants and state-of-the-art orthogonal-based evolutionary algorithms demonstrate that the proposed algorithm possesses a faster global convergence rate, higher-precision solutions, and stronger robustness. Finally, an analysis of the performance of QOX indicates that QOX plays a key role in the proposed algorithm.


Introduction
Many problems in both industrial applications and scientific research can be regarded as optimization problems. During the past decades, several kinds of classical methods [1, 2] have been proposed to handle optimization problems and have made enormous progress, but these methods require knowledge of properties of the optimization problem, such as continuity or differentiability. In recent decades, many metaheuristic algorithms have sprung up, for example, the genetic algorithm (GA) [3], particle swarm optimization (PSO) [4], the simulated annealing algorithm (SA) [5], differential evolution (DE) [6], and biogeography-based optimization (BBO) [7]. These algorithms can solve optimization problems without using such information as differentiability.
Biogeography-based optimization (BBO) [7], developed by Simon, is a newly emerging population-based evolutionary algorithm motivated by mimicking the migration of species in natural biogeography. Just as species migrate back and forth between islands in biogeography, the migration operator in BBO shares information between habitats in the population.
That is to say, a good solution can share its features with poor ones through the migration operator, and a poor solution can improve its quality by accepting new features from good ones. Thus, BBO possesses powerful exploitation ability. However, the migration operator does not produce new suitability index variables (SIVs), which may lead to weak exploration ability and poor population diversity.
In evolutionary algorithms, the step of generating new solutions can be considered an "experiment." For example, the operation of using a crossover operator to sample genes from parents and produce new offspring can be regarded as a sampling experiment. Orthogonal experimental design is a method for arranging multifactor experiments uniformly; it has been developed to sample a small but representative set of combinations for experimentation. Zhang and Leung [8] introduced orthogonal experimental design into the genetic algorithm and proposed orthogonal crossover (OX). Leung and Wang [9] applied a quantization technique in orthogonal experimental design and proposed the quantization orthogonal crossover operator (QOX). Experimental studies show that QOX is an effective and efficient operator for numerical optimization.
BBO simultaneously exhibits powerful exploitation ability, weak exploration ability, poor population diversity, and a slow convergence rate, whereas the QOX operator is a global search operator with systematic and rational search ability. In order to enhance the exploration ability and improve the population diversity, in this paper an improved BBO variant, namely biogeography-based optimization with orthogonal crossover (denoted OXBBO), is presented for solving global optimization problems. In the proposed algorithm, the QOX operator is embedded into BBO to enhance its exploration ability and accelerate its convergence rate, and a modified migration operator is used to improve the population diversity.
The rest of the paper is organized as follows. The basic BBO is introduced in Section 2, while Section 3 briefly reviews orthogonal experimental design. The proposed algorithm is described in detail in Section 4. In Section 5, extensive experiments are carried out to test OXBBO. Finally, conclusions and future work are summarized in Section 6.

Biogeography-Based Optimization
BBO is a new population-based, biogeography-inspired global optimization algorithm that has some features in common with other biology-based algorithms. The migration operator is the main operator of BBO and shares information among solutions. PSO and BBO solutions survive forever, while GA solutions "die" at the end of each generation. PSO solutions are more likely to clump together in similar groups, while BBO and GA solutions do not have any built-in tendency to cluster.
BBO is a method motivated by the geographical distribution of biological organisms. In BBO, each individual is considered a "habitat" with a habitat suitability index (HSI). Habitats with a high HSI tend to have a large number of species, while those with a low HSI have a small number of species. Habitats with a high HSI tend to share their features with nearby habitats, while those with a low HSI accept features from neighboring habitats with a high HSI.
The migration strategy in BBO is similar to the global recombination approach in the breeder GA and evolution strategies. The migration operator is a probabilistic operator that adjusts a habitat based on the immigration rate and emigration rate. The probability that habitat H_i is modified is proportional to its immigration rate λ_i, and the probability that the modification source is habitat H_j is proportional to the emigration rate μ_j. The migration operator [7] is described in Algorithm 1.
Besides migration, BBO uses a mutation operator. The mutation rate of a solution is inversely related to its solution probability, m(S_i) = m_max (1 − P_i / P_max), where m_max is a user-defined parameter and P_max is the largest solution probability. Without this mutation, highly probable solutions tend to become dominant in the population, so this mutation scheme tends to increase diversity in the population. The mutation operator [7] can be loosely described in Algorithm 2:

(1) Begin
(2) For j = 1 to Dim
(3)   Use λ_i and μ_i to compute the probability P_i
(4)   Select variable H_i(j) with probability P_i
(5)   If H_i(j) is selected
(6)     Replace H_i(j) with a randomly generated SIV
(7)   End If
(8) EndFor
(9) End
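As a concrete illustration of the migration operator of Algorithm 1, the following minimal Python sketch performs one generation of basic BBO migration. It assumes minimization and the simple linear migration model (λ proportional to rank, μ = 1 − λ after sorting from best to worst); all names are illustrative, not part of the original paper.

```python
import random

def bbo_migration(pop, fitness):
    """One generation of the basic BBO migration operator (a sketch).

    pop     : list of habitats, each a list of SIVs (floats)
    fitness : list of fitness values (lower is better, an assumption)
    Uses the linear model: after sorting best to worst, habitat with
    rank k gets immigration rate (k+1)/NP and emigration rate 1-(k+1)/NP.
    """
    NP = len(pop)
    order = sorted(range(NP), key=lambda i: fitness[i])   # best habitat first
    ranked = [pop[i][:] for i in order]
    lam = [(k + 1) / NP for k in range(NP)]               # worse rank -> more immigration
    mu = [1.0 - (k + 1) / NP for k in range(NP)]          # better rank -> more emigration
    for i in range(NP):
        for d in range(len(ranked[i])):
            if random.random() < lam[i]:                  # habitat i immigrates on SIV d
                # roulette-wheel selection of the emigrating habitat by mu
                r = random.uniform(0, sum(mu))
                acc, j = 0.0, 0
                for k in range(NP):
                    acc += mu[k]
                    if acc >= r:
                        j = k
                        break
                ranked[i][d] = ranked[j][d]               # copy the emigrating SIV
    return ranked
```

Note that, exactly as the paper observes, every new SIV is copied from an existing habitat, so no new values enter the population.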
As some scholars have pointed out, BBO may have some deficiencies. For example, BBO is good at exploiting the solution space but weak at exploring it. Recently, many researchers have worked on improving BBO, and many variants have been presented.
Improving the operators in BBO is one direction. Gong et al. [11] used a real code to represent the solution, extended BBO to continuous-domain optimization, and presented real-coded BBO (RCBBO). Ma and Simon [16] modified the migration operator with a new migration formula and proposed blended BBO (B-BBO). Li et al. [10] presented perturbing biogeography-based optimization (PBBO) through a perturbing migration design based on the sinusoidal migration curve. Li and Yin [12] proposed another migration operation based on multiparent crossover, called the multiparent migration model, and presented multioperator biogeography-based optimization (MOBBO). Ma [17] generalized the linear migration model to six different migration models and used experimental study and theoretical analysis to investigate their performance.
Hybridization with other EAs is another direction. Cai et al. [18] combined evolutionary programming with BBO to enhance the exploration ability and proposed hybrid BBO (BBO-EP). Wang and Xu [19] combined differential evolution and simplex search with BBO and proposed SSBBODE for solving parameter estimation problems. Lohokare et al. [20] presented intelligent biogeography-based optimization (IBBO) through hybridization of BBO with the bacterial foraging algorithm. Zhu [21] used graphics hardware acceleration to present a massively parallel biogeography-based optimization-pattern search algorithm (BBO-PS). Tan and Guo [22] proposed quantum and biogeography-based optimization (QBBO) by evolving multiple quantum probability models via evolution strategies inspired by the mathematics of biogeography. Gong et al. [15] combined the exploration of DE with the exploitation of BBO effectively and presented a hybrid DE with BBO, namely DE/BBO. Boussaïd et al. [13] proposed a two-stage algorithm (DBBO) through hybridization of BBO with DE, which updates the population by using BBO and DE alternately.
Moreover, theoretical analysis and applications of BBO have also been developing. Simon [23] introduced several simplified versions of BBO and used probability theory to perform an approximate analysis. Simon et al. [24] showed that BBO is a generalization of a genetic algorithm with global uniform recombination (GA/GUR) and compared the BBO and GA/GUR algorithms using analytical Markov models. Simon [25] also introduced dynamic system models for BBO and GA/GUR. Ma et al. [26] incorporated resampling into BBO to solve optimization problems in noisy environments. Ergezer et al. [27] employed opposition-based learning alongside the migration rates of BBO and created oppositional BBO (OBBO).
It is necessary to emphasize that our work adds a new global search operator to the BBO algorithm. Meanwhile, we also aim not to introduce too many new control parameters.

Orthogonal Crossover
In practical applications, it is impossible to consider all combinations in a large sample space; instead, a small set of representative samples can be used to represent it. The orthogonal design method [9], with both an orthogonal array (OA) and factor analysis (FA), is used to arrange multifactor experiments uniformly and to sample well-distributed points in the solution space. An OA is a fractional factorial array of numbers arranged in rows and columns that assures a balanced comparison of the levels of any factor. All columns in an orthogonal array can be evaluated independently of one another. A number of such arrays can be found at http://www2.research.att.com/∼njas/oadir/.
For example, for an experiment with 4 factors and 3 levels, the orthogonal array is L9(3^4). There are 3^4 = 81 combinations in all, but if we apply the orthogonal array, we only need to consider nine of them.
For convenience, we denote an orthogonal array for N factors with Q levels and M combinations as L_M(Q^N) = [a_ij]_(M×N), where a_ij ∈ {1, 2, ..., Q} represents the level of the jth factor in the ith combination. The orthogonal crossover operator was first developed by Zhang and Leung [8]; it integrates orthogonal experimental design with the crossover operator to generate several new solutions in line with an orthogonal array. As a matter of fact, each operation of generating a new offspring can be regarded as an experiment: a crossover is a procedure for sampling several points from a defined region, and the orthogonal crossover operator (OX) uses an orthogonal array to make the crossover more statistically systematic. OX in [8] was used for combinatorial problems. Leung and Wang [9] proposed QOX, which quantizes the solution space [l, u] into Q levels per dimension:

α_i(j) = l(j) + ((i − 1)/(Q − 1)) · (u(j) − l(j)),   i = 1, ..., Q.

In practical applications, the dimension D is often much larger than the number of orthogonal factors N, so L_M(Q^N) cannot be used directly. We therefore first randomly generate N − 1 different integers between 1 and D and use them to divide the D-dimensional solution vector into N factors. Finally, M combinations are generated according to the orthogonal array L_M(Q^N), their habitat suitability indexes are evaluated, and the two best habitats among them are selected to replace their parent habitats.
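The quantization and array-driven enumeration above can be sketched in a few lines of Python. The bounds and the standard L9(3^4) table below are illustrative; the level formula matches the quantization equation given in the text.

```python
# The canonical L9(3^4) orthogonal array: 9 rows (combinations),
# 4 columns (factors), entries are levels 1..3.
L9 = [
    [1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
    [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
    [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1],
]

def quantize(low, up, Q=3):
    """Quantize [low, up] into Q levels: level i = low + (i-1)/(Q-1)*(up-low)."""
    return [low + i * (up - low) / (Q - 1) for i in range(Q)]

def oa_combinations(lows, ups, oa=L9):
    """Generate one candidate per OA row; factor j takes the level its row prescribes."""
    levels = [quantize(l, u) for l, u in zip(lows, ups)]
    return [[levels[j][row[j] - 1] for j in range(len(row))] for row in oa]

cands = oa_combinations([0, 0, 0, 0], [2, 2, 2, 2])
# 9 representative candidates instead of 3**4 = 81 exhaustive combinations
```

Each of the three levels appears exactly three times in every column of L9, which is what makes the nine sampled combinations balanced.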

The Proposed Algorithm: OXBBO
4.1. Orthogonal Crossover Operator. The migration operator plays a key role in BBO. Simon et al. [24] pointed out that the BBO migration strategy is conceptually similar to the GA global uniform recombination operator and can be considered a special GA crossover operator [24]. Like a GA crossover operator, the migration operator only generates and evaluates habitats located at the vertexes of the hyperrectangle defined by the emigration and immigration habitats. The interior and the other vertexes of the hyperrectangle, which might contain promising regions of the solution space, are not considered in this process. This may cause the search to fall into a local optimum, especially in the initial optimization stage. That is to say, the migration operator cannot perform a systematic search in the hyperrectangle, so the exploration ability of BBO could be limited to some extent. The orthogonal crossover operator with quantization technique (QOX) samples a small but representative set of combinations for experimentation. In the search process, QOX searches not only the vertexes of the hyperrectangle defined by the parents but also its interior. QOX can therefore execute a systematic search in the hyperrectangle, which makes it easier to locate the global optimum and overcomes the limitation of the migration operator.
As shown in Figure 1, P1(x1, y1) and P2(x2, y2) are the parents (supposing that one is the emigration habitat and the other the immigration habitat in a two-dimensional solution space). We can see in Figure 1 that the GA crossover operator generates two offspring randomly. The migration operator absorbs better SIVs and deliberately generates one offspring, determined by the immigration and emigration rates, at a vertex. The orthogonal crossover operator with L9(3^4) generates nine potential offspring that lie uniformly at the vertexes and in the interior of the hyperrectangle. Obviously, the orthogonal crossover operator searches the solution space more thoroughly than the GA crossover operator and the migration operator do.
In order to enhance the exploration ability of BBO, we embed the QOX operator with L9(3^4) into BBO. For the sake of randomness, two randomly selected habitats are used as the parents of QOX; they are replaced by the two best habitats chosen from the offspring after QOX is performed. Note that the QOX operator is more costly than the migration and GA crossover operators, since it needs to evaluate 9 offspring when L9(3^4) is used. Considering the limit on the number of fitness function evaluations, we apply QOX with L9(3^4) only once per generation to save computation cost and to simplify the implementation. A brief description of QOX is shown in Algorithm 3.
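The full QOX step described above can be sketched as follows: split the D dimensions into 4 factors at random cut points, quantize each coordinate between the two parents into 3 levels, generate the 9 offspring dictated by L9(3^4), and keep the two best. This is a minimal sketch assuming minimization; the parent-defined per-coordinate bounds and the function names are illustrative.

```python
import random

# Canonical L9(3^4) orthogonal array (levels 1..3, 4 factors, 9 rows).
L9 = [
    [1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
    [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
    [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1],
]

def qox(parent1, parent2, f, Q=3):
    """Quantization orthogonal crossover (sketch): returns the 2 best of
    the 9 offspring prescribed by L9(3^4), under objective f (minimized)."""
    D = len(parent1)
    cuts = sorted(random.sample(range(1, D), 3))      # 3 cut points -> 4 factors
    bounds = [0] + cuts + [D]
    offspring = []
    for row in L9:
        child = [0.0] * D
        for fac in range(4):
            lvl = row[fac]                            # level 1..Q for this factor
            for d in range(bounds[fac], bounds[fac + 1]):
                lo = min(parent1[d], parent2[d])
                hi = max(parent1[d], parent2[d])
                # quantize [lo, hi] into Q levels and pick level `lvl`
                child[d] = lo + (lvl - 1) * (hi - lo) / (Q - 1)
        offspring.append(child)
    offspring.sort(key=f)
    return offspring[:2]                              # the two best habitats
```

Since each call evaluates 9 offspring, applying QOX once per generation, as the paper does, adds a fixed overhead of 9 fitness evaluations.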

4.2. Modified Migration Operator.
As analyzed before, in the migration operator the SIVs of a new habitat migrate from other habitats. This procedure only uses the SIVs of better habitats to replace those of worse habitats and does not produce new SIVs, which results in poor population diversity, especially in the later stage. For this reason, a new modified migration is used to improve the population diversity. In Algorithm 1, formula (∗) is replaced by a new formula:

H_i(SIV) ← H_j(SIV) + rand(0, 1) · (H_r1(SIV) − H_r2(SIV)),   (6)

where r1 and r2 are two different integers generated randomly between 1 and NP (NP is the population size). As formula (6) shows, the difference between two randomly selected habitats is added to the emigrating SIV of habitat H_j, which generates a random perturbation on the emigrating SIV and thereby improves the diversity of the new population. This modified migration stage, as it appears in the main loop of OXBBO, can be outlined as follows:

(1) Begin
(2) Generate initial population Pop randomly
(3) Evaluate each habitat H_i
(4) While the halting criteria are not satisfied do
(5)   Sort habitats from best to worst based on the fitness values
(6)   Map the fitness value to the number of species
(7)   Calculate the emigration rate and immigration rate of each habitat
(8)   For i = 1 to NP  /* migration stage */
(9)     Select H_i based on immigration rate λ_i
(10)    Generate two different integers r1 ≠ r2 randomly between 1 and NP
(11)    If H_i is selected
(12)      For j = 1 to NP
(13)        Select H_j based on emigration rate μ_j
(14)        If rand(0, 1) < μ_j
(15)          Apply the modified migration formula (6)
(16)        End If
(17)      EndFor
(18)    End If
(19)  EndFor
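A single-SIV sketch of the modified migration makes the perturbation explicit. The uniform scaling factor on the difference term is an assumption for illustration; the paper's prose only says the difference of two randomly chosen habitats is added to the emigrating SIV.

```python
import random

def modified_migration_siv(h_j_siv, h_r1_siv, h_r2_siv):
    """Modified emigration of one SIV (a sketch of formula (6)):
    the difference of two randomly chosen habitats' SIVs perturbs the
    emigrating SIV. The rand(0,1) scaling here is an assumption."""
    return h_j_siv + random.random() * (h_r1_siv - h_r2_siv)
```

Unlike plain migration, which copies `h_j_siv` verbatim, this produces SIV values that did not previously exist in the population, which is the source of the improved diversity.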

4.3. Mutation with Cauchy Mutation Operator.
In order to enhance the exploration ability of BBO, a modified mutation operator, the Cauchy mutation operator, is integrated into BBO to replace the randomly generated SIV.
The probability density function of the Cauchy distribution [11] is

f(x; δ) = (1/π) · δ / (δ² + x²),

where x ∈ ℝ and δ > 0 is a scale parameter. A real-valued random variable X that is Cauchy distributed with scale δ > 0 is written X ∼ C(0, δ). The Cauchy mutation (δ = 1) in OXBBO replaces the randomly generated SIV in Algorithm 2 with a Cauchy perturbation of the current SIV:

H_i(j) ← H_i(j) + C(0, 1).

4.4. Boundary Constraints. The OXBBO algorithm assumes that all habitats in the population are limited to an isolated and finite solution space. During optimization some habitats may move out of the solution space, which should be prevented. In order to keep the solutions of bound-constrained problems feasible, any component that violates its boundary constraint is replaced by a newly generated value:

H_i(j) = l_j + rand(0, 1) · (u_j − l_j)  if H_i(j) < l_j or H_i(j) > u_j,

where l_j and u_j are the lower and upper bounds of the solution space, respectively.
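The Cauchy mutation and the boundary-repair rule can be sketched together. Sampling uses the inverse-CDF transform of the Cauchy distribution; the function names are illustrative.

```python
import math
import random

def cauchy(scale=1.0):
    """Sample a Cauchy variate via the inverse CDF:
    x = scale * tan(pi * (u - 0.5)), with u ~ U(0, 1)."""
    return scale * math.tan(math.pi * (random.random() - 0.5))

def mutate_and_repair(siv, low, up):
    """Perturb one SIV with a Cauchy step (scale 1); if the result leaves
    [low, up], regenerate it uniformly inside the bounds, as in the
    boundary-repair rule described above."""
    x = siv + cauchy()
    if x < low or x > up:
        x = low + random.random() * (up - low)
    return x
```

The heavy tails of the Cauchy distribution occasionally produce very large steps, which is why it is favored over Gaussian mutation for escaping local optima.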

4.5. Selection Operator.
The selection operator drives the renewal of the population while retaining good habitats. A greedy selection operator is used in this paper; in other words, a population habitat is replaced by its corresponding trial habitat if the HSI of the trial habitat is better than that of the population habitat:

H_i ← U_i if HSI(U_i) is better than HSI(H_i); otherwise H_i is kept,

where H_i and U_i are the population habitat and the trial habitat, respectively.
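Greedy selection is a one-liner; the sketch below assumes minimization (lower HSI is better), which is an assumption about the HSI convention.

```python
def greedy_select(pop_habitat, trial_habitat, hsi):
    """Keep the trial habitat only if its HSI is better (lower, assuming
    minimization) than the population habitat's; otherwise keep the original."""
    return trial_habitat if hsi(trial_habitat) < hsi(pop_habitat) else pop_habitat
```

Because the comparison is strict, the population's best HSI can never get worse from one generation to the next, which makes the search elitist.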
4.6. Main Procedure of OXBBO. As analyzed above, the migration operator is good at exploiting the solution space but weak at exploring it. QOX is good at exploring the solution space, which can offset this deficiency of BBO. The modified migration operator can increase the differences within the population and improve the population diversity. Therefore, in this section we introduce biogeography-based optimization with orthogonal crossover (viz., OXBBO) by incorporating the above-mentioned QOX and modified migration operators into BBO. The pseudocode of the OXBBO approach is described in Algorithm 4.

Experimental Study
In order to verify the performance of the proposed algorithm, several experiments are conducted on 23 benchmark functions. These benchmark functions are chosen from [29] and briefly summarized in Table 1; detailed descriptions of these functions can be found in [29].
Functions f1–f5 are unimodal functions; function f5 has a very sharp, narrow ridge running around a parabola. Function f6 is a discontinuous step function with one minimum. Function f7 is a function with a noisy perturbation. Functions f8–f13 are multimodal functions whose number of local minima increases exponentially with the dimension. Functions f14–f23 are low-dimensional functions with only a few local minima.

Experimental Setting.
For OXBBO, we have chosen a reasonable set of parameter values. Unless a change is mentioned, all experimental tests use the following settings: (1) population size NP = 100; (2) maximum immigration rate I = 1; (3) maximum emigration rate E = 1; (4) mutation probability m_max = 0.005; (5) value to reach (VTR) = 10. All algorithms are evaluated on performance criteria taken from [14, 15] or similar to those. (1) Success rate (SR): the ratio of the number of successful runs to the total number of runs.
(2) NFFEs: the number of fitness function evaluations needed to reach the VTR. (3) Acceleration rate (AR): AR = NFFEs_other / NFFEs_OXBBO. AR is used to compare the convergence speed of OXBBO with that of other algorithms.
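The two performance criteria are simple ratios; the following sketch computes them, with the error list and names being illustrative.

```python
def success_rate(final_errors, vtr):
    """SR: fraction of runs whose final function error reached the VTR."""
    return sum(e <= vtr for e in final_errors) / len(final_errors)

def acceleration_rate(nffes_other, nffes_oxbbo):
    """AR = NFFEs_other / NFFEs_OXBBO; AR > 1 means OXBBO is faster."""
    return nffes_other / nffes_oxbbo
```

For instance, with the f1 figures reported later in this section (eBBO needing 56110 evaluations and OXBBO 44398), the AR works out to about 1.26.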
Note: for convenience of comparison, the best function values and their standard deviations are compared in Section 5.2, while the best function error values and their corresponding standard deviations are compared in the other sections. Moreover, the high-dimensional functions f1–f13 are optimized with dimensions 10, 50, 100, and 200 in Section 5.5 and with 30 dimensions in the other sections.
All experimental tests in this paper were run on a computer with a 1.86 GHz Intel Core processor, 1 GB of RAM, and the Windows XP operating system, using MATLAB 7.6.

Comparison with Improved BBO Algorithms.
In order to validate the performance of OXBBO, we first compare OXBBO with BBO [11] and improved BBO algorithms: perturbing BBO with Gaussian mutation (PBBO-G) [10], real-coded BBO with Gaussian mutation (RCBBO-G) [11], and multioperator BBO (MOBBO) [12]. OXBBO is executed for 50 independent runs on the 23 benchmark functions; the mean fitness values and their standard deviations are recorded. The results of the other algorithms are taken directly from the corresponding papers. The comparison results are summarized in Table 2.
From Table 2 we can see that OXBBO is better than PBBO-G on 21 out of 23 functions. For the unimodal functions f1–f6, OXBBO performs better than or similarly to PBBO-G on 5 out of 6 functions, and PBBO-G is better than OXBBO only on function f5. For the higher-dimensional multimodal functions, OXBBO performs better than PBBO-G on all of them. For the lower-dimensional functions, OXBBO outperforms PBBO-G except on function f17, where both algorithms obtain similar results. Moreover, the results of OXBBO are better than those of PBBO-G by several orders of magnitude on functions f1, f2, f4, f8, f9, f10, f11, f12, f13, f14, f16, f18, and f19. This analysis shows that combining QOX and the modified migration operator with BBO is much better than combining a perturbing migration with the sinusoidal migration curve in BBO.
Similar results are obtained when we compare OXBBO with RCBBO-G: OXBBO outperforms RCBBO-G on 22 test functions, and the two algorithms obtain the same results on function f6. When compared with MOBBO, OXBBO is better than or similar to MOBBO on 12 out of 23 functions. According to the parameter settings, MOBBO and OXBBO are tested with population sizes of 65 × 3 = 195 and 100, respectively, and both algorithms use the same maximum number of generations; that is, the maximum number of fitness function evaluations of MOBBO is almost twice that of OXBBO. Meanwhile, on some functions where MOBBO achieves better precision than OXBBO, OXBBO still obtains excellent results. This analysis indicates that OXBBO has not only better exploitation ability but also better exploration ability.

Comparison with Hybrid BBO Algorithms.
In order to further verify the performance of OXBBO, comparisons with hybrid BBO algorithms, such as BBO-EP [18], DBBO [13], eBBO [14], and DE/BBO [15], are made. The numerical results of the compared algorithms are cited directly from the corresponding papers. OXBBO is executed for 50 independent runs on the 13 high-dimensional functions. The numerical results on these 13 high-dimensional benchmark functions are listed in Table 3.

Convergence Speed Compared with BBO Algorithms.
From Section 5.3 we can see that, although OXBBO outperforms the four algorithms, the results of eBBO and DE/BBO are also very competitive. According to [30], more than three quarters of the computational cost of an evolutionary search is consumed by fitness function evaluations. Hence, in solving real-world problems, NFFEs overwhelm the algorithm overhead. For these reasons, we compare the convergence speed of OXBBO with eBBO and DE/BBO using the acceleration rate (AR). From the definition of AR, AR > 1 means that OXBBO is faster than the compared algorithm, and vice versa.
Table 4 summarizes the mean NFFEs and SR of eBBO, DE/BBO, and OXBBO on the 13 test functions, along with the ARs between OXBBO and eBBO and between OXBBO and DE/BBO.
OXBBO requires fewer NFFEs to reach the VTR than DE/BBO on 10 of the 11 functions with successful runs, which indicates that our algorithm is faster than DE/BBO. Similar results are obtained in comparison with eBBO. For example, on function f1, average NFFEs of 56110, 59926, and 44398 are needed by eBBO, DE/BBO, and OXBBO, respectively, to reach the VTR; OXBBO needs only 44398 NFFEs, which means its CPU time is the shortest of the three. Moreover, it can be seen from Table 4 that the average ARs between OXBBO and eBBO and between OXBBO and DE/BBO over the successful runs are 1.26 and 1.37 (the total AR divided by 11); that is to say, the overall convergence rate of OXBBO is faster than those of DE/BBO and eBBO.
From Table 4 we can also find that the average SRs of eBBO, DE/BBO, and OXBBO are 76.8%, 84.6%, and 84.6%, respectively. OXBBO and DE/BBO reach the VTR with a success ratio of 100% on the 11 functions other than f3 and f5, while eBBO does so on only 9 functions. Thus, OXBBO and DE/BBO are more robust than eBBO.

Effect of Dimensionality on the Performance.
From the analysis above, OXBBO is a BBO algorithm with robust and effective performance. In order to investigate the influence of scalability on the performance of OXBBO, we carry out a scalability comparison with DE/BBO on the scalable functions f1–f13 with Dim = 10, 50, 100, and 200. The results of 50 independent runs after Dim × 10000 NFFEs are recorded in Table 5.
From Table 5 we find that the overall SR decreases for both DE/BBO and OXBBO as the dimension increases. For OXBBO, the average SRs at dimensions 10, 50, 100, and 200 are 92%, 77%, 69%, and 67%, respectively, whereas those of DE/BBO at the same dimensions are 92%, 77%, 72%, and 57%. Obviously, the average SR of DE/BBO decreases more than that of OXBBO as the dimension grows, which is also confirmed by carefully comparing the corresponding mean fitness values. Looking closely at the results, for the unimodal function f1, DE/BBO is better than OXBBO at dimensions 10 and 50 but is outperformed by OXBBO at high dimensions (100 and 200). Moreover, the precision obtained by DE/BBO on f1 degrades considerably as the dimension increases, while that obtained by OXBBO improves. As another example, for the multimodal function f8, whose global optimum is very difficult to locate, OXBBO finds the global solution at all tested dimensions in Table 5, but DE/BBO does not. These experiments show that OXBBO can also solve higher-dimensional functions effectively. Therefore, the operators used in OXBBO enhance the exploration ability without sacrificing the exploitation ability.

Comparison with Other OX-Based Algorithms.
In this section, we compare the performance of OXBBO with two excellent OX-based algorithms: OXDE [28], which embeds QOX into DE, and OLPSO [4], which uses an orthogonal learning strategy in PSO to overcome the "oscillation" phenomenon of traditional PSO. We apply the OX-based algorithms to the test functions adopted in this paper over 25 independent runs. For a fair comparison, all control parameters are kept the same as in the corresponding papers except the population size, which is 50 in this section. The results for the OX-based algorithms over 25 independent runs are summarized in Table 6. The convergence curves and boxplot figures are shown in Figures 2 and 3, respectively.
From Table 6 we can find that the results of OXBBO are significantly better than those of OXDE and OLPSO on 12 and 10 out of 23 functions, respectively, while the corresponding numbers for OXDE and OLPSO are 3 and 3 out of 23. In terms of success rate, the average SRs of OXDE, OLPSO, and OXBBO are 74%, 71%, and 82%, respectively. From Figures 2 and 3, we can see that OXBBO not only converges faster than both OXDE and OLPSO but also converges more robustly than both. From this analysis, we can conclude that the overall performance of OXBBO is better than that of OXDE and OLPSO.

5.7. Analysis of the Performance of QOX.
From the analysis above, we know that the proposed OXBBO possesses very good performance in comparison with improved BBO algorithms, hybrid BBO algorithms, and some state-of-the-art orthogonal-based algorithms. But does QOX really play a key role in OXBBO? To answer this, in this section a detailed analysis of the performance of the QOX operator in BBO is made. We consider two variants: OXBBO with the QOX operator and OXBBO without it, denoted OXBBO and OXBBO1, respectively. The experiments are performed on the 23 test functions with 25 independent runs, with the parameter settings suggested in Section 5.1. The mean and standard deviation of the function error values and the t-test values over the 25 independent runs for each algorithm are listed in Table 7. The convergence curves and boxplot figures are shown in Figures 4 and 5.
According to the t-test, the results of OXBBO are significantly better than those of OXBBO1 on 11 out of 23 test functions and similar on another 11, while OXBBO1 surpasses OXBBO on only one function. From this, we know that QOX plays a very important role in enhancing the performance of OXBBO: it not only improves the solution precision but also enhances the reliability. This is also demonstrated in Figures 4 and 5.

Conclusions and Future Work
In this paper, a new improved BBO algorithm (OXBBO) is presented by combining the QOX operator with the BBO algorithm; it is designed to overcome the shortcoming that the migration operator only visits one vertex of the hyperrectangle defined by the immigration and emigration vectors. In OXBBO, QOX is able to make a systematic and rational search in this region and enhance the exploration ability. Moreover, a modified migration operator is used in OXBBO to improve the population diversity. Extensive experiments have been carried out to compare the performance of OXBBO with state-of-the-art BBO algorithms and orthogonal crossover-based evolutionary algorithms. We have also experimentally studied the effect of QOX on the performance of OXBBO.
We have observed that the results obtained by OXBBO differ considerably with different population sizes, which may be due to the search frequency of QOX; similar results were reported in [28]. In the future, we will study the effect of the search frequency of QOX on performance within the BBO framework on large-scale test functions. Furthermore, we would like to point out that the experimental tests are based on nonrotated functions; studying the performance on rotated functions is also future work.

Table 1 :
Benchmark functions used in our experimental tests.

Table 3 :
Comparison of hybrid BBO algorithms and OXBBO on the high-dimensional functions.

Table 4 :
NFFEs required to obtain accuracy levels less than VTR.

Table 5 :
Comparison of DE/BBO and OXBBO with different dimensions.

Table 6 :
Comparison of OX algorithms. The value of t with 48 degrees of freedom is significant at the 0.05 level by a two-tailed test.
− The corresponding algorithm is better than our proposed OXBBO method.

Table 7 :
Comparison of OX operator.