Combining Interval Branch and Bound and Stochastic Search



Introduction
Many problems in economics, business, science, and engineering are modeled as constrained optimization problems:

min_{x ∈ Ω} f(x)
subject to g_i(x) ≤ 0, i = 1, ..., n_g,
           h_j(x) = 0, j = 1, ..., n_h.   (1)

There are various approaches to such problems. Interval algorithms use branch and bound techniques to capture all solutions; one drawback is that they often require more memory and CPU time [1]. Stochastic search algorithms are usually easy to implement, require no assumptions about continuity or differentiability, and are commonly used in numerous fields. However, there is no guarantee on the solution reported when the algorithm is terminated in finite time; the theoretical support is only convergence in probability. We are interested in combining the interval branch and bound technique with stochastic search. We first study the combined algorithms for unconstrained problems and then modify them to handle constrained problems.
The paper is organized as follows. In the next section, the related algorithms are introduced, namely, the interval branch and bound, simulated annealing, and the continuous genetic algorithms. Studies on improving the described algorithms are also discussed. Our proposed algorithms are presented in Section 3; we demonstrate two algorithms for unconstrained problems and one for constrained problems. In Section 4, the numerical experiments and the discussion are given, followed by the conclusions in Section 5.

Interval Branch and Bound and Stochastic Algorithms
2.1. Interval Branch and Bound

2.1.1. Unconstrained Optimization. An interval algorithm is a tool that uses interval arithmetic to find all solutions of an optimization problem. Before discussing the algorithms, let us first introduce the notation used in this paper.
For an unconstrained problem, we study min_{x ∈ Ω} f(x). A working box could be selected to be the one that has been on the list the longest or the one with the least lower bound of its inclusion function F. The bisection direction is usually the direction with the maximum width. A box X will be discarded if lb(F(X)) > f̃, the best function value found so far. The termination conditions can be set by using a prescribed maximum number of iterations, the width of the box, or the width of the interval F(X). A detailed discussion can be found in [2]. When the algorithm ends, all optima are contained in the boxes of the list.
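The loop above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: the 1-D objective f(x) = x^2 and its hand-made inclusion function F are stand-ins, and the box list is kept as a plain Python list sorted by lb(F).

```python
def F(a, b):
    """Range of x**2 over [a, b]: a hand-made inclusion function."""
    lo = 0.0 if a <= 0.0 <= b else min(a * a, b * b)
    return lo, max(a * a, b * b)

def interval_bnb(a, b, eps=1e-6, max_iter=100000):
    """Minimal interval branch and bound for f(x) = x**2 on [a, b]."""
    f_best = min(a * a, b * b)      # f at an endpoint as a first upper bound
    work = [(a, b)]                 # the list of boxes
    for _ in range(max_iter):
        # Ichida-Fujii style: process the box with the least lower bound of F
        work.sort(key=lambda box: F(*box)[0])
        a0, b0 = work.pop(0)
        if b0 - a0 < eps:           # termination on box width
            work.append((a0, b0))
            break
        m = 0.5 * (a0 + b0)         # bisect (only one direction in 1-D)
        f_best = min(f_best, m * m) # sample the midpoint to improve f_best
        for box in ((a0, m), (m, b0)):
            if F(*box)[0] <= f_best:    # discard if lb(F(V)) > f_best
                work.append(box)
    return f_best, work

f_best, boxes = interval_bnb(-2.0, 3.0)
```

The minimizer x = 0 is never discarded, since a box containing it always has lb(F) = 0 ≤ f_best.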
The algorithm has difficulty when the dimension is high or the function is complicated: the width of the boxes in the list decreases very slowly, so the improvement of the best value found is slow.
We describe here two versions of the interval algorithms. The first is proposed by Ichida and Fujii, in which the working box is the box with the least lower bound of F. The second is Hansen's algorithm, in which the working box is the oldest box in the list.

Constrained Optimization.
For constrained problems, a box X will be deleted from the list under the condition f̃ < lb(F(X)) only if all points in X are feasible. The feasibility of a box is assessed using a flag vector c = (c_1, ..., c_n), where n = n_g + n_h. The element c_i is assigned by the following rules.
For an equality constraint, we consider the relaxed condition |h_j(x)| ≤ ε_h, and c_i is set according to ε_h and the bounds of the corresponding inclusion, for example, whether lb(H_j(X)) ≤ 0 and ub(H_j(X)) ≥ 0. The status of a box is determined from the flag vector c. If at least one element of c is 2, the status is 2 (the box is deleted from the list). If all elements of c are 0, the box is feasible and the status is 0. Otherwise, the status is 1. Usually, a box with status 1 will be bisected.

Simulated Annealing. Simulated annealing (SA) is a stochastic search technique inspired by an analogy with thermodynamics.
There is a mechanism for avoiding entrapment in local optima by allowing an occasional uphill move. It also incorporates a temperature parameter into the procedure: the search explores more at high temperature and is restricted when the temperature is low. The basic idea of the search is that, for a given current state i with energy level E_i, a subsequent state j is generated randomly. If E_j − E_i ≤ 0, then state j is accepted as the current state. Otherwise, state j is accepted with probability e^{−(E_j − E_i)/T}, where T is the temperature. For a minimization problem, a solution corresponds to a state of the system and the objective function corresponds to the energy level. Prior to the process, the cooling schedule T(t) and the neighborhood structure must be defined. SA is described in Algorithm 4. A thorough discussion can be found in [4].
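A minimal sketch of the Metropolis rule and an SA loop, assuming a 1-D objective and a Gaussian neighborhood; both are illustrative choices, not taken from Algorithm 4.

```python
import math
import random

def metropolis_accept(dE, T, rng=random.random):
    """Accept a move: always if downhill (dE <= 0),
    otherwise with probability exp(-dE / T)."""
    return dE <= 0 or rng() < math.exp(-dE / T)

def sa_minimize(f, x0, T0=1.0, rate=0.95, steps=2000, seed=0):
    """A minimal SA loop; cooling per step is a simplification."""
    rng = random.Random(seed)
    x, fx, T = x0, f(x0), T0
    best_x, best_f = x, fx
    for _ in range(steps):
        y = x + rng.gauss(0.0, 0.5)           # neighbor of the current state
        fy = f(y)
        if metropolis_accept(fy - fx, T, rng.random):
            x, fx = y, fy
        if fx < best_f:                        # track the best found so far
            best_x, best_f = x, fx
        T *= rate                              # cool down
    return best_x, best_f

best_x, best_f = sa_minimize(lambda x: x * x, 4.0)
```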

Population Based Methods.
Population based methods use a population of points in each iteration. One advantage is that if the problem has multiple optima, they can be captured in the final population. Examples of such techniques are the genetic algorithm (GA), ant colony, particle swarm, and differential evolution. They are widely used in business, science, and engineering. The simple GA is described next.
GA imitates natural evolution, involving three processes: creation of offspring, selection, and mutation. The creation of offspring provides diversification for the search, while the selection process narrows it. Mutation prevents entrapment in a local minimum. GA is outlined in Algorithm 5.
The main idea of the selection process is to choose better individuals for the next generation by considering the fitness of each one. We concentrate here on minimization problems; thus the better fit individual is the one with the lower value of f. Good selection schemes must allow convergence to the optimal solution without getting caught in a local minimum. There are many proposed selection techniques, along with studies of their convergence. Some of the methods are, for example, elitist selection (the best individuals of each generation must be selected), proportional selection (better fit individuals have a higher probability of being selected), ranking selection (the individuals are ranked according to their fitness and the selection is based on this ranking), and tournament selection (individuals are divided into subgroups, the members of each group compete against each other, and only one is selected for the new generation). The mutation rate is set up so that μ(t) converges to 0 as t increases. The termination condition can be the maximum number of generations or the best value remaining unchanged over a given number of generations. Comparisons of the performance of the selection schemes, including their convergence, can be seen in, for example, [5–7].
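Two of the named schemes can be sketched directly. The function names and the tournament size k are illustrative, and fitness is minimized, as in the text.

```python
import random

def tournament_select(pop, f, k, rng=random):
    """Draw k individuals at random; the one with the lowest f
    (the better fit, since we minimize) goes to the next generation."""
    return min(rng.sample(pop, k), key=f)

def elitist_select(pop, f, n_keep):
    """Keep the n_keep best individuals of the generation."""
    return sorted(pop, key=f)[:n_keep]
```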
The disadvantage of GA is that there is no guarantee of finding optimal solutions within a finite amount of time, and the tuning of parameters such as the population size or mutation rate is sometimes based on trial and error. However, there are no requirements on differentiability or continuity.

The Modifications.
The previously described algorithms can be modified for improvement. Some modifications related to our work are given next.
For the interval branch and bound algorithms, adjustments can be made in the following aspects for better solutions: (i) the inclusion function: the kite inclusion function [8]; (ii) the subdivision of the domain: multisection [8–10]; (iii) the selection of a box for further processing: the box with the largest rejection index, such as the one defined in Casado et al. [11],

pf*(Y) = (f* − lb(F(Y))) / (ub(F(Y)) − lb(F(Y))),   (2)

where pf*(Y) represents the rejection index of a box Y.
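The rejection index (2) translates directly into code; the argument names are ours.

```python
def rejection_index(f_star, lbF, ubF):
    """pf*(Y) of (2): (f* - lb(F(Y))) / (ub(F(Y)) - lb(F(Y))).
    The box with the largest index is chosen as the best candidate
    to contain a global minimum."""
    return (f_star - lbF) / (ubF - lbF)
```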
Studies on solution accuracy and algorithm speed when using an interval branch and bound to solve constrained problems include, for example, the following.
(i) Casado et al. [11] define a rejection index of a box Y as in (2) to identify a good candidate box to contain a global minimum.
(ii) Lagouanelle and Soubry [8] use a new inclusion function called kite enclosure.
(iii) Sun and Johnson [12] introduce local sampling strategies for the working box. The convergence proof is presented along with the numerical results.
(iv) Karmakar and Bhunia [13] demonstrate how to obtain one of the solutions by partitioning the accepted box (starting with the search domain) into 2m subboxes, where m is the number of parts into which each edge is divided. The function values over each subbox are calculated, and interval order relations are then used to choose a new accepted box.
There are many approaches to handling constraints for GA. The classifications can be seen in [3, 14]. We describe those that combine interval branch and bound and genetic algorithms.
Alander [15] suggests two ways of combining the two algorithms. The first is replacing, at least partly, the function to be optimized by some of its interval extensions. The other is using GA in some internal problems of the interval algorithm.
Sotiropoulos et al. [16] use an interval branch and bound technique to create subregions and choose the midpoint of each subregion to be individuals in an initial population for GA.
Zhang and Liu [17] use the rejection index (2) as a fitness function for a box Y, where f* is the best value found. The population is taken from the N > 1 boxes with the highest fitness. Mutation is performed on 1/3 of the population with some probability. The best-fit box is split into N subboxes.
An attempt to combine the interval branch and bound and simulated annealing for unconstrained problems can be seen in Shary [18]. The lower bound of F is used in calculating the probability of accepting a new box. If this box is accepted, it is bisected, and the half with the smaller lower bound of F is chosen to be the next leading box. Otherwise, a different box is chosen. It is tested on a six-hump camel function and Rastrigin's function with the domain [−10, 10]^2.

The Proposed Algorithms
We first study the performance of Ichida-Fujii (A2) and Hansen (A3) to guide the design of our algorithm. A set of unconstrained problems in Appendix A is used. Tables 1 and 2 show that Ichida-Fujii is more effective than Hansen in terms of speed (number of iterations), storage (the list length), and cost (number of function evaluations). It appears that searching and bisecting the box with the least lower bound of F is the fastest way to reach the optimal solutions. For unconstrained problems, if we assume that f is continuous, we know for certain that the box with the least lower bound of F must contain a minimum. The earlier a quality f̃ can be discovered, the faster the unwanted boxes can be deleted.
To obtain a quality f̃, we consider combining SA and GA with an interval branch and bound. SA and GA act as search engines, while the interval branch and bound is responsible for keeping all solutions.
When dealing with a constrained problem, the deletion condition takes effect only when a box is known to be feasible or infeasible. For most problems, the list contains a high percentage of boxes with status 1, which is indeterminate. This means the width of a box must be small enough to separate the feasible from the infeasible region. The situation is even worse when the dimension of the problem is high.
There are two concerns in our algorithm: (i) choosing a potential box to search, so that a quality f̃ can be obtained to promote the deletion of unwanted boxes; (ii) bisecting the box to isolate a feasible region.
For an unconstrained problem, we can search in the box with the least lower bound of F. However, for a constrained problem we need a function that provides information about both the value of f and the lower bound of F when deciding which box to search for a better f̃. Let us first define a function fit(X) to be the minimum value of f among the feasible points sampled from a box X. If the box contains x*, fit(X) gets close to f* as the width of X approaches 0. The information of fit(X) and lb(F(X)) is combined using a function called fun.
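A sketch of fit(X), assuming the sample points are supplied by the caller; the fallback to the upper bound of F(X) follows the rule stated for Algorithm 11 when no feasible point is discovered.

```python
def fit(f, is_feasible, samples, ub_F=None):
    """fit(X): the minimum of f over the feasible sample points drawn
    from a box X. If no feasible point is found, return the upper
    bound of F(X) instead (passed here as ub_F)."""
    vals = [f(x) for x in samples if is_feasible(x)]
    return min(vals) if vals else ub_F
```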
We first present two algorithms for unconstrained problems to study their efficiency in terms of speed, storage, and cost. Then the algorithm for constrained problems is described. Our goal is to extract information about the search region from the interval branch and bound and provide it to the continuous genetic algorithm to improve its efficiency.

Hybrid Interval Branch and Bound and Simulated Annealing for Unconstrained Problems.
Our algorithm uses simulated annealing as a mechanism that encourages the search in a promising box while avoiding entrapment in a local minimum. It is described in Algorithm 6. We choose the box with the maximum difference between the best value found so far and the least lower bound of F, max_X {f̃ − lb(F(X))}, to be bisected. Since lb(F(X)) < f* < f̃, this may allow half of the box to be discarded.
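The selection rule above is a one-liner; lb_F, standing for the map X ↦ lb(F(X)), is our notation.

```python
def select_bisection_box(boxes, lb_F, f_best):
    """Choose the box maximizing f_best - lb(F(X)), as described for
    Algorithm 6; lb_F maps a box to the lower bound of its inclusion."""
    return max(boxes, key=lambda X: f_best - lb_F(X))

# Tiny usage example with hand-assigned lower bounds.
boxes = [(0.0, 1.0), (2.0, 3.0)]
lb = {(0.0, 1.0): -5.0, (2.0, 3.0): 1.0}
chosen = select_bisection_box(boxes, lb.get, 2.0)
```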
Algorithm 6 differs from [18] in the selection of a working box. We also evaluate more than one point to update the value of the best found so far, f̃.
The stopping criteria are w(F(X)) < ε_f or w(X) < ε_x for all boxes X in listx, or the maximum number of iterations exceeding a prescribed value.
We use a linear cooling schedule by prescribing the number of iterations at each temperature.The temperature is changed by using the given cooling rate; that is, new temperature = cooling rate * temperature.
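The schedule can be sketched as a list of per-iteration temperatures; the parameter names are ours.

```python
def temperatures(T0, rate, iters_per_temp, total_iters):
    """Temperature used at each iteration: held fixed for
    iters_per_temp iterations, then multiplied by the cooling rate."""
    out, T = [], T0
    for i in range(total_iters):
        out.append(T)
        if (i + 1) % iters_per_temp == 0:
            T *= rate                  # new temperature = rate * temperature
    return out
```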
For Algorithm 6, the parameters that can be adjusted are the number of initial boxes, the initial temperature T_init, and the annealing schedule.
Note that the value of fit(X) can be obtained by performing some iterations of a search algorithm of your choice in box X. In each iteration, the fit values of two boxes, V and W, are calculated; thus f̃ can also be updated.
The Convergence. The following behaviors can be observed from the mechanism of Algorithm 6.
(i) At the initialization stage, the probability that a box in listx is selected as the working box V is 1/N_0, where N_0 is the number of initial boxes. In each iteration, every box has a probability of 1/(N_0 − 1) of being chosen as the second box W. Both V and W are searched by randomly choosing a number of points and recording the best value found in f̃ (Algorithm 7). Thus, f̃ is updated at every iteration.
(iii) At iteration t, V or W will be bisected, depending on which has the higher value of f̃ − lb(F(·)).
(iv) It is possible that a box X whose width is not small survives through iterations. This means that f̃(t) − lb(F(X)) < f̃(t) − lb(F(X_j)) for some j with 1 ≤ j ≤ N_0. Since f̃(t) − lb(F(X_j)) decreases with t, this X will be selected as V at some later time. Therefore, every box is eventually selected for processing.
(v) Every box has a nonzero probability of being bisected, which makes w(F(X)) smaller, even though the boxes in listx have different sizes. We can conclude that w(F(X(t))) → 0 as t → ∞ for X ∈ listx.
We can show that all solutions will be in listx after the algorithm successfully terminates.

Integrated Interval Branch and Bound and GA for Unconstrained Problems.
Algorithm 8 combines a population technique with the interval branch and bound to give a higher probability of obtaining a better value of the best found, f̃. Thus, more boxes can be deleted from the list. The population is selected from a prescribed number of boxes with the least lower bound of F, and a new set of candidates is created using the linear crossover from Michalewicz's book [3] (Algorithm 10). The information of two points is combined, and the two outputs are controlled to be in the search domain; they need not lie in the set of working boxes. Only a prescribed number of the best points are carried to the next generation.
The parameters that can be adjusted for the performance of Algorithm 8 are the following: the number of boxes, the number of individuals per generation, the number of working boxes, the number of individuals to include in the new generation, and the maximum number of iterations. There are also two procedures that can be changed: the process of creating a new set of points and the rule for selecting a new generation (Algorithm 9).
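The linear crossover of Algorithm 10 can be sketched as follows, assuming the common Michalewicz form with three fixed weights; which two of the three outputs are kept is left to the selection step.

```python
def linear_crossover(p1, p2, lo, hi):
    """Three linear combinations of two parents (Michalewicz [3]),
    clipped componentwise to the search domain [lo, hi]."""
    def combine(w):
        return [min(max(w * a + (1.0 - w) * b, l), h)
                for a, b, l, h in zip(p1, p2, lo, hi)]
    return [combine(0.5), combine(1.5), combine(-0.5)]

kids = linear_crossover([0.0, 0.0], [2.0, 2.0], [0.0, 0.0], [2.0, 2.0])
```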

Integrated Interval Algorithm and GA for Constrained Problems.
The major disadvantage of interval methods is that they require more memory and CPU time than noninterval algorithms. We propose an algorithm that integrates a known bound from an interval algorithm with the quickness of a genetic algorithm. Of course, certainty about the solutions is lost; however, an improvement in the quality of the solution and a reduction of the cost are gained.
Let us first introduce additional functions used in the algorithm. ifit box(X) is 0 if at least one feasible point has been found in box X; otherwise, it is 1.
Elements of the flag vector c corresponding to a box X in the list are 0 or 1. Therefore, we define nviol box(X) to be the sum of the elements of the flag vector c; it roughly indicates the amount of constraint violation for a box X. The status of a box in the list is either 0 or 1, since a box with status 2 is discarded as soon as it becomes known. For a constrained problem, fun(X) is modified: the term (ifit box(X) + 1) is added to take care of feasibility. In the case that a feasible point is not discovered in Step 3 of Algorithm 11, the upper bound of F(X) is assigned to fit(X).
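The two helper functions translate directly; the boolean argument of ifit box is our simplification of "a feasible point has been found".

```python
def ifit_box(feasible_point_found):
    """0 once at least one feasible point has been found in the box,
    1 otherwise; the term (ifit_box + 1) then enters fun to favor
    boxes where feasibility has been established."""
    return 0 if feasible_point_found else 1

def nviol_box(flags):
    """Sum of the 0/1 flag entries of a box still in the list: a rough
    count of possibly violated constraints (boxes with a flag of 2
    have already been discarded)."""
    return sum(flags)
```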
We use a simple GA without mutation in Algorithm 12. Mutation is omitted because GA is invoked in every iteration. The parameters related to GA are the number of individuals in each generation and the maximum number of generations. The differences between Algorithm 11 and GA are pointed out next.
In GA, an initial population is randomly chosen from the search domain. This population is then evolved through three operators: crossover, selection, and mutation. The change of individuals in a population comes through the creation of children and the mutation process.
In Algorithm 11, GA is performed in every iteration with an assigned maximum number of generations. The initial population consists of the best individual, x̃, and individuals randomly chosen from a given box considered a potential region for a better solution. In the big picture, this is similar to performing mutation with a rate of one at every iteration. The mutation is biased because it is restricted to the promising region, which is listx[idsearch]. After this, the offspring and populations are allowed to be anywhere in Ω. For Algorithm 11, convergence is achieved when no box can be discarded; thus the boxes left in the list have w(X) < ε_x or w(F(X)) < ε_f.

Numerical Results
4.1. Unconstrained Problems. Tables 3 and 6 show the value of f̃ found by Algorithms 2, 3, 4, 6, and 8 for n = 10 and n = 20, respectively. The maximum number of iterations is set to 400,000 and 600,000. In the tables, A1 stands for Algorithm 1, and similarly for the other algorithms. For Algorithms 6 and 8, the maximum length of listx, the number of iterations, and the number of function evaluations are the maxima taken over ten runs. The tolerance ε_f = 10^-6 is set. The algorithm successfully terminated with w(F(X)) < 10^-6 for all ten runs. Since f̃ is in the range of 10^-7, the tables present only 1E−07.
Table 4 displays the maximum length of listx and the number of iterations used in the algorithm of Ichida-Fujii (A2). Since A2 uses the least storage, the ratio of the amount of storage used by the other algorithms to that of A2 is presented in the tables. For example, in problem 1 the maximum list length of A4 is 4.2 times the maximum list length of A1, which is about 1714. Table 5 shows the number of function evaluations of A2 and the ratio of the number of function evaluations used by the other algorithms to that of A2 for n = 10. Tables 6, 7, 8, and 9 present similar results for n = 20, 40, and 100, respectively. The observations from the numerical results are the following.
(3) The hybrid interval SA (A6) can handle a higher dimensional domain better than SA (A4).
(4) The maximum list length used by A6 and A8 is mostly about 1-2 times that used by A2 for n = 20, 40, 100, even though A8 uses many more function evaluations. This implies that the effort spent on function evaluations does not contribute much to the reduction of boxes. However, it shows that the structure of the algorithm can keep the storage under control. It suggests using a small number of sample points or individuals in the population.
(5) Algorithms 6 and 8 use much more storage than A2 on problem 5 but work better when the dimension is higher.
(6) Even if the algorithm finds a high-quality f̃ at an early iteration, it may not be able to discard some boxes right away, because those boxes are not small enough for the condition on the lower bound of F to be satisfied. Since only one box is bisected at each iteration, the removal process is put on hold.
(7) When n = 100, Algorithm 6 does not work for problem 11; it terminates due to memory limits before a reasonable result is obtained.
(8) Algorithm 6 still works quite well when n is higher, but the population-based method, A8, shows trouble with memory.
The results suggest that the structure of the algorithm in A6 handles the list length and the number of function evaluations quite well. However, the population-based method captures the best value faster. Therefore, the population size and the maximum generation must be adjusted to avoid wasting too many function evaluations. This information influenced the development of A11 for constrained problems.

Constrained Problems.
The parameter setting for Algorithm 11 is described next. The maximum number of generations for GA in each iteration is set in the range of 15 to 25. The population size is 8-16, depending on the size of the domain. The initial temperature is 1; both the temperature and a second parameter sequence are decreased linearly. The parameter in line 2 of A11 is set to 2. In problems 2, 3, and 5, Algorithm 11 terminated with the condition that no box in the list can be processed (w(X) ≤ 10^-6 or w(F(X)) ≤ 10^-6). The other problems terminated at the maximum of 10,000 iterations.
Table 10 shows information about the problems and the experimental results: the dimension, the number of constraints, the maximum width of the search domain, the error of f̃ found by Algorithm 11, and the error from regular GA. The seventh column is the ratio of the number of function evaluations used by GA to the number used by Algorithm 11. GA usually uses more function evaluations, except for the last two problems, where it uses about the same amount but Algorithm 11 discovers better solutions. The last column shows the percentage reduction in the number of function evaluations when Algorithm 11 is used.
Algorithm 11 successfully found optimal solutions, with the condition of no box left to process, in problems 2, 3, and 5. For the other problems, A11 reports better solutions with fewer function evaluations than the traditional GA.
Notice that the test problems consist only of inequality constraints. For equality constraints or mixed ones, the results are not impressive: the branch and bound process does not provide good information about solutions at an early stage, and all boxes in the list still have status 1 when the algorithm reaches the maximum number of iterations.

Conclusions
We proposed two hybrid algorithms for unconstrained problems, Algorithms 6 and 8. The Metropolis criterion is used for choosing a search box. Algorithm 6 performs the search by random sampling and Algorithm 8 by GA. The box to be bisected is chosen by the maximum difference between the best value found by the algorithm and the lower bound of F.
The contribution of our proposed Algorithm 11 is a good structure of the algorithm, which gives the following advantages.
(1) It reduces the number of function evaluations of the regular genetic algorithm.
(2) If the problem is not too complicated, the solution is ensured by the branch and bound process. This is the advantage of our hybrid algorithm over GA. If storage is limited, the quality of the reported solution is still acceptable. Moreover, the region that might contain an optimum is still in the list; if required, a local search can be applied to that region.
(3) The branch and bound process, which is easy to implement, is responsible for providing the potential individuals for GA.It can be viewed as acceleration for GA.
One weak point of our algorithms is that the deletion process is not activated until a box is small enough, even when a high-quality f̃ is found in an early iteration.

Ω: A search domain
f: ℝ^n → ℝ
◻f(X) = {f(x) : x ∈ X}: The range of f over X
I: The set of real compact intervals [a, b], a, b ∈ ℝ
I^n: The set of n-dimensional interval column vectors
w(X): If X ∈ I^n, the width of the box X, max_{i=1,...,n} {w(X_i)}
ε_x: A tolerance for the width of a box
ε_f: A tolerance for the width of the interval F(X)
f̃: The best value of f the algorithm has encountered
x̃: A vector corresponding to f̃
f̃(t): The value of f̃ at iteration t
lb(A): A lower bound of an interval A
ub(A): An upper bound of an interval A

An interval branch and bound has the procedures given in Algorithm 1.

(1) Put the domain into the list.
(2) repeat
(3) Choose a working box, called V, and bisect it into two subboxes V1 and V2, and put them on the list.
(4) Delete V from the list.
(5) Discard a box in the list if it has no solution.
(6) until (the termination criteria hold)

Algorithm 1: Interval branch and bound algorithm.

(3) Perform linear crossover using Algorithm 10 and put the output points in C.
(4) until (the number of points in C is N)
(5) return the set C

Algorithm 9: Create a new set of points from a given set S.

Table 1 :
The value of the best found, the maximum list length, the number of function evaluations, and the number of iterations from Ichida-Fujii A2 and Hansen A3 when n = 10.

Table 2 :
The value of the best found, the maximum list length, the number of function evaluations, and the number of iterations from Ichida-Fujii A2 and Hansen A3 when n = 20.
Require: a set S, a number of points N
(1) repeat
(2) Randomly select two points from S.

Table 4 :
The maximum list length from A2 and the ratio of the maximum list lengths from A3, A6, and A8 to the maximum list length from A2 when n = 10. Also the result of the number of iterations.

Table 5 :
The total number of function evaluations (of both f and F) from A2 and the ratio of the number of function evaluations from A3, A6, and A8 to that from A2 when n = 10.

Table 7 :
The maximum list length from A2 and the ratio of the maximum list lengths from A3, A6, and A8 to the maximum list length from A2 when n = 20. Also the result of the total number of function evaluations (f and F).

Table 8 :
The maximum list length from A2 and the ratio of the maximum list lengths from A6 and A8 to the maximum list length from A2 when n = 40. Also the result of the total number of function evaluations (f and F).

Table 9 :
The maximum list length from A2 and the ratio of the maximum list lengths from Algorithms A6 and A8 to the maximum list length from A2 when n = 100. Also the result of the total number of function evaluations (f and F).

Table 10 :
The result of the constrained problems. The ratio in the seventh column is the ratio of the number of function evaluations used by GA to the number used by A11. The last column is the percentage of function evaluations that can be saved by using A11.