A Simplified Hypervolume-Based Evolutionary Algorithm for Many-Objective Optimization

Hong Ji (https://orcid.org/0000-0002-2144-1177) and Cai Dai
School of Computer Science, Shaanxi Normal University, Xi'an 710119, China

Academic Editor: Toshikazu Kuniya

Complexity, Hindawi, 2020. doi:10.1155/2020/8353154. Received 5 May 2020; accepted 8 July 2020; published 6 August 2020.

Copyright © 2020 Hong Ji and Cai Dai. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Evolutionary algorithms based on the hypervolume have demonstrated good performance in solving many-objective optimization problems. However, computing the hypervolume requires prohibitively expensive computational effort. This paper proposes a simplified hypervolume calculation method that can be used to roughly evaluate the convergence and diversity of solutions. The main idea is to use the nearest neighbors of a particular solution to calculate a volume that serves as the solution's hypervolume value. Moreover, this paper improves the selection operator and the update strategy of the external population according to the simplified hypervolume. The proposed algorithm (SHEA) is then compared with several state-of-the-art algorithms on the fifteen test functions of the CEC2018 MaOP competition, and the experimental results demonstrate the feasibility of the proposed algorithm.

1. Introduction

Multiobjective optimization problems (MOPs) arise in numerous real-world applications. A minimization MOP, which often has two or three objectives, can be defined as follows:

(1) min F(x) = (f_1(x), f_2(x), …, f_m(x)), s.t. x ∈ Ω,

where Ω ⊆ R^n is the n-dimensional decision space, x = (x_1, …, x_n) ∈ Ω is an n-dimensional decision variable, and F: Ω → R^m (m = 2 or 3) consists of m mutually conflicting objective functions.
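As a concrete illustration of the form (1), the following minimal Python sketch defines a bi-objective instance (the classic Schaffer problem, chosen here purely as an example; it is not one of the paper's test problems):

```python
import numpy as np

# Example of (1): minimize F(x) = (x^2, (x - 2)^2) over Omega = [-10, 10].
# The two objectives conflict: f1 is minimized at x = 0, f2 at x = 2.
def F(x):
    return np.array([x**2, (x - 2.0)**2])
```

Every x in [0, 2] is Pareto-optimal for this instance, since improving one objective there necessarily worsens the other.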

In the last few decades, many multiobjective evolutionary algorithms (MOEAs) have been proposed to solve MOPs. However, these MOEAs encounter challenges when solving MOPs with more than three objectives, also known as many-objective optimization problems (MaOPs). First, the proportion of nondominated solutions among candidate solutions rises steeply as the number of objectives increases, which severely weakens the selection pressure toward the Pareto front (PF). Second, the population size cannot be arbitrarily large for reasons of computational efficiency, but a limited number of solutions are likely to lie far apart in a high-dimensional objective space, causing offspring to stray far from their parents. Last, the computational complexity of performance metrics grows exponentially with the number of objectives.

To address these problems, many-objective evolutionary algorithms (MaOEAs) fall into three main categories. The first is based on modified dominance relationships that enhance the selection pressure toward the PF. This idea has been widely employed and has shown considerable improvement; however, these approaches require additional effort in designing diversity maintenance mechanisms.

The second category uses decomposition-based methods to solve MaOPs. The main idea is to decompose a many-objective optimization problem into a set of subproblems and optimize them collaboratively. The most representative algorithms are MOEA/D and its variants, and there are other decomposition-based methods such as MOEA/D-M2M and DBEA. These approaches are adept at maintaining diversity and avoiding local optima but are ineffective on highly irregular PFs.

The third category comprises indicator-based evolutionary algorithms. Indicators such as the hypervolume weigh both the convergence and diversity of solutions to enhance selection pressure and guide the search toward the PF. IBEA, SMS-EMOA, and HypE are three classical indicator-based evolutionary algorithms. Unfortunately, their computational cost becomes excessively expensive because of the high computational complexity of the hypervolume.

So the key question is how to reduce the computational complexity and keep the advantages of the hypervolume indicator at the same time. The major contributions of this paper can be summarized as follows.

A simplified hypervolume calculation method is proposed to roughly evaluate the convergence and diversity of solutions

To enhance the quality of offspring, a new selection operator based on the simplified hypervolume is proposed to choose excellent parents

The simplified hypervolume together with nondomination is used in the new update strategy of external population to store solutions with good convergence and diversity

In the remainder of this paper, Section 2 describes the proposed algorithm. Then, Section 3 mainly presents experimental results and related analysis of the proposed algorithm. At last, conclusions are given in Section 4.

2. The Proposed Algorithm

A simplified hypervolume-based evolutionary algorithm for many-objective optimization (SHEA) is proposed to solve MaOPs. The core part of this paper is a new hypervolume calculation method to roughly evaluate the convergence and diversity of solutions. Furthermore, this new hypervolume is used to improve selection operator and update strategy.

2.1. Simplified Hypervolume

To compute the hypervolume value, the normalized population P is sorted by each objective function value. For each solution in each sorted objective sequence, the reference point is taken as the componentwise maximum of the two solutions on either side of the particular solution.

Thereafter, the volume between the particular solution and the reference point is calculated. For boundary solutions, the volume is calculated with respect to the adjacent solution only, and the terms for which the objective value of x_i equals the boundary value are removed. The calculation formula of v_ij is given in (2). Figure 1 shows the calculation of v_ij; to make it easier to understand, the MOP in Figure 1 has only two objectives.

Figure 1: Calculation of v_ij: the shaded region is the volume between x_i and the reference point r_ij on objective j.

Then, the minimum of the particular solution's volumes over all objectives is kept as the hypervolume value (3):

(2) v_ij = ∏_{k∈L} (f_k(r_ij) − f_k(x_i) + ε),

(3) shv(x_i) = min_{1≤j≤m} v_ij,

where ε is a small positive number, L = {k | f_k(x_i) ≠ max{f_k(x) | x ∈ P}}, and r_ij and v_ij (i = 1, …, N; j = 1, …, m) are the reference point and volume of the ith solution for the jth objective, respectively.

When x_i is sparse, that is, its adjacent solutions are far from it, v_ij is large. Regarding convergence, when each objective value of x_i is small, so that x_i is far from r_ij, v_ij is also large. Therefore, the larger shv(x_i) is, the better the convergence and diversity of x_i.
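Under these definitions, a minimal NumPy sketch of equations (2)-(3) might look as follows; the componentwise-maximum construction of the reference point r_ij and the single-neighbor handling of boundary solutions are the assumed interpretation of the text above:

```python
import numpy as np

def simplified_hv(P, eps=1e-6):
    """Rough per-solution convergence/diversity score, sketching Eqs. (2)-(3).

    P: (N, m) array of normalized objective values (minimization assumed).
    """
    N, m = P.shape
    col_max = P.max(axis=0)                 # boundary value of each objective
    shv = np.full(N, np.inf)
    for j in range(m):                      # one sorted sequence per objective
        order = np.argsort(P[:, j])
        for pos, i in enumerate(order):
            # neighbors on either side in the j-th objective ordering
            nbrs = [order[p] for p in (pos - 1, pos + 1) if 0 <= p < N]
            r = np.max(P[nbrs], axis=0)     # reference point from neighbors
            # L: drop objectives where x_i already attains the boundary value
            L = [k for k in range(m) if P[i, k] < col_max[k]]
            v = np.prod([r[k] - P[i, k] + eps for k in L]) if L else eps
            shv[i] = min(shv[i], v)         # Eq. (3): minimum over objectives
    return shv
```

On a simple two-objective front, a solution far from its neighbors receives a noticeably larger shv value than a crowded one, matching the intuition above.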

2.2. Selection Operator

The new selection operator aims to choose parents with good convergence and diversity so as to generate high-quality offspring via differential evolution:

(4) x_i^new = x_i + F(x_i^{r1} − x_i^{r2}), if rand ≤ CR; x_i, if rand > CR,

where x_i^{r1} (i = 1, …, n) denotes the ith dimension of x^{r1} in decision space, CR is the crossover rate, F is the scaling factor, and x^{r1} and x^{r2} are selected from the T nearest neighbors of solution x.

The new selection operator calculates the hypervolume of the nondominated neighbors and retains the top H solutions. To accelerate convergence, x^{r1} is taken as the solution minimizing the Tchebycheff function among the H retained solutions:

(5) minimize_{x∈Ω} g^{te}(x | λ, z*) = max_{1≤i≤m} λ_i |f_i(x) − z_i*|,

where λ = (λ_1, …, λ_m)^T is a given weight vector and z* = (z_1*, …, z_m*) is a reference point with z_i* = min{f_i(x) | x ∈ Ω}.

Then, x^{r2} is chosen randomly from the neighbors, excluding x^{r1}.
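Putting Section 2.2 together, a hedged sketch of the selection-plus-variation step could read as below; the nondominated filtering of the neighbors is omitted, and the choice H = |neighbors|/2 is an illustrative assumption, not the paper's setting:

```python
import numpy as np

def tchebycheff(F_x, lam, z):
    # g_te(x | lambda, z*) = max_i lambda_i * |f_i(x) - z_i*|, as in Eq. (5)
    return np.max(lam * np.abs(F_x - z))

def de_offspring(i, X, Fobj, shv, nbrs, lam, z, F=0.5, CR=0.5, rng=None):
    """Sketch of the Section 2.2 selection + DE variation of Eq. (4).

    X: (N, n) decision vectors; Fobj: (N, m) objective values;
    shv: simplified-hypervolume scores; nbrs: neighbor indices of solution i.
    """
    if rng is None:
        rng = np.random.default_rng()
    H = max(1, len(nbrs) // 2)                     # illustrative choice of H
    top = sorted(nbrs, key=lambda k: -shv[k])[:H]  # top-H by simplified HV
    r1 = min(top, key=lambda k: tchebycheff(Fobj[k], lam, z))  # best Tchebycheff
    r2 = rng.choice([k for k in nbrs if k != r1])  # random other neighbor
    mask = rng.random(X.shape[1]) <= CR            # per-dimension crossover
    return np.where(mask, X[i] + F * (X[r1] - X[r2]), X[i])
```

With CR = 1 the operator reduces to pure differential mutation, which makes the arithmetic of equation (4) easy to verify by hand.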

2.3. Update Strategy of External Population

The external population stores solutions with good convergence and diversity. Among the nondominated solutions of the population, those with smaller hypervolume values are deleted to keep the external population at its prescribed size. To preserve diversity, m solutions of the external population are chosen from the boundary solutions; the others come from intermediate solutions.

2.4. Steps of the Proposed Algorithm

SHEA works as follows (Algorithm 1):

Algorithm 1: The framework of the algorithm SHEA.

Input:

A MaOP of the form (1)

A stopping criterion

N: the number of weight vectors

EN: the size of the external population

T: the number of weight vectors in the neighborhood of each weight vector, 0 < T < N

H: the number of solutions with the largest hypervolume selected from the neighbors, 0 < H < T

λ^1, λ^2, …, λ^N: a set of N uniformly distributed weight vectors

Output: External population EP

Initialization: Generate an initial population O = {x^1, x^2, …, x^N} randomly; set EP = O; determine z* = (z_1, …, z_m) by a problem-specific method; determine the T closest weight vectors to each weight vector, B^i = {i_1, …, i_T}, i = 1, …, N

While the stopping criterion is not met do

Calculate the proposed hypervolume of nondominated solutions.

For i = 1, …, N do

if rand < J then

E = B^i

else

E = O

end if

Choose x^{r1} and x^{r2} from E according to the selection operator in Section 2.2.

Use x^{r1} and x^{r2} to generate the offspring x_new, and set O = O ∪ {x_new}.

Use x_new to update z*: for j = 1, …, m, if f_j(x_new) < z_j, then set z_j = f_j(x_new).

End for

Set EP = EP ∪ O.

Use the update strategy to update O.

Use the update strategy of the external population in Section 2.3 to update EP.

End while

To update the population, the solutions in O are sorted by nondomination and added to a set SP from the first nondomination rank onward until the size of SP exceeds N. Then, the cosine between each solution in SP and each weight vector is calculated, and each solution is assigned to the weight vector with the maximum cosine. For each weight vector λ^i whose class is nonempty, the solution with the minimum value of the modified Tchebycheff function is kept:

(6) minimize_{x∈Ω} g^{te}(x | λ^i, z*) = max_{1≤j≤m} |f_j(x) − z_j*| / λ_j^i.

Otherwise, the solution with the minimum modified Tchebycheff value over the whole of SP is kept.
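The cosine-based classification and the modified Tchebycheff pick of equation (6) can be sketched as follows (translating the objectives by the ideal point z* before computing cosines is an assumption of this sketch):

```python
import numpy as np

def modified_tchebycheff(F_x, lam, z, eps=1e-10):
    # g_te(x | lambda^i, z*) = max_j |f_j(x) - z_j*| / lambda_j^i, as in Eq. (6)
    return np.max(np.abs(F_x - z) / (lam + eps))

def classify_and_pick(SP_F, W, z):
    """Assign each solution in SP to its max-cosine weight vector, then keep
    the minimum modified-Tchebycheff solution per vector (falling back to the
    best over all of SP when a vector's class is empty)."""
    T = SP_F - z                                     # translate by ideal point
    cos = (T @ W.T) / (np.linalg.norm(T, axis=1, keepdims=True)
                       * np.linalg.norm(W, axis=1) + 1e-12)
    assign = np.argmax(cos, axis=1)                  # closest weight vector
    chosen = []
    for i, w in enumerate(W):
        members = np.where(assign == i)[0]
        pool = members if members.size else np.arange(len(SP_F))
        chosen.append(min(pool, key=lambda k: modified_tchebycheff(SP_F[k], w, z)))
    return chosen
```

Dividing by λ_j^i in the modified Tchebycheff function rewards solutions that lie close to the direction of their assigned weight vector.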

In the proposed algorithm, 2N + EN solutions (population, offspring, and external population), B^i (i = 1, …, N), and N weight vectors need to be stored, so the space complexity is O((2N + EN)n) + O(NT) + O(Nm) = O(N^2) (in this paper, n, T, m < N < EN ≤ 2N). Therefore, the space complexity of SHEA is O(N^2). The major computation of the proposed algorithm lies in the selection operator and the update strategy of the external population. Both use the simplified hypervolume, which needs O(Nn) basic operations (i.e., +, −, ×, ÷, and comparison). The selection operator thus needs O(Nn) basic operations to calculate the simplified hypervolume and O(NT) basic operations to choose parents. Updating the external population needs at most O((EN + N)n) basic operations, and the update strategy of the population needs no more than O(2N·N) basic operations. Altogether, the computational complexity of SHEA is O(Nn) + O(NT) + O((EN + N)n) + O(2N·N) = O(N^2).

3. Experimental Study

3.1. Experimental Settings

In this section, the proposed algorithm is compared with four state-of-the-art algorithms, NSGAIII, MOEA/DD, KnEA, and RVEA, on fifteen many-objective benchmark functions (MaF) from the CEC2018 MaOP competition. Each problem is tested with 5, 10, and 15 objectives. NSGAIII supplies and updates well-spread reference points adaptively to maintain diversity among population members. MOEA/DD exploits the merits of both dominance- and decomposition-based approaches to balance the convergence and diversity of the evolutionary process. KnEA is a knee-point-driven EA for solving MaOPs. RVEA adopts a scalarization approach named angle-penalized distance to balance convergence and diversity.

All fifteen test problems for each algorithm mentioned above are run on PlatEMO, and average results over 20 independent runs are reported. In the proposed algorithm, the external population size EN is about 2N; the crossover probability of the SBX operator is 1; T is 0.1N; J is 0.9; CR is 0.5; and F is 0.5. Other settings follow the standard of the CEC2018 MaOP competition. The algorithms are run in MATLAB on a PC with an Intel Core i5-3210M CPU (2.50 GHz on a single core) and the Windows 7 operating system.

3.2. Performance Metrics

The comparison experiments employ the inverted generational distance (IGD) to judge the performance of these algorithms:

(7) IGD(P*, P) = ( Σ_{x∈P*} d(x, P) ) / |P*|,

where P* is a set of points uniformly sampled over the true PF, P is the population obtained by an MOEA, and d(x, P) is the Euclidean distance between x and its nearest neighbor in P.

IGD comprehensively measures the convergence and diversity of the population: the smaller the IGD value, the closer the population is to the Pareto front. For each problem, around 10000 points on the Pareto front are uniformly sampled to calculate IGD. In addition, the Wilcoxon rank-sum test with a significance level of 0.05 is used to statistically compare the algorithms' mean IGD values.
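Equation (7) reduces to a few lines of NumPy; this sketch computes the exact pairwise-distance form, which is fine for the reference-set and population sizes used here:

```python
import numpy as np

def igd(P_star, P):
    # IGD(P*, P) = (1/|P*|) * sum over x in P* of d(x, P), as in Eq. (7)
    # Pairwise Euclidean distances between reference points and population.
    d = np.linalg.norm(P_star[:, None, :] - P[None, :, :], axis=2)
    return d.min(axis=1).mean()
```

For example, with a reference set {(0,0), (1,1)} and a population containing only (0,0), the nearest-neighbor distances are 0 and √2, giving an IGD of √2/2.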

3.3. Comparative Studies

Table 1 shows the mean and standard deviation of the IGD values obtained by the five MaOEAs on the 5-, 10-, and 15-objective test problems over 20 independent runs. The best result for each test problem is the smallest mean IGD, and "+," "=," and "−" mean that the proposed algorithm is better than, statistically equivalent to, and worse than the compared algorithm, respectively.

Table 1: The mean and standard deviation of the IGD values obtained by SHEA, NSGAIII, MOEA/DD, KnEA, and RVEA, where "+," "=," and "−" mean that SHEA is better than, statistically equivalent to, and worse than the compared algorithm.

Problem | SHEA | NSGAIII | MOEA/DD | KnEA | RVEA
MaF1-5 | 1.1614e−1 (1.32e−3) | 2.0830e−1 (1.00e−2) + | 3.0206e−1 (1.07e−2) + | 1.3136e−1 (1.91e−3) = | 3.2908e−1 (6.21e−2) +
MaF2-5 | 9.6828e−2 (2.82e−3) | 1.3012e−1 (2.56e−3) = | 1.3665e−1 (3.72e−3) = | 1.3734e−1 (3.62e−3) = | 1.2736e−1 (1.42e−3) =
MaF3-5 | 6.5417e−2 (2.81e−3) | 9.7048e−2 (1.52e−3) = | 1.1693e−1 (1.75e−3) + | 1.6971e−1 (9.55e−2) + | 8.0906e−2 (6.70e−3) =
MaF4-5 | 1.8707e+0 (4.09e−2) | 3.2220e+0 (5.56e−1) + | 7.7080e+0 (2.14e−1) + | 2.9005e+0 (2.52e−1) + | 4.8145e+0 (1.30e+0) +
MaF5-5 | 1.8660e+0 (3.63e−2) | 2.5845e+0 (1.15e+0) + | 6.1823e+0 (1.00e+0) + | 2.6408e+0 (8.04e−2) + | 2.3218e+0 (3.07e−1) +
MaF6-5 | 3.9514e−3 (4.97e−4) | 4.9054e−2 (9.12e−3) + | 7.6075e−2 (4.42e−3) + | 8.0721e−3 (2.28e−3) = | 9.6534e−2 (3.42e−2) +
MaF7-5 | 2.7153e−1 (9.43e−3) | 3.4413e−1 (1.20e−2) + | 1.7891e+0 (8.07e−1) + | 3.2757e−1 (8.09e−3) + | 4.4844e−1 (1.10e−3) +
MaF8-5 | 8.9494e−2 (2.04e−3) | 2.1547e−1 (2.19e−2) + | 3.3063e−1 (3.61e−2) + | 2.9874e−1 (8.77e−2) + | 4.8799e−1 (8.73e−2) +
MaF9-5 | 9.3455e−2 (3.24e−3) | 6.5706e−1 (1.32e−1) + | 2.5294e−1 (1.34e−2) + | 5.6348e−1 (1.82e−1) + | 3.6742e−1 (7.00e−2) +
MaF10-5 | 5.9695e−1 (6.92e−2) | 4.2888e−1 (3.81e−3) − | 5.4591e−1 (3.64e−2) − | 5.1174e−1 (7.85e−3) − | 4.3054e−1 (4.90e−3) −
MaF11-5 | 9.5446e−1 (1.30e−1) | 4.6337e−1 (1.69e−3) − | 5.7805e−1 (9.69e−3) − | 5.7122e−1 (2.10e−2) − | 4.4446e−1 (8.16e−3) −
MaF12-5 | 1.0185e+0 (1.27e−2) | 1.1183e+0 (4.00e−3) + | 1.2858e+0 (1.43e−2) + | 1.2609e+0 (1.70e−2) + | 1.1224e+0 (2.51e−3) +
MaF13-5 | 1.0079e−1 (3.64e−3) | 2.9677e−1 (5.22e−2) + | 2.4087e−1 (2.54e−2) + | 2.2221e−1 (2.01e−2) + | 6.6957e−1 (1.36e−1) +
MaF14-5 | 5.7391e−1 (1.04e−1) | 7.9085e−1 (3.74e−1) + | 3.8193e−1 (8.93e−2) − | 7.6615e−1 (2.77e−1) + | 7.1001e−1 (2.02e−1) +
MaF15-5 | 8.3177e−1 (3.31e−2) | 1.0511e+0 (4.49e−2) + | 5.9102e−1 (4.35e−2) − | 3.4169e+0 (1.84e+0) + | 6.1117e−1 (4.50e−2) −
MaF1-10 | 3.0459e−1 (1.30e−2) | 3.1563e−1 (7.02e−3) = | 4.8879e−1 (4.58e−2) + | 2.4052e−1 (2.33e−3) − | 6.6423e−1 (8.71e−2) +
MaF2-10 | 1.9990e−1 (4.87e−3) | 2.3692e−1 (2.48e−2) + | 2.9196e−1 (7.15e−2) + | 1.6521e−1 (7.85e−3) = | 4.8292e−1 (1.82e−1) +
MaF3-10 | 8.3915e−2 (2.62e−3) | 9.3337e−1 (3.12e+0) + | 1.1204e−1 (1.03e−3) = | 1.1840e+9 (5.06e+9) + | 9.8006e−2 (5.33e−3) =
MaF4-10 | 9.3360e+1 (3.79e+1) | 1.2678e+2 (6.34e+0) + | 4.3124e+2 (1.96e+1) + | 7.1006e+1 (6.77e+0) − | 2.3329e+2 (5.10e+1) +
MaF5-10 | 6.0339e+1 (6.65e+0) | 1.1938e+2 (2.75e−1) + | 2.9174e+2 (7.70e+0) + | 8.1774e+1 (5.26e+0) + | 1.0827e+2 (1.73e+1) +
MaF6-10 | 3.9073e−3 (3.35e−4) | 2.1879e−1 (4.79e−2) + | 1.2060e−1 (8.53e−3) + | 8.4648e+0 (7.78e+0) + | 3.3999e−1 (2.44e−1) +
MaF7-10 | 9.6508e−1 (1.66e−2) | 1.1620e+0 (7.18e−2) + | 2.7026e+0 (3.72e−1) + | 9.4855e−1 (1.01e−2) = | 2.4904e+0 (3.56e−1) +
MaF8-10 | 1.3864e−1 (3.50e−3) | 4.6344e−1 (6.16e−2) + | 9.1389e−1 (2.39e−2) + | 2.6509e−1 (3.86e−2) + | 1.0244e+0 (2.14e−1) +
MaF9-10 | 8.9161e−1 (1.92e−1) | 9.5771e−1 (3.22e−1) + | 5.9743e−1 (2.33e−3) − | 8.5718e+1 (6.92e+1) + | 1.1161e+0 (2.69e−1) +
MaF10-10 | 1.2266e+0 (7.28e−2) | 1.0965e+0 (4.85e−2) − | 1.2608e+0 (2.88e−2) = | 1.2117e+0 (5.05e−2) = | 1.1948e+0 (1.04e−1) =
MaF11-10 | 4.2556e+0 (6.29e−1) | 1.3422e+0 (1.50e−1) − | 1.4289e+0 (1.06e−2) − | 1.3490e+0 (4.51e−2) − | 1.3839e+0 (4.88e−2) −
MaF12-10 | 4.3224e+0 (5.40e−2) | 5.1072e+0 (1.42e−1) + | 6.0813e+0 (2.79e−1) + | 5.2874e+0 (6.12e−2) + | 4.8913e+0 (4.87e−2) +
MaF13-10 | 1.6901e−1 (1.21e−2) | 4.1263e−1 (6.56e−2) + | 4.4820e−1 (3.29e−2) + | 2.1228e−1 (2.37e−2) + | 9.3878e−1 (2.89e−1) +
MaF14-10 | 8.7548e−1 (1.52e−1) | 1.4523e+0 (5.34e−1) + | 5.3428e−1 (6.22e−2) − | 1.7663e+2 (2.00e+2) + | 6.6978e−1 (5.91e−2) −
MaF15-10 | 2.0029e+0 (1.32e−1) | 3.0468e+0 (4.38e+0) + | 1.0133e+0 (6.38e−2) − | 1.9364e+1 (8.54e+0) + | 1.0594e+0 (4.47e−2) −
MaF1-15 | 4.1181e−1 (1.89e−2) | 3.3510e−1 (7.36e−3) − | 6.3920e−1 (4.13e−2) + | 3.3929e−1 (3.83e−2) − | 7.3558e−1 (5.35e−2) +
MaF2-15 | 2.2440e−1 (1.08e−2) | 2.4645e−1 (2.39e−2) = | 3.1534e−1 (3.10e−2) + | 1.9241e−1 (5.26e−3) = | 7.8173e−1 (8.16e−2) +
MaF3-15 | 1.0297e−1 (2.22e−2) | 1.7607e+0 (4.79e+0) + | 1.1719e−1 (1.28e−3) = | 2.2295e+9 (4.30e+9) + | 9.6650e−2 (7.32e−3) =
MaF4-15 | 1.8657e+4 (6.70e+4) | 4.5055e+3 (4.36e+2) − | 1.5432e+4 (2.45e+3) − | 1.7224e+3 (2.01e+2) − | 7.7257e+3 (1.96e+3) −
MaF5-15 | 2.2259e+3 (4.00e+2) | 3.1331e+3 (3.77e+1) + | 7.3038e+3 (6.71e+1) + | 2.0495e+3 (8.e+1) − | 3.2985e+3 (2.82e+2) +
MaF6-15 | 4.5840e−3 (7.29e−4) | 3.7141e−1 (1.41e−1) + | 1.6153e−1 (3.43e−3) + | 4.8429e+1 (9.21e+0) + | 1.9771e−1 (1.17e−1) +
MaF7-15 | 1.7351e+0 (4.48e−2) | 7.7037e+0 (9.37e−1) + | 3.3764e+0 (7.80e−2) + | 2.4545e+0 (2.36e−1) + | 4.3845e+0 (1.58e+0) +
MaF8-15 | 1.6721e−1 (2.97e−3) | 4.0978e−1 (4.32e−2) + | 1.5460e+0 (2.82e−2) + | 1.9336e−1 (8.95e−3) = | 1.1960e+0 (1.92e−1) +
MaF9-15 | 2.0922e−1 (1.44e−2) | 2.2160e+0 (4.13e+0) + | 1.3164e+0 (2.41e+0) + | 5.2669e−1 (4.32e−1) + | 1.6125e+0 (3.85e−1) +
MaF10-15 | 1.7186e+0 (6.28e−2) | 1.6348e+0 (8.58e−2) − | 1.9986e+0 (4.06e−2) + | 1.6232e+0 (5.10e−2) − | 1.7417e+0 (9.08e−2) +
MaF11-15 | 8.8230e+0 (1.22e+0) | 1.8586e+0 (8.91e−2) − | 2.1955e+0 (6.49e−3) − | 1.7816e+0 (6.20e−2) − | 1.9469e+0 (8.30e−2) −
MaF12-15 | 8.4176e+0 (1.66e−1) | 8.8410e+0 (9.87e−2) + | 1.1457e+1 (3.64e−1) + | 7.3382e+0 (1.38e−1) − | 9.1349e+0 (6.05e−2) +
MaF13-15 | 1.8693e−1 (1.01e−2) | 3.8155e−1 (1.01e−1) + | 6.1376e−1 (1.12e−1) + | 1.5808e−1 (1.46e−2) = | 1.3296e+0 (4.18e−1) +
MaF14-15 | 2.1776e+0 (9.69e−1) | 1.4120e+0 (7.14e−1) − | 4.4791e−1 (7.87e−2) − | 4.2956e+1 (4.09e+1) + | 7.8507e−1 (2.14e−1) −
MaF15-15 | 4.3682e+0 (9.00e−1) | 7.6508e+0 (1.22e+1) + | 1.1636e+0 (4.10e−2) − | 1.4233e+2 (2.31e+1) + | 1.1431e+0 (3.90e−2)
+/−/= | | 32/9/4 | 29/12/4 | 25/11/9 | 30/10/5

For each test problem, the best performance corresponds to the smallest mean IGD value.

On all forty-five problems in Table 1, SHEA statistically outperforms all of the compared algorithms on 21 problems, which reveals the good performance of the proposed algorithm in terms of IGD. NSGAIII, MOEA/DD, KnEA, and RVEA perform better than SHEA on four, four, nine, and five problems, respectively, while SHEA does better than NSGAIII, MOEA/DD, KnEA, and RVEA on thirty-two, twenty-nine, twenty-five, and thirty problems, respectively.

Among the fifteen MaF problems, eight (MaF1, MaF2, MaF4, MaF5, MaF7, MaF8, MaF9, and MaF15) have partial PFs, whose projections do not fully cover the unit hyperplane. On these, the mean IGDs of SHEA are smaller than those of NSGAIII, MOEA/DD, KnEA, and RVEA on twenty-two, nineteen, sixteen, and twenty problem instances, respectively. For the six problems (MaF3, MaF10, MaF11, MaF12, MaF13, and MaF14) whose PF projections fully cover the unit hyperplane, the mean IGDs of SHEA are smaller than those of NSGAIII, MOEA/DD, KnEA, and RVEA on eleven, eleven, twelve, and twelve problem instances, respectively. As for MaF6, whose PF is degenerate, SHEA is superior to NSGAIII, MOEA/DD, KnEA, and RVEA on its three instances in terms of IGD. All of these comparison results indicate the best overall performance of SHEA on most problems and demonstrate the effectiveness of the simplified hypervolume in estimating convergence and diversity.

4. Conclusions

To simplify the calculation of the hypervolume, a new simplified hypervolume is proposed to roughly estimate the convergence and diversity of solutions; the new measure is then used in the selection operator and in the update strategy of the external population. Comparative experiments with four state-of-the-art algorithms indicate the good performance of the proposed algorithm.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (nos. 61806120, 61502290, 61401263, 61672334, and 61673251), China Postdoctoral Science Foundation (no. 2015M582606), Industrial Research Project of Science and Technology in Shaanxi Province (nos. 2015GY016 and 2017JQ6063), Fundamental Research Fund for the Central Universities (no. GK202003071), and Natural Science Basic Research Plan in Shaanxi Province of China (no. 2016JQ6045).

References

1. Lamont G. B., Evolutionary Algorithms for Solving Multi-Objective Problems, Springer US, Boston, MA, USA, 2007.
2. Deb K., Pratap A., Agarwal S., Meyarivan T., "A fast and elitist multiobjective genetic algorithm: NSGA-II," IEEE Transactions on Evolutionary Computation, vol. 6, no. 2, pp. 182-197, 2002. doi:10.1109/4235.996017
3. Dong N., Dai C., "An improvement decomposition-based multi-objective evolutionary algorithm using multi-search strategy," Knowledge-Based Systems, vol. 163, pp. 572-580, 2019. doi:10.1016/j.knosys.2018.09.018
4. Zitzler E., Simon K., "Indicator-based selection in multiobjective search," in Proceedings of the International Conference on Parallel Problem Solving from Nature, Springer, Berlin, Germany, September 2004, pp. 832-842.
5. Zitzler E., Thiele L., "Multiobjective evolutionary algorithms: a comparative case study and the strength Pareto approach," IEEE Transactions on Evolutionary Computation, vol. 3, no. 4, pp. 257-271, 1999. doi:10.1109/4235.797969
6. Hughes E. J., "Evolutionary many-objective optimisation: many once or one many?" in Proceedings of the 2005 IEEE Congress on Evolutionary Computation, Edinburgh, UK, September 2005, pp. 222-227.
7. Dai C., Wang Y., Ye M., "A new evolutionary algorithm based on contraction method for many-objective optimization problems," Applied Mathematics and Computation, vol. 245, pp. 191-205, 2014. doi:10.1016/j.amc.2014.07.069
8. Zou X. F., Chen Y., Liu M. Z., Kang L. S., "A new evolutionary algorithm for solving many-objective optimization problems," IEEE Transactions on Systems, Man, and Cybernetics, Part B, vol. 38, no. 5, pp. 1402-1412, 2008.
9. Yang S., Li M., Liu X., Zheng J., "A grid-based evolutionary algorithm for many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 17, no. 5, pp. 721-736, 2013. doi:10.1109/tevc.2012.2227145
10. Zhang Q. F., Li H., "MOEA/D: a multiobjective evolutionary algorithm based on decomposition," IEEE Transactions on Evolutionary Computation, vol. 11, no. 6, pp. 712-731, 2007.
11. Jiang S., Yang S., "An improved multiobjective optimization evolutionary algorithm based on decomposition for complex Pareto fronts," IEEE Transactions on Cybernetics, vol. 46, no. 2, pp. 421-437, 2016. doi:10.1109/tcyb.2015.2403131
12. Elarbi M., Bechikh S., Gupta A., Ben Said L., Ong Y.-S., "A new decomposition-based NSGA-II for many-objective optimization," IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 48, no. 7, pp. 1191-1210, 2018. doi:10.1109/tsmc.2017.2654301
13. Wang L., Zhang Q., Zhou A., Gong M., Jiao L., "Constrained subproblems in a decomposition-based multiobjective evolutionary algorithm," IEEE Transactions on Evolutionary Computation, vol. 20, no. 3, pp. 475-480, 2016. doi:10.1109/tevc.2015.2457616
14. Liu H.-L., Gu F., Zhang Q., "Decomposition of a multiobjective optimization problem into a number of simple multiobjective subproblems," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 450-455, 2014. doi:10.1109/tevc.2013.2281533
15. Asafuddoula M., Ray T., Sarker R., "A decomposition-based evolutionary algorithm for many objective optimization," IEEE Transactions on Evolutionary Computation, vol. 19, no. 3, pp. 445-460, 2015. doi:10.1109/tevc.2014.2339823
16. Deb K., Jain H., "An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, Part I: solving problems with box constraints," IEEE Transactions on Evolutionary Computation, vol. 18, no. 4, pp. 577-601, 2014. doi:10.1109/tevc.2013.2281535
17. Asafuddoula M., Singh H. K., Ray T., "An enhanced decomposition-based evolutionary algorithm with adaptive reference vectors," IEEE Transactions on Cybernetics, vol. 48, no. 8, pp. 2321-2334, 2018.
18. Lin H., Chen L., Zhang Q., Deb K., "Adaptively allocating search effort in challenging many-objective optimization problems," IEEE Transactions on Evolutionary Computation, vol. 22, no. 3, pp. 433-448, 2018.
19. Auger A., Bader J., Brockhoff D., Zitzler E., "Theory of the hypervolume indicator: optimal μ-distributions and the choice of the reference point," in Proceedings of the Foundations of Genetic Algorithms X, Orlando, FL, USA, 2009, pp. 87-102.
20. Zitzler E., Künzli S., "Indicator-based selection in multiobjective search," Lecture Notes in Computer Science, pp. 832-842, 2004. doi:10.1007/978-3-540-30217-9_84
21. Beume N., Naujoks B., Emmerich M., "SMS-EMOA: multiobjective selection based on dominated hypervolume," European Journal of Operational Research, vol. 181, no. 3, pp. 1653-1669, 2007. doi:10.1016/j.ejor.2006.08.008
22. Bader J., Zitzler E., "HypE: an algorithm for fast hypervolume-based many-objective optimization," Evolutionary Computation, vol. 19, no. 1, pp. 45-76, 2011. doi:10.1162/evco_a_00009
23. Price K. V., Storn R. M., Lampinen J. A., Differential Evolution: A Practical Approach to Global Optimization, Natural Computing Series, 2005.
24. Li K., Deb K., Zhang Q., Kwong S., "An evolutionary many-objective optimization algorithm based on dominance and decomposition," IEEE Transactions on Evolutionary Computation, vol. 19, no. 5, pp. 694-716, 2015. doi:10.1109/tevc.2014.2373386
25. Zhang X., Tian Y., Jin Y., "A knee point-driven evolutionary algorithm for many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 19, no. 6, pp. 761-776, 2015. doi:10.1109/tevc.2014.2378512
26. Cheng R., Jin Y., Olhofer M., Sendhoff B., "A reference vector guided evolutionary algorithm for many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 5, pp. 773-791, 2016. doi:10.1109/tevc.2016.2519378
27. Cheng R., Li M., Tian Y., et al., "Benchmark functions for CEC'2018 competition on many objective optimization," Tech. Rep., CERCIA, School of Computer Science, University of Birmingham, Edgbaston, Birmingham B15 2TT, UK, 2017.
28. Tian Y., Cheng R., Zhang X., Jin Y., "PlatEMO: a MATLAB platform for evolutionary multi-objective optimization [educational forum]," IEEE Computational Intelligence Magazine, vol. 12, no. 4, pp. 73-87, 2017. doi:10.1109/mci.2017.2742868
29. Zitzler E., Thiele L., Laumanns M., Fonseca C. M., da Fonseca V. G., "Performance assessment of multiobjective optimizers: an analysis and review," IEEE Transactions on Evolutionary Computation, vol. 7, no. 2, pp. 117-132, 2003. doi:10.1109/tevc.2003.810758
30. Robert S., Torrie J., Dickey D., Principles and Procedures of Statistics: A Biometrical Approach, McGraw-Hill, New York, NY, USA, 1997.