A Simplified Hypervolume-Based Evolutionary Algorithm for Many-Objective Optimization

Evolutionary algorithms based on the hypervolume indicator have demonstrated good performance on many-objective optimization problems. However, computing the hypervolume requires prohibitively expensive computational effort. This paper proposes a simplified hypervolume calculation method that can roughly evaluate the convergence and diversity of solutions. The main idea is to use the nearest neighbors of a particular solution to calculate a volume that serves as that solution's hypervolume value. Moreover, this paper improves the selection operator and the update strategy of the external population according to the simplified hypervolume. The proposed algorithm (SHEA) is then compared with several state-of-the-art algorithms on the fifteen test functions of the CEC2018 MaOP competition, and the experimental results demonstrate the feasibility of the proposed approach.


Introduction
Multiobjective optimization problems (MOPs) arise in numerous real-world applications. A minimization MOP, which often has two or three objectives, can be defined as follows [1]:

minimize F(x) = (f_1(x), ..., f_m(x)), subject to x ∈ Ω,

where Ω ⊆ R^n is an n-dimensional decision space; x = (x_1, ..., x_n) ∈ Ω is an n-dimensional decision variable; and F: Ω → R^m (m = 2 or 3) consists of m mutually conflicting objective functions.
In the last few decades, many multiobjective evolutionary algorithms (MOEAs) [2][3][4][5] have been proposed to solve MOPs. However, when solving MOPs with more than three objectives, also known as many-objective optimization problems (MaOPs) [6], these MOEAs encounter challenges. First, the proportion of nondominated solutions among candidate solutions rises steeply as the number of objectives increases, which severely weakens the selection pressure toward the Pareto front (PF). Second, the population size cannot be arbitrarily large for reasons of computational efficiency, but a limited number of solutions tend to lie far from each other in a high-dimensional objective space, causing offspring to stray far from their parents. Lastly, the computational complexity of performance metrics grows exponentially with the number of objectives.
To address these problems, many-objective evolutionary algorithms (MaOEAs) fall into three main categories. The first category modifies the dominance relationship [7][8][9] to enhance the selection pressure toward the PF. This idea has been widely employed and has yielded considerable improvement; however, these approaches require additional effort in designing diversity maintenance mechanisms. The second category uses decomposition-based methods to solve MaOPs. The main idea is to decompose a many-objective optimization problem into a set of subproblems and optimize them collaboratively. The most representative algorithms are MOEA/D [10] and its variants [11][12][13]; other decomposition-based methods include MOEA/D-M2M [14] and DBEA [15] [16][17][18]. These approaches are adept at maintaining diversity and avoiding local optima but are ineffective on highly irregular PFs. The third category comprises indicator-based evolutionary algorithms. Indicators such as the hypervolume [19] weigh both the convergence and diversity of solutions to enhance the selection pressure and guide the search toward the PF. IBEA [20], SMS-EMOA [21], and HypE [22] are three classical indicator-based evolutionary algorithms. Unfortunately, their computational cost becomes excessively expensive because of the high computational complexity of the indicator.
The key question, then, is how to reduce the computational complexity while keeping the advantages of the hypervolume indicator. The major contributions of this paper can be summarized as follows.
(1) A simplified hypervolume calculation method is proposed to roughly evaluate the convergence and diversity of solutions.

(2) To enhance the quality of offspring, a new selection operator based on the simplified hypervolume is proposed to choose excellent parents.

(3) The simplified hypervolume, together with nondomination, is used in a new update strategy of the external population to store solutions with good convergence and diversity.

In the remainder of this paper, Section 2 describes the proposed algorithm. Section 3 then presents the experimental results and related analysis. Finally, conclusions are given in Section 4.

The Proposed Algorithm
A simplified hypervolume-based evolutionary algorithm for many-objective optimization (SHEA) is proposed to solve MaOPs. The core contribution of this paper is a new hypervolume calculation method that roughly evaluates the convergence and diversity of solutions. This simplified hypervolume is then used to improve the selection operator and the update strategy of the external population.

Simplified Hypervolume.
To compute the hypervolume values, the normalized population P is sorted by each objective function value. For each solution in the sequence sorted by objective j, the reference point is the component-wise maximum of the two solutions on either side of the particular solution. Thereafter, the volume of the box between the particular solution and this reference point is calculated:

v_i^j = ∏_{k=1}^{m} (max(f_k(x_{i−1}), f_k(x_{i+1})) − f_k(x_i) + z),   (2)

where x_{i−1} and x_{i+1} are the neighbors of x_i in the sequence sorted by objective j and z is a small positive number. As for the boundary solutions, the volume is calculated with the single adjacent solution only, and the terms in which the objective function value of x_i is a boundary value are removed. Figure 1 illustrates the calculation of v_i^j; to make it easier to understand, the MOP in Figure 1 has only two objectives. Then the minimum of the particular solution's volumes over all objectives is kept as its hypervolume value:

SHV(x_i) = min_{j=1,...,m} v_i^j.   (3)
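The neighbor-based calculation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `simplified_hv` is hypothetical, and the boundary case is handled by the small constant `z` alone rather than by removing terms, which is a simplification of the rule stated in the text.

```python
import numpy as np

def simplified_hv(F, z=1e-6):
    """Rough per-solution hypervolume estimate (sketch).

    F : (N, m) array of normalized objective values.
    For each objective j, solutions are sorted by f_j; a solution's
    reference point is the component-wise maximum of its neighbors in
    that ordering, v_i^j is the volume of the box between the solution
    and that reference point, and the final value is the minimum of
    v_i^j over all objectives j.
    """
    N, m = F.shape
    V = np.full((N, m), np.inf)
    for j in range(m):
        order = np.argsort(F[:, j])
        for pos, i in enumerate(order):
            nbrs = []
            if pos > 0:
                nbrs.append(F[order[pos - 1]])
            if pos < N - 1:
                nbrs.append(F[order[pos + 1]])  # boundary: one neighbor only
            ref = np.max(nbrs, axis=0)  # component-wise max of the neighbors
            # box volume; z keeps every factor positive (simplified
            # boundary handling compared with the rule in the text)
            diff = np.maximum(ref - F[i], 0.0) + z
            V[i, j] = np.prod(diff)
    return V.min(axis=1)
```

Sorting dominates the cost, so the estimate is obtained in O(m·N·log N) comparisons per generation instead of the exponential cost of the exact hypervolume.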

Selection Operator.
The new selection operator aims to choose parents with good convergence and diversity so as to generate high-quality offspring with differential evolution [23]:

y_i = x_i + F · (x_i^{r1} − x_i^{r2}) if rand < CR, and y_i = x_i otherwise, for i = 1, ..., n,

where x_i^{r1} (i = 1, ..., n) denotes the ith dimension of x^{r1} in the decision space, CR is the crossover rate, F is the scaling factor, and x^{r1} and x^{r2} are selected among the T nearest neighbors of solution x.
The new selection operator calculates the simplified hypervolume of the nondominated neighbors and retains the top H solutions. To accelerate convergence, x^{r1} is chosen as the solution minimizing the Tchebycheff function among the H retained solutions:

g^{te}(x | λ, z*) = max_{1≤k≤m} λ_k |f_k(x) − z_k*|,

where λ = (λ_1, ..., λ_m)^T is a given weight vector and z* = (z_1*, ..., z_m*) is the ideal point. Then x^{r2} is chosen randomly among the neighbors other than x^{r1}.
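The two ingredients of the operator, Tchebycheff aggregation and DE-style variation, can be sketched as below. The function names `tchebycheff` and `de_offspring` are illustrative only, and the DE variant shown (per-dimension perturbation with probability CR) is one common reading of the formula above.

```python
import numpy as np

def tchebycheff(f, lam, z_star):
    # Classical Tchebycheff aggregation: g(x | lambda, z*) = max_k lam_k * |f_k - z_k*|
    return np.max(lam * np.abs(f - z_star))

def de_offspring(x, x_r1, x_r2, F_scale=0.5, CR=1.0, rng=None):
    # DE-style variation: each dimension of x is perturbed by
    # F * (x_r1 - x_r2) with probability CR, otherwise kept unchanged.
    if rng is None:
        rng = np.random.default_rng()
    y = x.copy()
    mask = rng.random(x.size) < CR
    y[mask] = x[mask] + F_scale * (x_r1[mask] - x_r2[mask])
    return y
```

In the operator, `tchebycheff` would be evaluated on the H retained neighbors to pick x^{r1}, after which `de_offspring` produces the child.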

Update Strategy of External Population.
The external population is adopted to store solutions with good convergence and diversity. Among the nondominated solutions of the population, those with smaller simplified hypervolume values are deleted to keep the external population within its size limit. To retain diversity, m solutions of the external population are selected from the boundary solutions; the others come from the intermediate solutions.
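A minimal sketch of this trimming rule follows, assuming the inputs are already nondominated. The function name `update_external` is hypothetical, and "boundary solutions" are taken here to be the minimizer of each objective, which is one plausible reading of the text.

```python
import numpy as np

def update_external(EP_F, EN, shv):
    """Trim a set of nondominated objective vectors EP_F (K, m) to EN
    members: the m boundary solutions (the minimizer of each objective)
    are always kept, and the remaining slots are filled with the
    solutions having the largest simplified hypervolume values shv (K,).
    Returns the sorted indices of the retained solutions."""
    K, m = EP_F.shape
    keep = set(int(np.argmin(EP_F[:, k])) for k in range(m))
    # remaining candidates in decreasing order of simplified hypervolume
    rest = [i for i in np.argsort(-shv) if i not in keep]
    for i in rest:
        if len(keep) >= EN:
            break
        keep.add(i)
    return sorted(keep)
```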

Steps of the Proposed Algorithm. SHEA works as follows (Algorithm 1):
To update the population, the solutions in the set O are sorted by nondomination and added to the set SP from the first nondomination rank to the last, until the total number of solutions in SP exceeds N. Then the cosine between each solution in SP and each weight vector is calculated, and each solution is assigned to the weight vector with the maximum cosine. For each weight vector λ^i whose class is not empty, the solution minimizing the modified Tchebycheff function

g(x | λ, z*) = max_{1≤k≤m} (1/λ_k) |f_k(x) − z_k*|

is saved. Otherwise, the solution with the minimum modified Tchebycheff function in the whole of SP is selected.
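The cosine-based classification step can be sketched as follows. This is an illustration under stated assumptions: the function name `classify_and_select` is hypothetical, cosines are measured on translated objective vectors f(x) − z*, and the modified Tchebycheff form with weights 1/λ_k is assumed.

```python
import numpy as np

def classify_and_select(SP_F, W, z_star):
    """Assign each row of SP_F (K, m objective vectors) to the weight
    vector in W (N, m) with maximum cosine similarity, then keep, for
    each nonempty class j, the solution minimizing the modified
    Tchebycheff value max_k |f_k - z_k*| / lam_k.
    Returns {weight index: chosen solution index}."""
    Fc = SP_F - z_star
    cos = (Fc @ W.T) / (np.linalg.norm(Fc, axis=1, keepdims=True)
                        * np.linalg.norm(W, axis=1) + 1e-12)
    assign = np.argmax(cos, axis=1)  # class of each solution
    chosen = {}
    for i, j in enumerate(assign):
        g = np.max(np.abs(Fc[i]) / np.maximum(W[j], 1e-12))
        if j not in chosen or g < chosen[j][0]:
            chosen[j] = (g, i)
    return {j: i for j, (g, i) in chosen.items()}
```

Weight vectors with an empty class would then fall back to searching the whole of SP, as described above.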
In the proposed algorithm, 2N + EN solutions (the population, the offspring, and the external population) are maintained in each generation.

Experimental Settings.
In this section, the proposed algorithm is compared with four state-of-the-art algorithms, NSGAIII [16], MOEA/DD [24], KnEA [25], and RVEA [26], on fifteen many-objective benchmark functions (MaF) from the CEC2018 MaOP competition [27]. Each problem is tested with 5, 10, and 15 objectives. NSGAIII [16] supplies and adaptively updates well-spread reference points to maintain diversity among population members. MOEA/DD [24] exploits the merits of both dominance- and decomposition-based approaches.

Algorithm 1: The framework of the algorithm SHEA. (Main loop: use the selection operator to generate offspring; use the update strategy to update O; use the update strategy of the external population of Section 2.3 to update EP; end while.)

Table 1: The mean and standard deviation values of the IGD obtained by SHEA, NSGAIII, MOEA/DD, KnEA, and RVEA, where "+," "=," and "−" mean SHEA is better than, the same as, and worse than the compared algorithm.

Performance Metrics.
The comparison experiments employ the inverted generational distance (IGD) [29] to judge the performance of the algorithms:

IGD(P*, P) = (1 / |P*|) Σ_{x* ∈ P*} d(x*, P),

where P* is a set of points uniformly sampled over the true PF, P is the population obtained by an MOEA, and d(x*, P) is the Euclidean distance between x* and its nearest neighbor in P.
IGD comprehensively measures the convergence and diversity of a population: the smaller the IGD value, the closer the population is to the Pareto front. For each problem, around 10000 points on the Pareto front are uniformly sampled to calculate the IGD. For statistical comparison, the Wilcoxon rank-sum test [30] with a significance level of 0.05 is used to compare the algorithms' mean IGD values. Table 1 shows the mean and standard deviation of the IGD values obtained by the five MaOEAs on the 5-, 10-, and 15-objective test problems over 20 independent runs. The best performance for each test problem is marked in bold, and "+," "=," and "−" mean the proposed algorithm is better than, the same as, and worse than the compared algorithms.
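The IGD formula above amounts to a nearest-neighbor distance average and can be computed directly. A minimal sketch (the function name `igd` is illustrative):

```python
import numpy as np

def igd(P_star, P):
    """Inverted generational distance: mean Euclidean distance from each
    reference point in P_star (K, m) to its nearest neighbor in the
    obtained population P (N, m)."""
    # pairwise distance matrix via broadcasting, shape (K, N)
    d = np.linalg.norm(P_star[:, None, :] - P[None, :, :], axis=2)
    return d.min(axis=1).mean()
```

An identical population and reference set yields IGD = 0, and the value grows as the population drifts from, or fails to cover, the sampled front.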

Comparative Studies.
Over all forty-five problem instances in Table 1, SHEA statistically outperforms all of the compared algorithms on 21 instances, which reveals the good performance of the proposed algorithm in terms of IGD. NSGAIII, MOEA/DD, KnEA, and RVEA behave better than SHEA on four, four, nine, and five instances, respectively, while SHEA does better than NSGAIII, MOEA/DD, KnEA, and RVEA on thirty-two, twenty-nine, twenty-five, and thirty instances.
Among the fifteen MaF problems, eight (F1, F2, F4, F5, F7, F8, F9, and F15) have partial PFs, whose PF projections do not fully cover the unit hyperplane. On these, the mean IGDs of SHEA are smaller than those of NSGAIII, MOEA/DD, KnEA, and RVEA on twenty-two, nineteen, sixteen, and twenty instances, respectively. For the six problems (F3, F10, F11, F12, F13, and F14) whose PF projections fully cover the unit hyperplane, the mean IGDs of SHEA are smaller than those of NSGAIII, MOEA/DD, KnEA, and RVEA on eleven, eleven, twelve, and twelve instances, respectively. As for problem F6, whose PF is degenerate, SHEA is superior to NSGAIII, MOEA/DD, KnEA, and RVEA on all three instances in terms of IGD. All of these comparison results indicate the best overall performance of SHEA on most problems and demonstrate the effectiveness of the simplified hypervolume in estimating convergence and diversity.

Conclusions
To simplify the calculation of the hypervolume, a new simplified hypervolume is proposed to roughly estimate the convergence and diversity of solutions; the new measure is then used in the selection operator and in the update strategy of the external population. The experimental comparison with four state-of-the-art algorithms indicates the good performance of the proposed algorithm.

Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest
The authors declare that they have no conflicts of interest.