For multiobjective optimization problems, different optimization variables influence the objectives to different degrees, which implies that attention should be allocated to the variables according to their sensitivity. However, previous optimization studies have either ignored variable sensitivity or conducted the sensitivity analysis independently of the optimization. In this paper, an integrated algorithm is proposed that combines the optimization method SPEA (Strength Pareto Evolutionary Algorithm) with the sensitivity analysis method SRCC (Spearman Rank Correlation Coefficient). In the proposed algorithm, the optimization individuals serve as samples for sensitivity analysis, and the resulting sensitivities guide the optimization process by adjusting the evolutionary parameters. Three cases, a mathematical problem, an airship envelope optimization, and a truss topology optimization, are used to demonstrate the computational efficiency of the integrated algorithm. The results show that this algorithm simultaneously obtains parameter sensitivities and a well-distributed Pareto optimal set, without greatly increasing the computational time in comparison with the SPEA method.

Multiobjective optimization is widely used in many practical engineering problems. Instead of a single optimal solution, a multiobjective optimization problem (MOOP) with conflicting subobjectives yields a set of compromise solutions, known as the Pareto optimal set [

For most multiobjective optimization methods that obtain the Pareto optimal set in a single run, attention is focused on avoiding local optima or on designing individual sorting and fitness assignment schemes. What is usually ignored, however, is that the influences of different parameters on the model can be disparate, and it may be uneconomical to spend much computation on the secondary parameters. A wiser approach is to give higher priority to the parameters with significant influence on the optimization objectives (a.k.a. “parameters with high sensitivity”).

In this study, a new strategy is proposed that combines parameter sensitivity analysis with multiobjective optimization. During optimization, parameter sensitivity is updated in real time without extra analysis samples and then guides the optimization by setting the parameter priorities.

The rest of this paper consists of the following. Section

An MOOP is typically characterized by many optimization variables. Consider a multiobjective optimization model with five parameters. If each parameter is allocated 8 binary bits, the gene is 40 bits long, and its sample space reaches

To consider parameter sensitivity in the optimization process, a traditional approach is first to conduct a sensitivity analysis and then to discard the insensitive parameters, retaining only the sensitive parameters for optimization. However, this approach has two obvious defects:

In order to eliminate the above defects, a new strategy which integrates sensitivity analysis and optimization is proposed. In this strategy, the sample resources are shared between the sensitivity analysis and the optimization. Meanwhile, results of sensitivity analysis can be directly used to guide the optimization process.

Evolutionary algorithms (EAs), stochastic search methods based on the idea of biological evolution, are widely used and well suited to MOOPs for finding the global optimum [

In this paper, the SPEA is employed, which has been recommended as one of the most efficient multiobjective evolutionary algorithms [

Flowchart of SPEA.

The first key technique of the SPEA is the clustering calculation during the update of the external population. The nondominated solutions of each generation are stored in the external population, whose size must be limited to keep it from growing without bound during the iterations. When the size of the external population exceeds the limit capacity

Another key issue, the calculation of individual fitness, is handled as follows. First, the fitness of an external nondominated individual is defined as the percentage of the individuals covered (dominated) by it [
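The fitness assignment just described can be sketched in a few lines of Python. This is a minimal illustration of the SPEA strength/fitness scheme under the assumption of minimization; the function names and the dominance test are illustrative, not the paper's code:

```python
def dominates(a, b):
    # Pareto dominance for minimization: a is no worse in every objective
    # and strictly better in at least one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def spea_fitness(external, population):
    """SPEA strength/fitness assignment (sketch).

    Each external (nondominated) individual gets a strength equal to the
    fraction of the current population it covers; each population member's
    fitness is 1 plus the strengths of the external members covering it
    (lower fitness is better).
    """
    n = len(population)
    strength = [sum(dominates(e, p) for p in population) / (n + 1)
                for e in external]
    fitness = [1.0 + sum(s for e, s in zip(external, strength) if dominates(e, p))
               for p in population]
    return strength, fitness
```

For instance, with one external individual (1, 1) and a population [(2, 2), (0, 3)], the external individual covers only the first member, giving it strength 1/3 and the two members fitness 4/3 and 1.0.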

A tournament selection mechanism is adopted to choose individuals from both the external population

The elimination strategy “Fuzzy cluster analysis” is initiated when the number of clusters exceeds the limit capacity

Recession of the Pareto front.

To prevent this phenomenon, the SPEA should be modified slightly.

Sensitivity analysis is used to qualitatively, or quantitatively, evaluate the influence of parameters on the output variables [

However, these existing global sensitivity analysis methods cannot be directly embedded in optimization process. The main reasons are as follows:

Requirement for samples: randomness and unbiasedness are two basic properties of the samples used by traditional sensitivity analysis methods. The optimization process based on the GA, however, can only provide biased samples that tend towards the optimal set.

Requirement for parameters: the analysis parameters of the traditional methods should follow certain rules. Assuming the sensitivities of parameters

Time consumption: taking Sobol's method as an example, the two required sample groups would include thousands of samples, and thousands of combined samples are then needed to determine parameter sensitivity.

In mathematical statistics, parameter sensitivity can be considered to reflect the correlation between the input parameters and output variables. Therefore, the correlation concept is applied in sensitivity analysis to overcome the above disadvantages.

The correlation coefficient can be used for sensitivity analysis, as mentioned in the literature [

In the present investigation, the Spearman Rank Correlation Coefficient (SRCC) is used. The concept of the SRCC is inherited from the Pearson Product-Moment Correlation Coefficient (PMCC) [

Monotonically increasing transformation invariance and robustness are two important characteristics of the rank correlation coefficient [

The rank is defined as the position of a raw value in an ascending (or descending) sort. If two values are equal, the average of their ranks is adopted. Table

An example of ranking.

Raw | Rank | Final rank |
---|---|---|
9.1 | 1 | 1 |
7.0 | 2 | 2.5 |
2.6 | 4 | 4 |
7.0 | 3 | 2.5 |
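The average-rank rule illustrated by the table can be sketched as follows; the function name is illustrative:

```python
def average_ranks(values, descending=True):
    """Assign ranks to raw values; tied values share the average of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=descending)
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over the run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks
```

Applied to the raw values 9.1, 7.0, 2.6, 7.0, this yields the final ranks 1, 2.5, 4, 2.5: the two tied values at positions 2 and 3 both receive (2 + 3) / 2 = 2.5.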

Assume that there exist random variables

Under this rank assignment strategy, the SRCC must be recalculated whenever a new individual is generated. For tens of thousands of analyzed individuals, this recalculation is a heavy burden. An alternative is to apply dichotomy (binary search) when inserting into the sorted ranks, instead of the average-rank strategy for duplicated values, which reduces the time complexity from
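One way to realize the dichotomy idea is to maintain the sample in sorted order incrementally, so that the rank of each new individual is found by binary search instead of re-ranking the whole sample. The sketch below uses Python's `bisect` module and is an illustration of the principle, not the paper's exact scheme:

```python
import bisect

class RunningRanks:
    """Keep the sample sorted so each new value is ranked in O(log n)."""

    def __init__(self):
        self._sorted = []

    def add(self, value):
        # binary-search insertion keeps the list sorted
        bisect.insort(self._sorted, value)

    def rank(self, value):
        # 1-based ascending rank; duplicates all take the leftmost position,
        # replacing the average-rank strategy for ties
        return bisect.bisect_left(self._sorted, value) + 1
```

Each insertion costs a binary search plus a list insert, rather than a full re-sort and tie-averaging pass over all previously analyzed individuals.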

The value of SRCC ranges from −1.0 to 1.0, as shown in Figure

Relationship of SRCC value versus variable distribution.

Another characteristic of the SRCC is that all straight lines, regardless of slope, have the same SRCC value of 1.0 (or −1.0 for negative slopes), as shown in Figure

SRCC value of straight lines.

According to the SRCC characteristic, the two straight lines with different slopes have the same sensitivity, as shown in Figure

Curves with different slope.

Output

The SRCC-based sensitivity is determined as follows: first, an input parameter matrix is generated by random sampling; second, the input parameters are varied simultaneously and the corresponding output variables are obtained; third, the influence of the input parameters on the output variables is analyzed statistically. This method is therefore a global, rather than local, sensitivity analysis.
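The three steps above can be sketched as follows. The Spearman coefficient is computed as the Pearson correlation of the ranks; `srcc_sensitivity` and the toy model in the usage note are illustrative names, not the paper's code:

```python
import random

def _ranks(v):
    # ascending 1-based ranks; ties receive the average of their positions
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def srcc(x, y):
    # Spearman rank correlation = Pearson correlation applied to the ranks
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

def srcc_sensitivity(model, bounds, n_samples=500, seed=0):
    # step 1: random input matrix; step 2: evaluate the model on each row;
    # step 3: correlate each input column with the output
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_samples)]
    Y = [model(row) for row in X]
    return [abs(srcc([row[j] for row in X], Y)) for j in range(len(bounds))]
```

For a toy model such as `lambda p: 10 * p[0] + 0.1 * p[1]`, the first input receives a far larger sensitivity than the second. Note also that `srcc([1, 2, 3, 4], [1, 4, 9, 16])` equals 1.0, reflecting the invariance under monotonically increasing transformations mentioned earlier.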

Crossover probability

For an optimization model, which has four variables

Extract the normalized sensitivities, meaning that the sensitivities of all variables with respect to a given objective sum to 1.0, as listed in Table

Sum the sensitivities of each variable over all objectives. In this way, the influence of a variable on all the objectives is considered.

It is well known that the value of evolutionary parameters should not be too large to avoid nonconvergence of the optimization process [

A brief example of modifying evolutionary parameters.

Variables | Sensitivity (objective 1) | Sensitivity (objective 2) | Sensitivity (objective 3) | Sum | Correction coefficient | Crossover probability | Mutation probability |
---|---|---|---|---|---|---|---|
 | 0.56 | 0.52 | 0.36 | 1.44 | 1.00 | 0.400 | 0.020 |
 | 0.13 | 0.30 | 0.32 | 0.75 | 0.52 | 0.208 | 0.010 |
 | 0.25 | 0.09 | 0.23 | 0.57 | 0.40 | 0.160 | 0.008 |
 | 0.06 | 0.09 | 0.09 | 0.24 | 0.17 | 0.068 | 0.003 |

Assume that the original global
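The correction coefficients and per-variable evolutionary parameters in the example table can be reproduced as follows. The function names are illustrative, and a global crossover probability of 0.40 and mutation probability of 0.02 are assumed, matching the most sensitive variable's row:

```python
def correction_coefficients(sens_rows):
    """Each row holds one variable's normalized sensitivities to the objectives.

    Sum each row over the objectives, then divide by the largest sum so the
    most sensitive variable gets a coefficient of 1.0.
    """
    sums = [sum(row) for row in sens_rows]
    s_max = max(sums)
    return [s / s_max for s in sums]

def per_variable_parameters(sens_rows, p_cross=0.40, p_mut=0.02):
    # scale the global crossover/mutation probabilities by each coefficient
    return [(p_cross * c, p_mut * c) for c in correction_coefficients(sens_rows)]
```

For the table's sensitivity rows, the row sums are 1.44, 0.75, 0.57, and 0.24, and the coefficients come out as 1.00, 0.52, 0.40, and 0.17 (to two decimals); the table's crossover and mutation columns are these coefficients multiplied by 0.400 and 0.020.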

In this improved algorithm, SRCC-SPEA, the individuals generated during the optimization process serve as source samples for the sensitivity analysis. The SRCC results then provide the information for setting the optimization priority of the variables. When the sensitivity analysis sample is small, the SRCC results can be unstable and deviate far from the true values [

Process flowchart for integrated algorithm.

Three cases are carried out to verify the practical applicability and superiority of the integrated algorithm SRCC-SPEA.

In this section, a mathematical problem is used to verify the accuracy of optimization method SRCC-SPEA and sensitivity analysis method SRCC. The mathematical problem is defined as

All the evolutionary parameters are listed in Table

The evolutionary parameters.

Number of each generation | Number of external population | Max generation | Cross probability | Mutation probability |
---|---|---|---|---|

50 | 50 | 50 | 0.40 | 0.02 |

The theoretical solution set of this model in the first quadrant is defined as

Comparison of Pareto optimal set between SRCC-SPEA and theory.

Figure

Comparison of parameter sensitivity between SRCC and Sobol’s.

A multiobjective optimization model of an airship envelope, whose geometry is shown in Figure

Profile of a biaxial ellipsoid envelope.

The optimization variables are listed in Table ^{4} m^{2}. The value of

The optimization variables.

Optimization variables | Reference value | Constraints |
---|---|---|
 | 59.5 | 55.0–64.0 |
 | 84.0 | 80.0–88.0 |
 | 400 | 350–450 |
 | 0.20 | 0.10–0.30 |
 | 12.0 | 11.0–13.0 |

The objective function is expressed as ^{3}; the volume of the material reflects the envelope self-weight. The structural strain energy and the maximum envelope stress are defined to indicate the stiffness and ultimate strength of the envelope. The values of the evolutionary parameters follow Table

The three-dimensional Pareto optimal set of the fiftieth generation is shown in Figure

Pareto optimal set of SRCC-SPEA and SPEA.

Pareto optimal set (

Optimal parameters

The time consumption, which depends on the computer capacity, is about 5 hours for both SPEA and SRCC-SPEA. However, SRCC-SPEA additionally yields the sensitivities of the optimization variables; their correction coefficients are listed in Table

Sensitivities and correction coefficients of the optimization variables.

Variables | Sensitivity (objective 1) | Sensitivity (objective 2) | Sensitivity (objective 3) | Correction coefficient |
---|---|---|---|---|
 | 7.47% | 8.48% | 7.08% | 0.10 |
 | 2.94% | 0.59% | 0.89% | 0.02 |
 | 0.90% | 17.36% | 9.17% | 0.12 |
 | 87.87% | 63.51% | 72.89% | 1.00 |
 | 0.81% | 10.05% | 9.96% | 0.09 |

In this section, the proposed algorithm is demonstrated on a truss topology optimization with discrete variables. The truss is simply supported at both ends (see Figure ^{2}. A concentrated force of 900 N is applied at the middle of the top chord, and the material density is set as 1700 kg/m^{3}.

Schematic diagram of the truss.

The optimization variables are the truss segment number

The value of optimization variable TYPE.

TYPE value | 0 | 1 | 2 | 3 |
---|---|---|---|---|
Segment number | | | | |

Two optimal objectives are minimizing deformation (

The evolutionary parameters.

Number of each generation | Number of external population | Max generation | Cross probability | Mutation probability |
---|---|---|---|---|

60 | 60 | 1000 | 0.40 | 0.02 |

Figure

Optimization process of SRCC-SPEA and SPEA.

Sensitivity of the 29 optimization variables to all the objectives can also be obtained, as listed in Table

Sensitivities and correction coefficients of the optimization variables.

Variables | Sensitivity (objective 1) | Sensitivity (objective 2) | Correction coefficient |
---|---|---|---|
 | 13.63% | 18.95% | 1.00 |
TYPE | 5.03% | 10.13% | 0.47 |
 | 0.19% | 2.00% | 0.07 |
 | 1.28% | 5.14% | 0.20 |
 | 11.52% | 6.66% | 0.56 |
 | 3.23% | 0.95% | 0.13 |
 | 9.89% | 3.78% | 0.42 |
 | 12.58% | 0.19% | 0.39 |
 | 6.51% | 1.32% | 0.24 |
 | 3.44% | 1.06% | 0.14 |
 | 3.76% | 2.20% | 0.18 |
 | 0.30% | 4.05% | 0.13 |
 | 0.59% | 4.08% | 0.14 |
 | 2.56% | 2.03% | 0.14 |
 | 3.04% | 2.64% | 0.17 |
 | 0.08% | 1.18% | 0.04 |
 | 1.09% | 1.21% | 0.07 |
 | 3.17% | 0.18% | 0.10 |
 | 0.81% | 3.14% | 0.12 |
 | 3.09% | 2.01% | 0.16 |
 | 2.30% | 2.79% | 0.16 |
 | 2.20% | 3.88% | 0.19 |
 | 2.49% | 5.85% | 0.26 |
 | 1.08% | 1.10% | 0.07 |
 | 0.61% | 0.67% | 0.04 |
 | 1.99% | 0.63% | 0.08 |
 | 1.83% | 7.69% | 0.29 |
 | 1.29% | 0.73% | 0.06 |
 | 0.43% | 3.76% | 0.13 |

The higher the number of optimization variables, the greater the benefit of reallocating the optimization effort, and hence the more obvious the effectiveness of the integrated algorithm SRCC-SPEA. That is why SRCC-SPEA obtains an evenly distributed Pareto optimal set much more quickly than SPEA for the truss topology optimization problem with 29 variables. Moreover, conducting a separate sensitivity analysis for 29 variables with a traditional global method would require considerable computational effort, whereas the time consumption of SRCC-SPEA was almost equal to that of SPEA, so substantial computational time is saved.

In this paper, a novel integrated algorithm, SRCC-SPEA, was proposed based on improvements to the optimization method SPEA and the sensitivity analysis method SRCC. The elimination strategy “Fuzzy cluster analysis” of SPEA is improved to avoid recession of the Pareto front and to reduce the time complexity. Dichotomy replaces the average-rank strategy for SRCC rank assignment, further reducing the time complexity.

In contrast with the traditional evolutionary algorithm SPEA, the characteristics of SRCC-SPEA can be summarized as follows:

The authors declare that they have no competing interests.

The work has been supported by the National Natural Science Foundation of China (Grant no. 51678192).
