SENSITIVITY ANALYSIS FOR PARAMETRIC VECTOR OPTIMIZATION PROBLEMS USING DIFFERENTIAL EQUATIONS APPROACH

A new method for obtaining sensitivity information for parametric vector optimization problems (VOP)_v is presented, where the parameters appear in the objective functions and anywhere in the constraints. The method relies on a differential equations technique for solving multiobjective nonlinear programming problems, which is very effective in finding many local Pareto optimal solutions. The behavior of the local solutions under slight perturbation of the parameters in the neighborhood of their chosen initial values is studied by using the technique of trajectory continuation. Finally, some examples are given to show the efficiency of the proposed method.

2000 Mathematics Subject Classification. Primary 90C29, 90C30, 90C31.


1. Introduction.
A differential equations approach was presented as a new method for solving equality constrained nonlinear programming problems in [7]. This technique was extended in [1] to solve multiobjective nonlinear programming problems, convex or nonconvex, with equality or inequality constraints. The multiobjective nonlinear programming problem is again transformed into a nonlinear autonomous system of differential equations. In fact, the asymptotically stable critical points of the differential system are constrained local Pareto optimal solutions of the original optimization problem. Recently, other results on equality or inequality constrained nonlinear programming problems with fuzzy parameters were obtained by this approach in [2].
In this paper, sensitivity information is obtained by using the autonomous system of differential equations corresponding to the vector optimization problem (VOP) (see [1]). This information coincides with the explicit representation of the first-order partial derivatives of the local solution point and the associated Lagrange multipliers of the parametric problem [4]. The problem under consideration is a parametric vector optimization problem in which the parameters appear in the objective functions and anywhere in the constraints. The fundamental equation corresponding to the problem is described in Section 2. By using the technique of trajectory continuation [5, 6, 8], the behavior of the local solution under slight perturbation of the parameters in the neighborhood of their chosen initial values is discussed in Section 3. Finally, two illustrative examples are given in Section 4.

2. Problem formulation.
A mathematical programming problem with general perturbation in the objective functions and anywhere in the constraints has the form

(VOP)_v:   min F(x, v) = (f_1(x, v), ..., f_m(x, v))   subject to   G(x, v) ≤ 0,

where x ∈ R^n, R^n is an n-dimensional Euclidean space, and F and G possess continuous first and second derivatives.
The corresponding problem with a scalar objective and equality constraints [3] can be written in the form

P_k(ε):   min f_k(x, v)   subject to   f_j(x, v) ≤ ε_j, j = 1, ..., m, j ≠ k,   G(x, v) ≤ 0,

where the inequality constraints are converted into equalities by introducing slack variables. Assume that the matrices A_1 = ∇_x F(x̄) and A_2 = ∇_x G(x̄) are of full rank.
From [3], it is well known that the optimal solution x* of P_k(ε) is an efficient solution of the vector optimization problem if one of the following conditions holds: (i) x* solves P_k(ε) for every k = 1, 2, ..., m; (ii) x* is the unique optimal solution of P_k(ε). Recently, in [1], problem (2.4) was solved by the differential equations approach for fixed v = 0, and the fundamental equation was (2.6), where B is a symmetric nonsingular matrix of order (n + m − 1). From the above autonomous system (2.6), we obtain (2.7), where P = I − P_1 is a nonsingular matrix.
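As a concrete illustration of the ε-constraint scalarization P_k(ε), the following sketch solves a small assumed bi-objective problem by brute-force search over a grid; the objectives f1, f2 and the bound ε = 1 are illustrative choices, not taken from the paper.

```python
import numpy as np

# Assumed bi-objective toy problem (illustrative only):
# minimize f1(x) = x^2 and f2(x) = (x - 2)^2 over x in [0, 2].
def f1(x): return x**2
def f2(x): return (x - 2.0)**2

def solve_Pk_eps(eps, k=1, grid=np.linspace(0.0, 2.0, 2001)):
    """Solve the epsilon-constraint problem P_k(eps):
    minimize f_k subject to the other objective bounded by eps."""
    if k == 1:
        feasible = grid[f2(grid) <= eps]   # constraint f2(x) <= eps
        return feasible[np.argmin(f1(feasible))]
    feasible = grid[f1(grid) <= eps]       # constraint f1(x) <= eps
    return feasible[np.argmin(f2(feasible))]

# With eps = 1, the constraint f2(x) <= 1 forces x >= 1,
# so the minimizer of f1 is x* = 1 -- an efficient point.
x_star = solve_Pk_eps(eps=1.0, k=1)
```

Sweeping ε over a range of values and resolving P_k(ε) traces out (locally) efficient points, which is how conditions (i)-(ii) above are exploited in practice.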
In [1], it was proved that the matrix P(x̄) is a projection operator which projects any vector in R^{n+m−1} onto M(x̄), where M(x̄) is the tangent space of the system of constraints at x̄. It was also proved that if (a) x* is a regular point of the constraints, (b) the second-order sufficiency conditions are satisfied at x*, and (c) there are no degenerate constraints at x*, then any trajectory starting from a point within some neighborhood of the local minimal point x* converges to x* as t → ∞.
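The convergence behavior described above can be reproduced numerically. The sketch below is not the fundamental system (2.6) of [1]; it integrates a minimal projected-gradient flow ẋ = −P(x)∇f(x) for an assumed equality-constrained toy problem, with P = I − Aᵀ(AAᵀ)⁻¹A the projection onto the tangent space of the constraint.

```python
import numpy as np

# Assumed toy problem (illustrative only):
# minimize f(x) = x1^2 + x2^2 subject to h(x) = x1 + x2 - 1 = 0.
A = np.array([[1.0, 1.0]])                         # A = grad h (constant here)
P = np.eye(2) - A.T @ np.linalg.solve(A @ A.T, A)  # projection onto tangent space M

grad_f = lambda x: 2.0 * x

# Integrate the autonomous system x' = -P grad f(x) by explicit Euler,
# starting from a feasible point; the trajectory stays on the constraint
# and converges to the constrained minimizer x* = (0.5, 0.5).
x = np.array([1.0, 0.0])
dt = 0.01
for _ in range(5000):
    x = x - dt * (P @ grad_f(x))
```

Since the constraint is linear here, P is constant and the Euler trajectory remains exactly feasible; for nonlinear constraints a restoration term is usually added to the flow to pull drifting iterates back onto the constraint manifold.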
3. Sensitivity information.

By using the technique of trajectory continuation [5, 6, 8], we will discuss the behavior of the local solution x* under slight perturbation of the parameters in the neighborhood of their chosen initial values.
The following existence theorem, which is based on the implicit function theorem [4], holds.

In particular, x(v) is a unique local solution of the problem (VOP)_v for any v ∈ N(v), under the assumptions (a)-(c).
For any v ∈ N(v), the fundamental equations corresponding to (VOP)_v take an analogous form, and consequently, from equation (2.7a) near x*, one can write (3.4). Differentiating (2.7b) with respect to x, we get (3.5). From (3.4) and (3.5), we obtain a system whose solution near x* behaves as x(t) ≈ x* + (x(0) − x*)e^{−t}, so that x* is asymptotically stable.
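The asymptotic behavior x(t) ≈ x* + (x(0) − x*)e^{−t} can be checked on the scalar linearization ẋ = −(x − x*); the numerical values below are assumed for illustration.

```python
import math

# Linearized dynamics near the local solution: x' = -(x - x_star).
# Closed-form solution: x(t) = x_star + (x(0) - x_star) * exp(-t).
x_star, x0 = 2.0, 5.0
dt, steps = 1e-4, 30000          # integrate up to t = 3
x = x0
for _ in range(steps):
    x = x - dt * (x - x_star)    # explicit Euler step

t = dt * steps
closed_form = x_star + (x0 - x_star) * math.exp(-t)
# The Euler approximation tracks the closed form, and both
# approach x_star as t grows -- asymptotic stability.
```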
To obtain sensitivity information for the first-order estimation of solutions of a parametric optimization problem, we introduce the following system of differential equations. One can then easily obtain (3.10) and (3.11), where P, B, P̄, and D are as in (3.4).
After solving (VOP)_v, we may wish to answer the following question: how does the local solution behave when the problem is perturbed for v near zero? Under the assumptions of Theorem 3.1 and Proposition 3.2, a first-order approximation of the solution is given by the sensitivity information (3.10) and (3.11), which minimizes the computational effort needed for finding many efficient solutions of the parametric problem (VOP)_v, v ∈ N(v).
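Independently of the specific equations (3.10) and (3.11), first-order sensitivity information of this kind can be computed by differentiating the Karush-Kuhn-Tucker system with respect to the parameter. The problem, its data, and the closed-form check below are all assumed for illustration, not taken from the paper.

```python
import numpy as np

# Assumed parametric problem (illustrative only):
# minimize f(x, v) = (x1 - v)^2 + x2^2  subject to  h(x) = x1 + x2 - 1 = 0.
# At v = 0 the local solution is x* = (0.5, 0.5) with multiplier lam = -1.
# Differentiating the KKT conditions w.r.t. v gives a linear system for
# (dx*/dv, dlam/dv):
#   [ H   A^T ] [ dx/dv   ]   [ -d/dv grad_x L ]
#   [ A   0   ] [ dlam/dv ] = [ -d/dv h        ]
H = 2.0 * np.eye(2)              # Hessian of the Lagrangian w.r.t. x
A = np.array([[1.0, 1.0]])       # Jacobian of the constraint
rhs = np.array([2.0, 0.0, 0.0])  # -d/dv grad_x L = (2, 0); -d/dv h = 0

KKT = np.block([[H, A.T], [A, np.zeros((1, 1))]])
dx1, dx2, dlam = np.linalg.solve(KKT, rhs)
# Closed form: x*(v) = ((1 + v)/2, (1 - v)/2), so dx*/dv = (0.5, -0.5).
```

With dx*/dv in hand, solutions of the perturbed problem for v near zero are estimated to first order as x*(v) ≈ x*(0) + v · dx*/dv, avoiding a full re-solve for each nearby parameter value.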

4. Illustrative examples.
In this section we provide numerical examples to clarify the theory developed in the paper.
To illustrate the application of the general equations (3.10) and (3.11) for obtaining sensitivity information for the first-order estimation of the solution of the parametric optimization problem, we proceed as follows.