We present a global error bound for the projected gradient of nonconvex constrained optimization problems, and a local error bound for the distance from a feasible solution to the optimal solution set of convex constrained optimization problems, using the merit function of the sequential quadratic programming (SQP) method. For the solution sets (stationary point set and

This paper is concerned with the following constrained optimization problem:

It is well known that SQP is an important method for solving the problem

Suppose that

For the matrix

Due to the convexity of

Consider the following function:

It is well known that the SQP method is widely and effectively applied to solving optimization problems (see [
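As background, one local SQP step for the equality-constrained case solves the KKT system of a quadratic subproblem. The sketch below is a minimal illustration under assumed data; the problem, matrices, and iteration count are not from this paper.

```python
import numpy as np

def sqp_step(x, lam, grad_f, hess_L, h, jac_h):
    """One local SQP step for min f(x) s.t. h(x) = 0:
    solve the KKT system of the QP subproblem
        min_d  grad_f(x)^T d + 0.5 d^T B d   s.t.  h(x) + A d = 0,
    where B approximates the Hessian of the Lagrangian and A = jac_h(x).
    """
    B, A, g, c = hess_L(x, lam), jac_h(x), grad_f(x), h(x)
    n, m = len(g), len(c)
    K = np.block([[B, A.T], [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, -np.concatenate([g, c]))
    return x + sol[:n], sol[n:]   # new iterate and new multiplier estimate

# Illustrative problem (not from the paper): min x1^2 + x2^2  s.t.  x1 + x2 - 1 = 0
grad_f = lambda x: 2 * x
hess_L = lambda x, lam: 2 * np.eye(2)   # Hessian of the Lagrangian (constraint is linear)
h = lambda x: np.array([x[0] + x[1] - 1.0])
jac_h = lambda x: np.array([[1.0, 1.0]])

x, lam = np.array([2.0, -1.0]), np.zeros(1)
for _ in range(5):
    x, lam = sqp_step(x, lam, grad_f, hess_L, h, jac_h)
print(x)   # -> close to the solution (0.5, 0.5)
```

Because the objective is quadratic and the constraint linear, a single SQP step already lands on the exact solution; subsequent steps leave it unchanged.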

The theory of error bounds has attracted considerable attention, and many important results have been obtained. In particular, [

It should be noted that the merit function

Another main contribution of this paper is the finite termination of a feasible solution sequence; that is, the feasible solution sequence converges finitely to the solution sets (stationary point set and optimal solution set). Similar finite-termination results have been studied for variational inequality problems. Recently, [

In this paper, inspired by [

(1) the necessary and sufficient condition of a feasible solution sequence to converge finitely to the solution set for the problem

When the feasible solution set

(2) for the problem

For generalized weak sharp minima, we

(3) suppose the stationary point set of the nonconvex optimization problem

The rest of this paper is organized as follows. In Section

Let

Given a nonempty subset

The

A mapping

For the solution set

We say that a sequence

This subsection mainly deals with the basic properties of the merit function

Given

For any

Left-multiplying both sides of (

For any

With these lemmas in place, we obtain the following result.

The following conclusions are equivalent:

From conclusions (1) and (2) of Theorem

Suppose that the

For any

Clearly, the optimal solutions of problem

The following result provides an estimation for

For any

Since

In other words, we obtain

On the other hand, Lemma

The following result can be obtained by Theorems

The following conclusions are equivalent:

A global error bound for

For any

Since
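To make the flavor of such a bound concrete, the following sketch checks numerically that, for a strongly convex quadratic over a box, the distance to the optimal set is controlled by the norm of the projected gradient x - P_C(x - grad f(x)). The box, objective, and constant are illustrative assumptions, not the paper's data.

```python
import numpy as np

def proj(x):                                  # projection onto the box [0,1]^2
    return np.clip(x, 0.0, 1.0)

grad = lambda x: x - np.array([2.0, -1.0])    # f(x) = 0.5*||x - (2,-1)||^2
x_star = proj(np.array([2.0, -1.0]))          # unique minimizer over the box: (1, 0)

rng = np.random.default_rng(0)
ratios = []
for _ in range(200):
    x = rng.uniform(0.0, 1.0, size=2)
    pg = x - proj(x - grad(x))                # projected gradient at x
    d = np.linalg.norm(x - x_star)            # distance to the optimal set
    if d > 1e-9:
        ratios.append(d / np.linalg.norm(pg))

# For this problem the error bound dist(x, X*) <= tau * ||pg(x)|| holds with tau = 1.
print(max(ratios))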

Now, we consider the case where the problem

Suppose that

Given any

Suppose that

For the positive

In this section, we will study the necessary and sufficient conditions for the feasible solution sequence of nonconvex optimization problems

First, we introduce the concept of a nondegenerate set.

Let

Now, we further extend the definition of nondegeneracy. Let

Let

It is easy to verify that the following propositions describe special cases of the generalized nondegenerate set.

Let

Let

From [

Let

Here, we give the necessary and sufficient condition for the feasible solution sequence of the nonconvex optimization problems

For the nonconvex optimization problem

Consider the following.

For the nonconvex optimization problems

Necessity is obvious; we only need to prove sufficiency. Suppose (

For the nonconvex optimization problem

By Proposition

For the nonconvex optimization problem

Here, we have

In the following, we use the global error bounds for the projected gradient obtained in the previous section to characterize, via the merit function, the necessary and sufficient condition for a feasible solution sequence to terminate finitely
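Finite termination can be observed concretely for a linear objective over a box, a classical setting where weak sharp minima hold and the projected gradient method reaches a solution in finitely many steps. The data below are illustrative assumptions, not from this paper.

```python
import numpy as np

def proj(x):                         # projection onto the feasible box [0,1]^2
    return np.clip(x, 0.0, 1.0)

c = np.array([1.0, -1.0])            # f(x) = c^T x has weak sharp minima over the box
x = np.array([0.7, 0.3])
t = 0.5                              # constant step size

# Projected gradient iterations: x_{k+1} = P_C(x_k - t * c)
for k in range(10):
    x_new = proj(x - t * c)
    if np.array_equal(x_new, x):     # iterate repeats: finite termination at a solution
        break
    x = x_new
print(k, x)                          # reaches the minimizer (0, 1) after finitely many steps
```

Here the iterates hit the optimal vertex (0, 1) after two steps and then stay there, in contrast to the asymptotic-only convergence typical of smooth problems without sharpness.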

For the nonconvex optimization problem

Consider the following.

For the nonconvex optimization problem

Without loss of generality, we assume that

We now show that the sequence

According to the assumption,

For the nonconvex optimization problem

According to Lemma

For the nonconvex optimization problem

According to Proposition

In [

Generally speaking, for a nonconvex optimization problem, (

Let

Now, we further extend the definition of weak sharp minima as follows.

Let

As with the generalized nondegenerate set, it is easy to verify that the following propositions describe special cases of generalized weak sharp minima.

Let

Let

Let the optimal solution set
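The classical (non-generalized) notion of weak sharp minima requires f(x) - f* >= alpha * dist(x, S) for all feasible x. The sketch below checks this inequality on a random sample; the functions, sample, and constants are illustrative assumptions, not the paper's data.

```python
import random

def is_weak_sharp(f, f_star, dist_to_S, points, alpha):
    """Check f(x) - f* >= alpha * dist(x, S) on a finite sample of feasible points."""
    return all(f(x) - f_star >= alpha * dist_to_S(x) - 1e-12 for x in points)

random.seed(0)
pts = [random.uniform(-1.0, 1.0) for _ in range(1000)]
dist = abs   # distance to the solution set S = {0}

print(is_weak_sharp(abs, 0.0, dist, pts, alpha=1.0))          # True: |x| is weakly sharp with alpha = 1
print(is_weak_sharp(lambda x: x * x, 0.0, dist, pts, 0.5))    # False: x^2 grows too slowly near 0
```

The contrast shows why sharpness matters: |x| grows at least linearly away from its minimizer, while x^2 is flat near 0 and violates the inequality for any positive alpha.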

Here, we give the necessary and sufficient condition for the feasible solution sequence of a nonconvex optimization problem

For the nonconvex optimization problem

Consider the following.

As with generalized nondegeneracy, according to Propositions

For the nonconvex optimization problem

For the nonconvex optimization problem

For the nonconvex optimization problem

The authors declare that there is no conflict of interests regarding the publication of this paper.

This research was supported by the National Natural Science Foundation of China (nos. 11271233 and 10971118) and the Natural Science Foundation of Shandong Province (no. ZR2012AM016).