A Scaled Conjugate Gradient Method for Solving Monotone Nonlinear Equations with Convex Constraints



Introduction
In this paper, we consider the following convex constrained monotone equations:

F(x) = 0,  x ∈ Ω,  (1)

where F : R^n → R^n is a continuous and monotone function and the feasible region Ω ⊆ R^n is a nonempty closed convex set. Monotone means that ⟨F(x) − F(y), x − y⟩ ≥ 0 for all x, y ∈ R^n.

Algorithms for solving the monotone nonlinear equations F(x) = 0 are closely related to algorithms for solving optimization problems. It is known that a differentiable function f is strictly convex if and only if its gradient ∇f is strictly monotone, that is, (∇f(x) − ∇f(y))^T (x − y) > 0 for all x ≠ y, which is the strict form of the monotonicity definition above. A strictly convex function has a unique minimum point, and this minimum is the stationary point of the function, namely, the point at which the gradient ∇f(x) = 0. The monotone vector function F can therefore be viewed as the gradient of some strictly convex function: if there exists a strictly convex f satisfying ∇f(x) = F(x), then solving min f(x) is equivalent to solving F(x) = 0.
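As a quick numerical illustration of this equivalence, the following sketch takes a strictly convex function f(x) = Σ x_i^4 / 4 (a hypothetical example, not one of the paper's test problems) and checks that its gradient F(x) = x^3 satisfies the monotonicity inequality ⟨F(x) − F(y), x − y⟩ ≥ 0 at random pairs of points.

```python
import numpy as np

# Hypothetical strictly convex function f(x) = sum(x_i^4) / 4;
# its gradient F(x) = x**3 should then be a monotone mapping.
def F(x):
    return x ** 3

rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.normal(size=5), rng.normal(size=5)
    # monotonicity: <F(x) - F(y), x - y> >= 0 for all x, y
    assert np.dot(F(x) - F(y), x - y) >= 0.0
```

The check passes for every sampled pair, as the theory predicts for the gradient of a convex function.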
Nonlinear monotone equations arise in a wide variety of applications, such as subproblems in the generalized proximal algorithms with Bregman distances [1]. In power engineering, the operations of a power system are described by a system of nonlinear equations, called the power flow equations, which are subject to some operating constraints.
Unconstrained nonlinear monotone equations have received much attention [2-5]. Solodov and Svaiter [2] proposed a Newton-type method; a good property of the method is that the whole sequence of iterates converges to a solution of the system without any regularity assumptions. Under some weaker conditions, Zhou and Toh [4] showed that Solodov and Svaiter's method converges superlinearly. Zhou and Li [5, 6] extended Solodov and Svaiter's projection method to the BFGS method and the limited memory BFGS method. Zhang and Zhou [3] combined the spectral gradient method with the projection method of Solodov and Svaiter and proposed a spectral gradient projection method. Wang et al. [7] extended Solodov and Svaiter's projection method to solve monotone equations with convex constraints. Yu et al. [8] proposed a spectral gradient projection algorithm for monotone nonlinear equations with convex constraints by combining a modified spectral gradient method with the projection method. A good property of that method is that no linear system needs to be solved at any iteration.

The Method
In this section, we propose our method. First, we briefly review the SCALCG method presented by Andrei [10] for the following unconstrained optimization problem:

min f(x),  x ∈ R^n,

where f : R^n → R is a continuously differentiable function and g_k denotes its gradient at the point x_k. The method of Andrei generates a sequence {x_k} of approximations to the minimum x* of f via x_{k+1} = x_k + α_k d_k, in which the scaling parameter is θ_{k+1} = (s_k^T s_k)/(y_k^T s_k), with s_k = x_{k+1} − x_k and y_k = g_{k+1} − g_k. Based on the SCALCG method, we now introduce our method for solving (1). Inspired by (5), we define the direction d_k accordingly, where the step length involved will be defined later. The definition of d_k is similar to the one in [9].
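The spectral scaling above can be sketched in a few lines. The helper below (variable names are illustrative, not taken from the paper) computes θ_{k+1} = (s_k^T s_k)/(y_k^T s_k), with a guard for the case y_k^T s_k ≤ 0, which can occur away from convexity.

```python
import numpy as np

# Sketch of the spectral scaling parameter used in Andrei's SCALCG:
#   theta_{k+1} = (s_k^T s_k) / (y_k^T s_k),
# with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
def scaling_parameter(x_k, x_kp1, g_k, g_kp1):
    s = x_kp1 - x_k          # step difference s_k
    y = g_kp1 - g_k          # gradient difference y_k
    ys = y @ s
    if ys <= 1e-12:          # y_k^T s_k > 0 is guaranteed only for convex f;
        return 1.0           # fall back to unit scaling otherwise
    return (s @ s) / ys
```

For a quadratic f(x) = x^T x / 2 the gradient is g(x) = x, so s_k = y_k and the scaling reduces to θ_{k+1} = 1, which is a convenient sanity check.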
If θ_k ≥ 1, we obtain the estimates above, and by the definition of d_k the corresponding inequality holds. The steps of our method are stated as follows.
Algorithm 2. Consider the following steps.
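The displayed steps of Algorithm 2 are given in the original paper; what follows is only a sketch of the underlying hyperplane-projection framework of Solodov and Svaiter adapted to convex constraints, as in Wang et al. [7]. The plain direction d_k = −F(x_k) stands in for the paper's scaled conjugate gradient direction, so this is not Algorithm 2 itself, and the parameter names are illustrative.

```python
import numpy as np

# Hyperplane-projection framework (Solodov-Svaiter style) with a convex
# feasible set.  The direction d_k = -F(x_k) is a simplification; the
# paper's method uses a scaled conjugate gradient direction instead.
def solve(F, project, x0, sigma=1e-4, beta=1.0, rho=0.5, tol=1e-5, max_iter=500):
    x = project(x0)
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            return x, k
        d = -Fx                               # illustrative search direction
        t = beta
        # backtracking line search: -<F(x + t d), d> >= sigma * t * ||d||^2
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= rho
        z = x + t * d                         # trial point on the ray x + t d
        Fz = F(z)
        # step through the separating hyperplane {u : <F(z), u - z> = 0},
        # then project back onto the feasible set Omega
        xi = (Fz @ (x - z)) / (Fz @ Fz)
        x = project(x - xi * Fz)
    return x, max_iter

# Example: F(x) = x (monotone) with Omega = the nonnegative orthant,
# whose projection is componentwise clipping.
x_star, iters = solve(lambda x: x, lambda u: np.maximum(u, 0.0), np.ones(4))
```

The projection step is what guarantees feasibility of every iterate; the hyperplane separates x_k from the solution set, which drives global convergence without regularity assumptions.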

Convergence Analysis
In this section, we establish the global convergence of Algorithm 2. For this purpose, we assume that F satisfies the following assumptions.
Condition A. Consider the following.
(1) The mapping F is Lipschitz continuous; that is, there exists a constant L > 0 such that ‖F(x) − F(y)‖ ≤ L‖x − y‖ for all x, y ∈ R^n.

(2) The solution set of (1), denoted by S, is nonempty.
Proof. We need only prove that Step 2 of Algorithm 2 is well defined. Taking the limit on both sides of (14), the required inequality is preserved in the limit, so Algorithm 2 is well defined.

Proof. If the algorithm stops at some iteration k, then ‖F(x_k)‖ = 0, so that x_k is a solution of (1). From now on, we assume that F(x_k) ≠ 0 for every k. It is then easy to see from (7) that d_k ≠ 0. If the initial trial step length is not accepted, then by the line search process the preceding trial step does not satisfy (14); that is,

Numerical Experiments
In this section, we do some numerical experiments to test the performance of Algorithm 2 on the following two problems.
The algorithm was coded in MATLAB and run on a personal computer with a 2.3 GHz CPU, 2 GB of memory, and the Windows XP operating system. For each test problem, the termination condition is

‖F(x_k)‖ ≤ 10^{-5}.  (55)

We set the algorithm parameters to 1, 0.1, and 0.0001. We test both problems with the number of variables n = 100, 500, 1000, 2000, and 5000, starting from different initial points. The meaning of the columns in Tables 1 and 2 is as follows: "Dim" is the dimension of the problem, "Init" the initial point, "Iter" the number of iterations, "Time" the CPU time in seconds, and "Fn" the final norm of the equations.
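Problems 9 and 10 themselves are not reproduced here; as a stand-in, the driver below uses the common monotone test function F_i(x) = e^{x_i} − 1 on Ω = R^n_+ together with the experiments' termination rule ‖F(x_k)‖ ≤ 10^{-5}, and reports the same Dim/Init/Iter/Time/Fn columns as Tables 1 and 2. A damped projected fixed-point iteration replaces Algorithm 2 in this sketch.

```python
import time
import numpy as np

# Stand-in experiment: F_i(x) = exp(x_i) - 1 (monotone) on the nonnegative
# orthant, solved by a damped projected fixed-point iteration (not the
# paper's Algorithm 2), stopping when ||F(x_k)|| <= 1e-5.
def run_experiment(n, x0_val):
    F = lambda x: np.exp(x) - 1.0
    x = np.full(n, x0_val)
    start, iters = time.perf_counter(), 0
    while np.linalg.norm(F(x)) > 1e-5 and iters < 10_000:
        x = np.maximum(x - 0.5 * F(x), 0.0)   # project onto R^n_+
        iters += 1
    return iters, time.perf_counter() - start, np.linalg.norm(F(x))

# Columns mirror Tables 1 and 2: Dim, Init, Iter, Time, Fn.
for n in (100, 1000, 5000):
    it, sec, fn = run_experiment(n, 1.0)
    print(f"Dim={n:5d}  Init=1.0  Iter={it}  Time={sec:.4f}  Fn={fn:.2e}")
```

Iteration counts grow only mildly with the dimension for this separable problem, which mirrors the qualitative behavior reported for large-scale tests in the tables.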
Tables 1 and 2 show that our method is efficient and suitable for solving large-scale monotone equations with convex constraints.

Conclusions
In this paper, we have proposed a SCALCG-type method for solving nonlinear monotone equations with convex constraints. Under some mild conditions, we proved its global convergence.
Preliminary numerical experiments have illustrated that the proposed method works well for Problems 9 and 10.

Table 1 :
Test results for Problem 9 with given initial points.

Table 2 :
Test results for Problem 10 with given initial points.