Global Convergence of an LS-CD Hybrid Conjugate Gradient Method

The conjugate gradient method is one of the most effective algorithms for solving unconstrained optimization problems. In this paper, a modified conjugate gradient method, which is a hybridization of the known LS and CD conjugate gradient algorithms, is presented and analyzed. Under some mild conditions, a Wolfe-type line search guarantees the global convergence of the LS-CD method. The numerical results show that the algorithm is efficient.


Introduction
Consider the following nonlinear programming problem:
$$\min_{x \in \mathbb{R}^n} f(x),$$
where $\mathbb{R}^n$ denotes the $n$-dimensional Euclidean space and $f : \mathbb{R}^n \to \mathbb{R}$ is a continuously differentiable function.
As is well known, the conjugate gradient method is a line search method of the form
$$x_{k+1} = x_k + \alpha_k d_k,$$
where $d_k$ is a descent direction of $f(x)$ at $x_k$ and $\alpha_k$ is a stepsize obtained by some one-dimensional line search. If $x_k$ is the current iterate, we denote $f(x_k) \triangleq f_k$, $\nabla f(x_k) \triangleq g_k$, and $\nabla^2 f(x_k) \triangleq H_k$, respectively. If $H_k$ is available and invertible, then $d_k = -H_k^{-1} g_k$ leads to the Newton method and $d_k = -g_k$ results in the steepest descent method [1]. The search direction $d_k$ is generally required to satisfy $g_k^T d_k < 0$, which guarantees that $d_k$ is a descent direction of $f(x)$ at $x_k$ [2]. In order to guarantee global convergence, we sometimes require $d_k$ to satisfy a sufficient descent condition
$$g_k^T d_k \le -c \|g_k\|^2,$$
where $c > 0$ is a constant and $\|\cdot\|$ is the Euclidean norm. In line search methods, the well-known conjugate gradient method takes the form
$$d_k = \begin{cases} -g_k, & k = 1, \\ -g_k + \beta_k d_{k-1}, & k \ge 2. \end{cases}$$
Different conjugate gradient algorithms correspond to different choices of the parameter $\beta_k$, where $\beta_k$ can be defined by
$$\beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \qquad \beta_k^{PRP} = \frac{g_k^T (g_k - g_{k-1})}{\|g_{k-1}\|^2}, \qquad \beta_k^{DY} = \frac{\|g_k\|^2}{d_{k-1}^T (g_k - g_{k-1})},$$
$$\beta_k^{CD} = -\frac{\|g_k\|^2}{d_{k-1}^T g_{k-1}}, \qquad \beta_k^{LS} = -\frac{g_k^T (g_k - g_{k-1})}{d_{k-1}^T g_{k-1}}, \qquad \beta_k^{HS} = \frac{g_k^T (g_k - g_{k-1})}{d_{k-1}^T (g_k - g_{k-1})},$$
or by other formulae. The corresponding methods are called the FR (Fletcher-Reeves) [3], PRP (Polak-Ribière-Polyak) [4, 5], DY (Dai-Yuan) [6], CD (Conjugate Descent) [7], LS (Liu-Storey) [8], and HS (Hestenes-Stiefel) [9] conjugate gradient methods, respectively. Although the above-mentioned conjugate gradient algorithms are equivalent to each other for minimizing strongly convex quadratic functions under exact line search, they perform differently when minimizing nonquadratic functions or when inexact line searches are used. For a general objective function, the FR, DY, and CD methods have strong convergence properties, but they may have modest practical performance due to jamming. On the other hand, the PRP, LS, and HS methods may not be convergent in general, but they often have better computational performance.
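As a minimal sketch of these update parameters (the function name and signature below are ours, for illustration only, not from the paper):

```python
import numpy as np

def cg_betas(g, g_prev, d_prev):
    """Classic conjugate gradient parameters beta_k.

    g, g_prev : gradients g_k and g_{k-1} (1-D arrays).
    d_prev    : previous search direction d_{k-1}; d_prev.dot(g_prev) < 0
                is assumed (d_{k-1} is a descent direction).
    """
    y = g - g_prev                          # gradient difference g_k - g_{k-1}
    return {
        "FR":  g.dot(g) / g_prev.dot(g_prev),
        "PRP": g.dot(y) / g_prev.dot(g_prev),
        "DY":  g.dot(g) / d_prev.dot(y),
        "CD":  -g.dot(g) / d_prev.dot(g_prev),
        "LS":  -g.dot(y) / d_prev.dot(g_prev),
        "HS":  g.dot(y) / d_prev.dot(y),
    }
```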
Touati-Ahmed and Storey [10] gave the first hybrid conjugate gradient algorithm; the method combines different conjugate gradient algorithms and was proposed mainly to avoid the jamming phenomenon. Recently, several new hybrid conjugate gradient methods have been given in [11-17]. Following this line of work, we focus on a hybrid LS-CD conjugate gradient method and analyze its global convergence under a Wolfe-type line search.
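The hybrid parameter is built from $\beta_k^{LS}$ and $\beta_k^{CD}$; a natural truncation rule, consistent with the nonnegativity $\beta_k^{LS\text{-}CD} \ge 0$ used in the convergence proof of Section 3 (stated here as our reading, not as the paper's verbatim definition), is
$$\beta_k^{LS\text{-}CD} = \max\left\{0, \min\left\{\beta_k^{LS}, \beta_k^{CD}\right\}\right\}.$$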
The rest of this paper is organized as follows. The algorithm is presented in Section 2. In Section 3 the global convergence is analyzed. We give the numerical experiments in Section 4.

Description of Algorithm
Algorithm 1 (hybrid LS-CD conjugate gradient method).
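The following is a minimal runnable sketch of a hybrid LS-CD iteration under Wolfe conditions. The hybrid rule $\max\{0, \min\{\beta_k^{LS}, \beta_k^{CD}\}\}$, the mapping of the line search parameters to SciPy's `c1`/`c2`, and all names are our assumptions for illustration, not the paper's verbatim algorithm:

```python
import numpy as np
from scipy.optimize import line_search  # standard Wolfe line search

def ls_cd(f, grad, x0, tol=1e-6, it_max=5000, delta=0.3, sigma=0.7):
    """Hybrid LS-CD conjugate gradient sketch (illustrative, not verbatim)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                     # initial direction: steepest descent
    for k in range(it_max):
        if np.linalg.norm(g) <= tol:           # stopping test ||g_k|| <= tol
            return x, k
        alpha = line_search(f, grad, x, d, gfk=g, c1=delta, c2=sigma)[0]
        if alpha is None:                      # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=delta, c2=sigma)[0] or 1e-8
        x_new = x + alpha * d
        g_new = grad(x_new)
        denom = d.dot(g)                       # d_{k-1}^T g_{k-1} < 0 for a descent d
        beta_ls = -g_new.dot(g_new - g) / denom
        beta_cd = -g_new.dot(g_new) / denom
        beta = max(0.0, min(beta_ls, beta_cd))  # assumed LS-CD hybrid rule
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x, it_max
```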
Throughout this paper, we make the following basic assumptions on the objective function, which have been widely used in the literature to analyze the global convergence of conjugate gradient methods.

(H2.1) The level set $\Omega_0 = \{x \in \mathbb{R}^n : f(x) \le f(x_0)\}$ is bounded, where $x_0$ is the starting point.

(H2.2) In some neighborhood $\Omega$ of $\Omega_0$, the gradient $g(x) = \nabla f(x)$ is Lipschitz continuous; that is, there exists a constant $L > 0$ such that $\|g(x) - g(y)\| \le L \|x - y\|$ for all $x, y \in \Omega$.
Since $\{f(x_k)\}$ is decreasing, it is clear that the sequence $\{x_k\}$ generated by Algorithm 1 is contained in $\Omega_0$.

Global Convergence of Algorithm
Now we analyze the global convergence of Algorithm 1.
Proof. From (7), we have $(g_k - g_{k-1})^T d_{k-1} \ge (\sigma - 1) g_{k-1}^T d_{k-1}$. In addition, the assumption (H2.2) gives $(g_k - g_{k-1})^T d_{k-1} \le L \alpha_{k-1} \|d_{k-1}\|^2$. Combining these two relations, we have
$$\alpha_{k-1} \ge \frac{(\sigma - 1)\, g_{k-1}^T d_{k-1}}{L \|d_{k-1}\|^2},$$
and together with the first inequality in (7) and the boundedness of $f$ from below on $\Omega_0$, the Zoutendijk condition (15),
$$\sum_{k \ge 1} \frac{(g_k^T d_k)^2}{\|d_k\|^2} < \infty,$$
follows.

Proof. The first statement is easy to show, since the only stopping point is in Step 3. Thus, assume that the algorithm generates an infinite sequence $\{x_k\}$. If the statement is false, there exists a constant $\varepsilon > 0$ such that
$$\|g_k\| \ge \varepsilon \quad \text{for all } k.$$
From (8), we have $d_k + g_k = \beta_k^{LS\text{-}CD} d_{k-1}$. Squaring both sides of this equation, we get
$$\|d_k\|^2 + 2 g_k^T d_k + \|g_k\|^2 = \left(\beta_k^{LS\text{-}CD}\right)^2 \|d_{k-1}\|^2;$$
that is,
$$\|d_k\|^2 = \left(\beta_k^{LS\text{-}CD}\right)^2 \|d_{k-1}\|^2 - 2 g_k^T d_k - \|g_k\|^2.$$
From the definitions of $\beta_k^{LS}$ and $\beta_k^{CD}$, we obtain an upper bound on $\beta_k^{LS\text{-}CD}$. Thus, we can get a bound on $\|d_k\|^2$ that grows at most linearly in $k$. On the other hand, multiplying (8) by $g_k^T$, we obtain
$$g_k^T d_k = -\|g_k\|^2 + \beta_k^{LS\text{-}CD} g_k^T d_{k-1}.$$
Considering that $\beta_k^{LS\text{-}CD} \ge 0$ and $g_k^T d_{k-1} \le 0$, we have $g_k^T d_k \le -\|g_k\|^2$, which, together with $\|g_k\| \ge \varepsilon$, indicates that
$$\sum_{k \ge 1} \frac{(g_k^T d_k)^2}{\|d_k\|^2} \ge \sum_{k \ge 1} \frac{\|g_k\|^4}{\|d_k\|^2} = \infty.$$
This contradicts the Zoutendijk condition (15). Therefore, the conclusion holds.
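A quick numerical check of the descent inequality $g_k^T d_k \le -\|g_k\|^2$ under the two hypotheses used above (random data; the hybrid rule is the same assumption as before):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(10_000):
    g, g_prev, d_prev = (rng.normal(size=8) for _ in range(3))
    if g.dot(d_prev) > 0:              # enforce the hypothesis g_k^T d_{k-1} <= 0
        d_prev = -d_prev
    if d_prev.dot(g_prev) >= 0:        # enforce d_{k-1}^T g_{k-1} < 0 (descent)
        g_prev = -g_prev
    denom = d_prev.dot(g_prev)         # negative by construction
    beta_ls = -g.dot(g - g_prev) / denom
    beta_cd = -g.dot(g) / denom
    beta = max(0.0, min(beta_ls, beta_cd))   # assumed LS-CD hybrid rule
    d = -g + beta * d_prev
    assert g.dot(d) <= -g.dot(g) + 1e-10     # sufficient descent holds
print("descent inequality verified on all samples")
```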

Numerical Experiments
In this section, we give the numerical results of Algorithm 1 to show that the method is efficient for unconstrained optimization problems. We set the parameters $\delta = 0.3$ and $\sigma = 0.7$ and use MATLAB 7.0 to test the chosen problems on a PC with a 2.10 GHz CPU, 1.0 GB of RAM, and the Linux operating system. We use the condition $\|g_k\| \le 10^{-6}$ or It-max > 5000 as the stopping criterion (It-max denotes the maximal number of iterations). When the limit of 5000 function evaluations was exceeded, the run was stopped, which is indicated by "NaN." The problems that we tested are from [17, 19].
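For illustration, the sketch from Section 2 can be driven with the same stopping rule; the two-dimensional Rosenbrock function below is a standard test problem of our choosing, not necessarily one of the problems from [17, 19]:

```python
import numpy as np

def rosen(x):
    """Two-dimensional Rosenbrock function."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosen_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

# ls_cd is the sketch from Section 2; stop at ||g_k|| <= 1e-6 or 5000 iterations
x_final, n_iter = ls_cd(rosen, rosen_grad, x0=[-1.2, 1.0], tol=1e-6, it_max=5000)
print(x_final, rosen(x_final), n_iter)   # x_final should be close to (1, 1)
```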
Tables 1, 2, and 3 show the computational results. Because conjugate gradient algorithms are devised for solving large-scale unconstrained optimization problems, we also chose some large-scale problems from [18] and compared the performance of the hybrid LS-CD method (Algorithm 1 in Section 2) with the LS method and the CD method.
From Tables 1, 2, 3, and 4, we see that the performance of Algorithm 1 is better than that of the CD and LS methods for some problems. Therefore, our numerical experiments show that the algorithm is efficient.

Table 1 :
Test results for the CD algorithm. $x$: the final point; $f^*$: the final value of the objective function; NI: the number of iterations for each problem.

Table 2 :
Test results for the LS algorithm.

Table 3 :
Test results for the LS-CD algorithm. $x$: the final point; $f^*$: the final value of the objective function; NI: the number of iterations for each problem.

Table 4 :
The performance of the LS method, CD method, and LS-CD method.
[18]: the test problem name from [18]; Dim: the problem dimension; NI: the number of iterations; NF: the number of function evaluations; NG: the number of gradient evaluations.