Recently, Zhang (2006) proposed a three-term modified HS (TTHS) method for unconstrained optimization problems. An attractive property of the TTHS method is that the direction it generates is always a descent direction, independently of the line search used. To obtain global convergence of the TTHS method, Zhang proposed a truncated TTHS method; a drawback is that the numerical performance of the truncated TTHS method is not ideal. In this paper, we prove that the TTHS method with the standard Armijo line search is globally convergent for uniformly convex problems. Moreover, we propose a new truncated TTHS method. Under suitable conditions, global convergence is obtained for the proposed method. Extensive numerical experiments show that the proposed method is very efficient on test problems from the CUTE library.

Consider the unconstrained optimization problem:
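Written out, the problem has the standard form (restated here with the usual notation, where $g(x)=\nabla f(x)$ as elsewhere in the paper):

\[
\min_{x\in\mathbb{R}^n} f(x),
\]

where $f:\mathbb{R}^n\to\mathbb{R}$ is continuously differentiable and its gradient $g(x)=\nabla f(x)$ is available.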

We refer to a book [

The paper is organized as follows. In Section

Recently, Zhang [
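For reference, the TTHS direction is commonly written in the following three-term form (this is our recollection of the formula, with $y_{k-1}=g_k-g_{k-1}$; the reader should check it against the cited paper):

\[
d_k = -g_k + \beta_k^{HS} d_{k-1} - \theta_k y_{k-1},
\qquad
\beta_k^{HS} = \frac{g_k^T y_{k-1}}{d_{k-1}^T y_{k-1}},
\qquad
\theta_k = \frac{g_k^T d_{k-1}}{d_{k-1}^T y_{k-1}}.
\]

Taking the inner product with $g_k$, the last two terms cancel, so $g_k^T d_k = -\|g_k\|^2 < 0$ whenever $g_k \neq 0$, which is exactly the line-search-independent descent property mentioned above.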

Uniformly convex functions: converge globally with the standard Armijo line search (

General functions: converge globally with the strong Wolfe line search (
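The standard Armijo line search used in the uniformly convex case can be sketched as follows; the parameter names (`alpha0`, `rho`, `sigma`) and their defaults are illustrative, not the paper's constants:

```python
def armijo_line_search(f, x, g, d, alpha0=1.0, rho=0.5, sigma=1e-4, max_backtracks=50):
    """Backtracking Armijo line search: shrink alpha until the
    sufficient-decrease condition f(x + alpha*d) <= f(x) + sigma*alpha*g^T d holds."""
    fx = f(x)
    gtd = sum(gi * di for gi, di in zip(g, d))  # directional derivative g^T d
    assert gtd < 0, "d must be a descent direction"
    alpha = alpha0
    for _ in range(max_backtracks):
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        if f(x_new) <= fx + sigma * alpha * gtd:
            return alpha
        alpha *= rho
    return alpha
```

A smaller `sigma` accepts steps more readily, while `rho` controls how aggressively the trial step is shrunk when the sufficient-decrease test fails.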

In order to establish the global convergence of our method, we need the following assumption.

(i) The level set

(ii) In some neighborhood

Under Assumption

Suppose that Assumption

If

The following theorem establishes the global convergence of the TTHS method with the standard Armijo line search (

Suppose that Assumption

We proceed by contradiction. If (

By (

We are going to investigate the global convergence of the TTHS method with the strong Wolfe line search (
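For completeness, the strong Wolfe line search accepts a step size $\alpha_k$ satisfying the pair of conditions (with constants $0<\delta<\sigma<1$; the symbols here are generic and may be named differently in the paper):

\[
f(x_k+\alpha_k d_k)\le f(x_k)+\delta\,\alpha_k\, g_k^T d_k,
\qquad
\bigl|g(x_k+\alpha_k d_k)^T d_k\bigr|\le \sigma\,\bigl|g_k^T d_k\bigr|.
\]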

We now state a lemma for the search directions which shows that, asymptotically, they change slowly. The lemma is similar to [

Suppose that Assumption

Noting that

Now, we evaluate the quantity

By the strong Wolfe condition (

The next theorem establishes the global convergence of method (

Suppose that Assumption

We assume that the conclusion (

A bound for

A bound on the steps

A bound on the direction

In this section, we report some numerical results. We tested 111 problems that are from the CUTE [

In the numerical experiments, we used the latest version (Fortran 77 source code, Version 1.4, November 14, 2005) with default parameters. We implemented the method (

We adopt the performance profiles by Dolan and Moré [
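The Dolan and Moré performance profile can be computed as in the following sketch (a plain-Python illustration with variable names of our choosing): for each solver $s$ and factor $\tau$, $\rho_s(\tau)$ is the fraction of problems on which solver $s$'s cost is within a factor $\tau$ of the best cost over all solvers.

```python
def performance_profile(T, taus):
    """Dolan-More performance profile.
    T[p][s]: cost (e.g. CPU time) of solver s on problem p; use float('inf') for failures.
    Returns, for each solver s, the list of rho_s(tau) over taus: the fraction of
    problems on which solver s is within a factor tau of the best solver."""
    n_probs, n_solvers = len(T), len(T[0])
    profiles = []
    for s in range(n_solvers):
        rho = []
        for tau in taus:
            count = 0
            for p in range(n_probs):
                best = min(T[p])  # best cost on problem p over all solvers
                if T[p][s] <= tau * best:
                    count += 1
            rho.append(count / n_probs)
        profiles.append(rho)
    return profiles
```

Plotting $\rho_s(\tau)$ against $\tau$ for each solver yields curves like those in the figures: higher curves indicate more robust and efficient solvers.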

The curves in Figures

Performance based on the number of iterations.

Performance based on the number of function evaluations.

Performance based on the number of gradient evaluations.

Performance based on CPU time.

cg-descent: the CG_DESCENT method with the approximate Wolfe line search proposed by Hager and Zhang [

mhs+: the method (

From Figures

The authors are indebted to the anonymous referee for the helpful suggestions, which improved the quality of this paper. The authors are also very grateful to Professor W. W. Hager and Dr. H. Zhang for their CG_DESCENT code and line search code. This work was supported by the NSF of China under Grant 10771057.