A Modified Three-Term Type CD Conjugate Gradient Algorithm for Unconstrained Optimization Problems

Conjugate gradient methods are well known and widely applied in many practical fields, and the CD conjugate gradient method is one of the classical variants. In this paper, a modified three-term CD conjugate gradient algorithm is proposed. Its main features are as follows: (i) a modified three-term CD conjugate gradient formula is presented; (ii) the given algorithm possesses the sufficient descent property and the trust region property; (iii) the algorithm is globally convergent for general functions under the modified weak Wolfe–Powell (MWWP) line search technique and a projection technique. The new algorithm performs well in numerical experiments, which show that the modified three-term CD conjugate gradient method is more competitive than the classical CD conjugate gradient method.


Introduction
Consider the unconstrained optimization problem min_{x ∈ R^n} f(x), where f: R^n ⟶ R is a continuously differentiable function. This kind of model is often used to solve problems in applied mathematics, economics, engineering, and other fields. Generally, the next iteration point is generated by x_{k+1} = x_k + α_k d_k, where x_{k+1} and x_k denote the next and current iteration points, respectively, α_k is a step length, and d_k is the search direction. The search direction generated by a conjugate gradient (CG) method is defined by d_{k+1} = −g_{k+1} + β_k d_k with d_0 = −g_0, where g_k = g(x_k) is the gradient of f(x) at x_k and β_k is a scalar parameter. Different choices of β_k generate different CG methods [1][2][3][4][5][6][7][8][9][10]. There are six classical forms of β_k [4, 6-9]: β_k^FR = ‖g_{k+1}‖²/‖g_k‖², β_k^PRP = g_{k+1}^T y_k/‖g_k‖², β_k^HS = g_{k+1}^T y_k/(d_k^T y_k), β_k^CD = −‖g_{k+1}‖²/(d_k^T g_k), β_k^DY = ‖g_{k+1}‖²/(d_k^T y_k), and β_k^LS = −g_{k+1}^T y_k/(d_k^T g_k), where y_k = g_{k+1} − g_k and ‖ · ‖ is the Euclidean norm. These formulas can be divided into two categories: one includes the PRP, HS, and LS methods, which have good numerical performance; the other includes the FR, CD, and DY methods, which have good theoretical convergence. Many scholars have applied these methods to nonlinear monotone equations and standard optimization problems, and good results have been achieved [11][12][13][14][15][16]. Zhang et al. [17] presented a modified PRP CG formula and proved that the modified PRP method is globally convergent with an Armijo-type line search.
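The generic CG iteration described above can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: it uses the Fletcher–Reeves coefficient β_k^FR, a simple backtracking Armijo step (a stand-in for the Wolfe-type searches discussed later), and a steepest-descent restart safeguard (an addition of ours, not part of the classical formula).

```python
import numpy as np

def armijo(f, x, d, g, rho=0.5, c=1e-4, alpha=1.0, max_back=50):
    # Backtrack until the Armijo sufficient-decrease condition holds.
    fx = f(x)
    for _ in range(max_back):
        if f(x + alpha * d) <= fx + c * alpha * g.dot(d):
            return alpha
        alpha *= rho
    return alpha

def cg_fr(f, grad, x0, eps=1e-6, max_iter=1000):
    # Generic CG loop: x_{k+1} = x_k + alpha_k d_k,
    # d_{k+1} = -g_{k+1} + beta_k d_k, with d_0 = -g_0.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:
            break
        alpha = armijo(f, x, d, g)
        x = x + alpha * d
        g_new = grad(x)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves beta_k
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:                # safeguard: restart if not descent
            d = -g_new
        g = g_new
    return x

# Convex quadratic test problem f(x) = 0.5 x^T A x.
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x.dot(A).dot(x)
grad = lambda x: A.dot(x)
x_star = cg_fr(f, grad, np.array([3.0, -2.0]))
```

Swapping in a different β_k formula changes only one line, which is why so many CG variants share this skeleton.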
Yuan et al. [18] proposed another modified PRP CG formula with c_k = θ_k / min{η_1 s_k^T y_k, η_2 θ_k}, θ_k = s_k^T y_k/‖s_k‖², s_k = x_{k+1} − x_k, η_1 ∈ (0, 1), and η_2 ∈ (0, 1). They obtained the global convergence of the modified PRP method with a modified weak Wolfe–Powell (MWWP) line search technique, proposed by Yuan et al. [19], whose parameters satisfy λ ∈ (0, 1/2), λ_1 ∈ (0, λ), and μ ∈ (λ, 1). Yuan et al. [16] obtained the global convergence of the PRP method by combining this MWWP line search with a projection technique in which ψ_k = x_k + α_k d_k is the trial point and χ > 0 is a parameter. With this projection technique, any unsatisfactory iteration point generated by the normal PRP algorithm is projected onto a surface to overcome possible failure to converge. Motivated by the above research, a modified three-term CD conjugate gradient algorithm is presented in this paper based on (7), (8), and (9). Its main properties are as follows: (i) a modified three-term CD conjugate gradient formula is presented; (ii) the given algorithm possesses the sufficient descent property and the trust region property; (iii) the algorithm is globally convergent for general functions under the MWWP line search technique and the projection technique. This paper is organized as follows: the next section introduces the modified CD formula and the corresponding algorithm; Section 3 proves the global convergence of the new algorithm; numerical experiments are reported in Section 4; conclusions are presented in Section 5. Throughout this paper, ‖ · ‖ denotes the Euclidean norm, and g(x_k) and g(x_{k+1}) are abbreviated as g_k and g_{k+1}, respectively.
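The MWWP conditions (7)-(8) themselves are elided in this extract, so as a hedged baseline here is a checker for the *standard* weak Wolfe-Powell conditions they modify: a sufficient-decrease (Armijo) test plus a curvature test. The extra α-dependent term that distinguishes MWWP from WWP is deliberately omitted; the parameter names `delta` and `sigma` are illustrative, not the paper's λ, λ_1, μ.

```python
import numpy as np

def wwp_satisfied(f, grad, x, d, alpha, delta=0.1, sigma=0.9):
    # Standard weak Wolfe-Powell check (NOT the MWWP variant):
    #   f(x + alpha d) <= f(x) + delta * alpha * g^T d      (sufficient decrease)
    #   g(x + alpha d)^T d >= sigma * g^T d                  (curvature)
    gd = grad(x).dot(d)
    armijo = f(x + alpha * d) <= f(x) + delta * alpha * gd
    curvature = grad(x + alpha * d).dot(d) >= sigma * gd
    return armijo and curvature

# Demo on f(x) = 0.5 ||x||^2 with a descent direction d = -x.
f = lambda z: 0.5 * float(z.dot(z))
grad = lambda z: z
x0, d0 = np.array([1.0]), np.array([-1.0])
ok = wwp_satisfied(f, grad, x0, d0, 1.0)
```

A very small step like α = 0.01 satisfies the decrease condition but fails the curvature condition, which is exactly what the curvature test is there to rule out.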

Motivation and Algorithm
The convergence of the CD conjugate gradient method has been proved [4]; however, its numerical results are worse than those of the PRP method and others. Therefore, it is worthwhile to propose a new search direction that improves the numerical performance of the CD method. Meanwhile, the sufficient descent property g_k^T d_k ≤ −η‖g_k‖² is significant for obtaining the convergence of a conjugate gradient method, so we also hope that the new method possesses this property. Inspired by the above discussion, a modified three-term CD conjugate gradient formula is designed with c_k = θ_k / min{η_2 s_k^T y_k, η_3 θ_k}, θ_k = ‖s_k‖², s_k = x_{k+1} − x_k, y_k = g_{k+1} − g_k, η_1 > 0, and η_2, η_3 ∈ (0, 1). The steps of the given algorithm are listed below.
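The exact three-term direction formula (11) is elided in this extract. As an illustration only, the sketch below combines the classical CD coefficient β_k^CD = ‖g_{k+1}‖²/(−d_k^T g_k) with the paper's scalar c_k in a *generic* three-term shape d_{k+1} = −g_{k+1} + β_k^CD d_k − c_k y_k; the actual combination in (11) may differ, so treat this as a plausible stand-in, not the authors' formula.

```python
import numpy as np

def c_k(s, y, eta2=0.5, eta3=0.5):
    # Paper's scalar: c_k = theta_k / min(eta2 * s^T y, eta3 * theta_k),
    # with theta_k = ||s_k||^2 (eta2, eta3 in (0,1)).
    theta = s.dot(s)
    return theta / min(eta2 * s.dot(y), eta3 * theta)

def three_term_cd_direction(g_new, g, d, s, y):
    # Hypothetical three-term combination (NOT necessarily formula (11)).
    beta_cd = g_new.dot(g_new) / (-d.dot(g))   # classical CD coefficient
    return -g_new + beta_cd * d - c_k(s, y) * y

# Small worked example.
g = np.array([1.0, 0.0])
d = -g                         # initial direction d_0 = -g_0
g_new = np.array([0.5, 0.0])
s = np.array([-0.5, 0.0])      # s_k = x_{k+1} - x_k
y = g_new - g
d_new = three_term_cd_direction(g_new, g, d, s, y)
```

Here θ_k = 0.25 and s^T y = 0.25, so min(0.125, 0.125) = 0.125 and c_k = 2, which the test below confirms.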

Convergence Analysis
In this section, we analyse the convergence of the proposed algorithm. The following assumptions are needed.
Assumption 1. (i) The objective function f(x) is twice continuously differentiable and bounded below. (ii) The gradient function g(x) is Lipschitz continuous; that is, there exists a constant L > 0 such that ‖g(x) − g(y)‖ ≤ L‖x − y‖ for all x, y ∈ R^n.
Lemma 1. Let the search direction d_k be generated by formula (11). Then the following relations hold: g_k^T d_k ≤ −η‖g_k‖² (13) and ‖d_k‖ ≤ ξ‖g_k‖ (14), where η, ξ > 0 are constants.
Proof. Using the definition of the parameter c_k, we analyse its value in two cases. Case 1: min{η_2 s_k^T y_k, η_3 θ_k} = η_2 s_k^T y_k. Similar to Lemma 1 of [20], s_k^T y_k ≥ ρ‖s_k‖² for some scalar ρ > 0, and the bound on c_k follows. Case 2: min{η_2 s_k^T y_k, η_3 θ_k} = η_3 θ_k. Then the bound follows directly. In both cases, (14) holds. The proof is complete.

Remark 1.
Relation (14) shows that the optimization algorithm possesses the trust region feature. The following theorem establishes the global convergence of Algorithm 1.

Theorem 1.
Assume that α_k, d_k, and x_k are generated by Algorithm 1 and that Lemma 1 holds. Then ‖g_k‖ ⟶ 0.
Proof. From the line search (8), we obtain a lower bound on the step length. Combining (ii) of Assumption 1 with Lemma 1, the line search (7), and the sufficient descent property (13), each iteration yields a sufficient decrease in f. Summing these inequalities from k = 0 to +∞ and noting that, by Assumption 1, f(x) is bounded below, we conclude that ‖g_k‖ ⟶ 0. The proof is complete.

Numerical Results
This section reports numerical experiments on some classical optimization problems, the nonlinear Muskingum model, and image restoration problems. All tests are coded in MATLAB R2014a and run on a PC with a 2.50 GHz CPU and 4.00 GB of memory under the Windows 10 operating system.

Normal Unconstrained Optimization Problems.
In this subsection, numerical experiments are carried out on test problems from [20], all of which are listed in Table 1. We compare Algorithm 1 with the classical CD conjugate gradient method (called Algorithm 2) and the classical PRP conjugate gradient method (called Algorithm 3). The detailed experimental data are listed in Table 2, and Figures 1-3 show the performance of these three algorithms with respect to CPU time, the number of iterations (NI), and the number of function and gradient evaluations (NFG). The parameters are set as e_1 = e_2 = 10^{−5} and ε = 10^{−6}, and the initial search direction is d_0 = −g_0.
Stop rules: the following Himmelblau stop rule [21] is used. If |f(x_k)| > e_1, set stop1 = |f(x_k) − f(x_{k+1})|/|f(x_k)|; otherwise, set stop1 = |f(x_k) − f(x_{k+1})|. For every problem, the program is stopped if ‖g(x_k)‖ < ε or stop1 < e_2 is satisfied, or when the number of iterations exceeds one thousand.
ALGORITHM 1: The modified three-term CD conjugate gradient algorithm.
Step 1: Give the initial point and parameters.
Step 2: If ‖g_k‖ ≤ ε, stop; otherwise, proceed to the next step.
Step 5: If the condition holds, go to Step 7; otherwise, go to Step 6.
Step 8: Calculate the search direction d_k by (11).
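The Himmelblau-type stopping test described above can be sketched directly. The relative-versus-absolute switch on |f(x_k)| is the common convention in the CG literature and is assumed here; the parameter defaults match the values quoted in the text.

```python
def should_stop(f_k, f_k1, g_norm, e1=1e-5, e2=1e-5, eps=1e-6):
    # Himmelblau rule: use the relative decrease when |f(x_k)| is not
    # too small, otherwise the absolute decrease.
    if abs(f_k) > e1:
        stop1 = abs(f_k - f_k1) / abs(f_k)
    else:
        stop1 = abs(f_k - f_k1)
    # Halt when the gradient is tiny or progress has stalled.
    return g_norm < eps or stop1 < e2
```

For example, two consecutive identical function values trigger the stall branch even when the gradient norm is still large.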
From the detailed experimental data in Table 2, it is obvious that most of the problems are solved quickly. For most problems, the proposed algorithm takes less CPU time, and progress is also made in NI and NFG. Overall, the proposed method is promising compared with the other algorithms. To present the numerical results more directly, the performance profile technique of Dolan and Moré [22] is used. In Figure 1, the curve of Algorithm 1 is always above those of the other algorithms. In Figure 2, the curves show the same trend: Algorithm 1 solves about 97% of the test problems at τ = 30, while Algorithm 2 solves only 89.9% and Algorithm 3 only 87%. Figure 3 shows a similar trend to Figure 2. All of these figures show that the modified CD conjugate gradient algorithm is more robust and effective than the normal CD and PRP methods. In summary, Algorithm 1 is more competitive than the others.
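The Dolan-Moré performance profiles behind Figures 1-3 are easy to compute: for each solver s and problem p, form the ratio r_{p,s} of that solver's cost to the best cost on that problem, then plot the fraction of problems with r_{p,s} ≤ τ. A minimal sketch:

```python
import numpy as np

def performance_profile(T, tau):
    # T: (n_problems, n_solvers) array of costs (CPU time, NI, or NFG).
    # Returns rho_s(tau) for each solver: the fraction of problems on
    # which solver s is within a factor tau of the best solver.
    T = np.asarray(T, dtype=float)
    ratios = T / T.min(axis=1, keepdims=True)
    return (ratios <= tau).mean(axis=0)

# Toy data: solver 0 wins on 2 of 3 problems, solver 1 on the other.
T = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [1.0, 3.0]])
rho_at_1 = performance_profile(T, 1.0)
```

At τ = 1 the profile reads off each solver's win rate; as τ grows, a robust solver's curve approaches 1, which is the sense in which Algorithm 1's curve lying above the others indicates robustness.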

The Muskingum Model.
It is generally known that parameter estimation is a significant task in engineering applications. In this subsection, the nonlinear Muskingum model is discussed as a common example of such an application. The Muskingum model [23] relates channel storage to a weighted combination of inflow and outflow.
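The model equation itself is elided in this extract. A widely used nonlinear Muskingum form, assumed here and not necessarily the paper's exact statement, is S_t = K (x I_t + (1 − x) Q_t)^m, where I_t and Q_t are inflow and outflow; parameter estimation then fits (K, x, m) by least squares against observed storage, which yields an unconstrained problem of the kind the paper's algorithm targets.

```python
import numpy as np

def muskingum_residuals(params, inflow, outflow, storage):
    # Residuals of the assumed nonlinear storage equation
    # S_t = K * (x * I_t + (1 - x) * Q_t) ** m.
    K, x, m = params
    pred = K * (x * inflow + (1.0 - x) * outflow) ** m
    return pred - storage

def muskingum_sse(params, inflow, outflow, storage):
    # Sum-of-squared-errors objective for parameter estimation.
    r = muskingum_residuals(params, inflow, outflow, storage)
    return float(r.dot(r))

# Synthetic data generated from the true parameters (K, x, m) = (1, 0.5, 1).
I_obs = np.array([2.0, 3.0])
Q_obs = np.array([1.0, 2.0])
S_obs = 0.5 * I_obs + 0.5 * Q_obs
```

Minimizing `muskingum_sse` over (K, x, m) with any unconstrained solver recovers the parameters; at the true parameters the objective is exactly zero on this synthetic data.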

Image Restoration Problems.
In this subsection, the above algorithms are applied to image restoration problems, in which the original images are corrupted by impulse noise. Such problems are regarded as among the most difficult in optimization. The parameter settings are similar to those of the previous subsections, and the program is stopped when (|‖f_{k+1}‖ − ‖f_k‖|/‖f_k‖) < 10^{−3} or (|‖x_{k+1}‖ − ‖x_k‖|/‖x_k‖) < 10^{−3} holds. Three images are selected as processing objects: Baboon (512×512), Barbara (512×512), and Lena (512×512). The detailed performance is shown in Figures 7-9, and the CPU time taken to process the images is listed in Table 4.
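The stopping test quoted above combines two relative-change criteria, one on the objective value and one on the iterate norm. A direct sketch:

```python
import numpy as np

def restoration_converged(f_prev, f_curr, x_prev, x_curr, tol=1e-3):
    # Stop when either the relative change in |f| or the relative change
    # in ||x|| drops below tol, matching the rule quoted in the text.
    rel_f = abs(abs(f_curr) - abs(f_prev)) / abs(f_prev)
    rel_x = abs(np.linalg.norm(x_curr) - np.linalg.norm(x_prev)) \
            / np.linalg.norm(x_prev)
    return rel_f < tol or rel_x < tol
```

The "or" makes the test lenient: a stalled objective ends the run even if the iterate is still drifting, which is a sensible trade-off for large restoration problems where extra iterations are expensive.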
It is easy to see that all the algorithms succeed on the image restoration problems. The results in Table 4 reveal that the CPU time of Algorithm 1 is less than that of the other algorithms, whether for 30% noise, 50% noise, or 70% noise problems.

Conclusion
In this paper, a modified three-term CD conjugate gradient algorithm is presented with the following features: (i) the sufficient descent property holds; (ii) the trust region feature also holds; (iii) the algorithm is globally convergent for general functions under the MWWP line search technique and the projection technique; and (iv) numerical results reveal that the new algorithm is more competitive than the normal CD and PRP algorithms. In recent years, there has been considerable research on other types of CG methods, while the CD method has received comparatively little attention and should not be ignored. Much work remains for the future: whether this method is suitable for other line search techniques (such as the Armijo line search or nonmonotone line searches), and whether other, better modifications can further improve the numerical results of the CD method. All of these questions are worth studying in future work.

Data Availability
The data used to support the findings of this study are included within the article.

Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.