
To find a solution of an unconstrained optimization problem, a conjugate gradient (CG) method is normally used, since it does not require the storage of second-derivative information, unlike Newton's method or the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method. Recently, a modification of the Polak–Ribière method with a new restart condition was proposed, yielding the so-called AZPRP method. In this paper, we propose a new modification of the AZPRP CG method for solving large-scale unconstrained optimization problems, based on a modified restart condition. The new parameter satisfies the descent property, and global convergence is established under the strong Wolfe–Powell line search. The numerical results show that the new CG method is strongly competitive with the CG_Descent method. The comparisons are made on a set of more than 140 standard functions from the CUTEst library, in terms of the number of iterations and CPU time.

The conjugate gradient (CG) method aims to find solutions of unconstrained optimization problems. Suppose that the following optimization problem is considered:
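The problem in question is the unconstrained minimization of a smooth function f. As context for the discussion that follows, a generic nonlinear CG iteration can be sketched as below. The coefficient beta is the classical Polak–Ribière–Polyak (PRP+) choice, shown for illustration only; the paper's method is a modified, restarted variant of it, and the Armijo backtracking here is a simplification of the strong Wolfe–Powell line search used in the paper.

```python
# Generic nonlinear CG sketch for min_x f(x), with the classical PRP+
# coefficient (illustrative only; not the paper's modified formula).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cg_minimize(f, grad, x, max_iter=500, tol=1e-6):
    g = grad(x)
    d = [-gi for gi in g]                      # start with steepest descent
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:             # gradient small: converged
            break
        if dot(g, d) >= 0:                     # safeguard: restart if d is
            d = [-gi for gi in g]              # not a descent direction
        alpha, fx, slope = 1.0, f(x), dot(g, d)
        # backtrack until the Armijo sufficient-decrease condition holds
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # PRP+: beta = max(0, g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2)
        beta = max(0.0, dot(g_new, [a - b for a, b in zip(g_new, g)]) / dot(g, g))
        d = [-a + beta * b for a, b in zip(g_new, d)]
        g = g_new
    return x

# usage: minimize the convex quadratic f(x) = x1^2 + 10 * x2^2
f = lambda x: x[0] ** 2 + 10 * x[1] ** 2
grad = lambda x: [2 * x[0], 20 * x[1]]
xstar = cg_minimize(f, grad, [3.0, -2.0])
```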

To obtain the step length, an inexact line search is normally used, since the exact line search, which is defined as follows, is computationally expensive in practice,

The weak Wolfe-Powell (WWP) line search is defined by (
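Both the weak (WWP) and strong Wolfe–Powell (SWP) conditions can be checked directly for a candidate step length; the sketch below uses the usual parameters 0 < delta < sigma < 1 (the particular defaults delta = 1e-4, sigma = 0.1 are common choices, assumed here rather than taken from the paper):

```python
# Check the weak and strong Wolfe-Powell conditions for a step alpha
# along direction d from point x.

def wolfe_conditions(f, grad, x, d, alpha, delta=1e-4, sigma=0.1):
    """Return (weak_ok, strong_ok) for the WWP and SWP conditions."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    x_new = [xi + alpha * di for xi, di in zip(x, d)]
    slope0 = dot(grad(x), d)        # directional derivative at alpha = 0
    slope1 = dot(grad(x_new), d)    # directional derivative at alpha
    armijo = f(x_new) <= f(x) + delta * alpha * slope0   # sufficient decrease
    weak = armijo and slope1 >= sigma * slope0           # WWP curvature
    strong = armijo and abs(slope1) <= sigma * abs(slope0)  # SWP curvature
    return weak, strong

# usage: f(x) = x^2 in 1D, stepping from x = 1 along d = -grad(x)
f = lambda x: x[0] ** 2
grad = lambda x: [2 * x[0]]
weak, strong = wolfe_conditions(f, grad, [1.0], [-2.0], 0.5)
```

The SWP condition differs from WWP only in bounding the magnitude of the new directional derivative, which rules out steps that overshoot the minimizer along d.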

Powell [

Al

For discussions of the speed, memory requirements, numbers of iterations, function and gradient evaluations, and robustness of methods for unconstrained optimization (the considerations that have prompted the development of CG methods), readers are referred to references [

Alhawarat et al. [

Dai and Liao [

The new formula is a modification of

We obtain the following relations (Algorithm
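The general shape of a restarted PRP-type coefficient can be sketched as follows. The restart test shown is Powell's classical criterion (restart when successive gradients are far from orthogonal, |g_{k+1}^T g_k| > 0.2 ||g_{k+1}||^2); the paper's AZPRP-type restart condition is a different, modified test that is not reproduced here, so this is an illustrative stand-in only:

```python
# Restarted PRP-type coefficient: return the PRP value unless a restart
# test fires, in which case beta = 0 restarts along steepest descent.
# The restart test below is Powell's classical criterion, used here
# only to illustrate the structure; it is NOT the paper's condition.

def restarted_prp_beta(g_new, g_old):
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    # classical PRP coefficient
    prp = dot(g_new, [a - b for a, b in zip(g_new, g_old)]) / dot(g_old, g_old)
    # Powell restart: successive gradients insufficiently orthogonal
    if abs(dot(g_new, g_old)) > 0.2 * dot(g_new, g_new):
        return 0.0
    return max(prp, 0.0)

# orthogonal successive gradients: no restart, plain PRP value
beta_keep = restarted_prp_beta([0.0, 1.0], [1.0, 0.0])
# nearly parallel successive gradients: restart, beta = 0
beta_restart = restarted_prp_beta([1.0, 0.0], [1.0, 0.0])
```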

The level set

In some neighbourhoods

This assumption shows that there exists a positive constant
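Such assumptions normally take the following standard form in the CG convergence literature, stated here for the reader's convenience (the constants L and gamma play the roles described in the text only by this standard convention):

```latex
% (i) the level set is bounded:
\Omega = \{\, x \in \mathbb{R}^n : f(x) \le f(x_0) \,\} \ \text{is bounded};
% (ii) in some neighbourhood \mathcal{N} of \Omega, the gradient is
% Lipschitz continuous:
\|\nabla f(x) - \nabla f(y)\| \le L \|x - y\|, \quad \forall\, x, y \in \mathcal{N}.
% A standard consequence is the existence of \gamma > 0 with
\|\nabla f(x)\| \le \gamma, \quad \forall\, x \in \Omega.
```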

The descent condition
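In the CG literature, the descent and sufficient-descent conditions are normally written as follows; this standard form is the one being referred to:

```latex
g_k^{T} d_k < 0 \quad \text{(descent)}, \qquad
g_k^{T} d_k \le -c\, \|g_k\|^2, \ \ c > 0 \quad \text{(sufficient descent)}.
```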

(

The following theorem demonstrates that

The following theorem shows that

Let

Algorithm

Descent condition is (

By multiplying () by

Divide (

From (

As

The proof is complete.

Let

By multiplying (

Zoutendijk [
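Zoutendijk's condition, on which convergence proofs of this kind rely, states that under the assumptions above and a Wolfe-type line search,

```latex
\sum_{k \ge 0} \frac{\left(g_k^{T} d_k\right)^2}{\|d_k\|^2} < \infty .
```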

Let Assumption

Suppose Assumption

The proof is similar to that presented in [

We will prove the theorem by contradiction. Assume that the conclusion is not true, then a constant
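In this standard contradiction argument, the assumed constant is a positive lower bound on the gradient norms,

```latex
\exists\, \varepsilon > 0 \ \text{such that} \ \|g_k\| \ge \varepsilon \quad \forall\, k \ge 0,
```

which, combined with Zoutendijk's condition, eventually forces a contradiction.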

Squaring both sides of equation (

Divide (

Using (

Repeating the process for (

From (

Therefore,

This result contradicts (

To investigate the effectiveness of the new parameter, several test problems in Table

The test functions.

| Function | Dimension | CG_Descent 5.3: iterations | CG_Descent 5.3: CPU time | New method: iterations | New method: CPU time |
|---|---|---|---|---|---|
| AKIVA | 2 | 10 | 0.02 | 8 | 0.02 |
| ALLINITU | 4 | 12 | 0.02 | 9 | 0.02 |
| ARGLINA | 200 | 1 | 0.02 | 1 | 0.02 |
| ARGLINB | 200 | 5 | 0.02 | 6 | 0.11 |
| ARWHEAD | 5000 | 7 | 0.02 | 6 | 0.03 |
| BARD | 3 | 16 | 0.02 | 12 | 0.02 |
| BDQRTIC | 5000 | 136 | 0.58 | 161 | 0.75 |
| BEALE | 2 | 15 | 0.02 | 11 | 0.02 |
| BIGGS6 | 6 | 27 | 0.02 | 24 | 0.02 |
| BOX3 | 3 | 11 | 0.02 | 10 | 0.02 |
| BOX | 1000 | 8 | 0.08 | 7 | 0.08 |
| BRKMCC | 2 | 5 | 0.02 | 5 | 0.02 |
| BROWNAL | 200 | 9 | 0.02 | 9 | 0.02 |
| BROWNBS | 2 | 13 | 0.02 | 10 | 0.02 |
| BROWNDEN | 4 | 16 | 0.02 | 16 | 0.02 |
| BROYDN7D | 5000 | 1411 | 5.47 | 64 | 0.37 |
| BRYBND | 5000 | 85 | 0.38 | 39 | 0.22 |
| CHAINWOO | 4000 | 318 | 0.866 | 379 | 1.08 |
| CHNROSNB | 50 | 287 | 0.02 | 340 | 0.02 |
| CLIFF | 2 | 18 | 0.02 | 10 | 0.02 |
| COSINE | 10000 | 11 | 0.19 | 14 | 0.26 |
| CRAGGLVY | 5000 | 103 | 0.45 | 104 | 0.48 |
| CUBE | 2 | 32 | 0.02 | 17 | 0.02 |
| CURLY10 | 10000 | 47808 | 173.7 | 42454 | 145.16 |
| CURLY20 | 10000 | 66587 | 383.94 | 67279 | 366.03 |
| CURLY30 | 10000 | 79030 | 639.63 | 74375 | 509.59 |
| DECONVU | 63 | 400 | 0.02 | 227 | 0.02 |
| DENSCHNA | 2 | 9 | 0.02 | 6 | 0.02 |
| DENSCHNB | 2 | 7 | 0.02 | 6 | 0.02 |
| DENSCHNC | 2 | 12 | 0.02 | 11 | 0.02 |
| DENSCHND | 3 | 47 | 0.02 | 14 | 0.02 |
| DENSCHNE | 3 | 18 | 0.02 | 12 | 0.02 |
| DENSCHNF | 2 | 8 | 0.02 | 9 | 0.02 |
| DIXMAANA | 3000 | 7 | 0.02 | 6 | 0.02 |
| DIXMAANB | 3000 | 6 | 0.02 | 6 | 0.02 |
| DIXMAANC | 3000 | 6 | 0.02 | 6 | 0.02 |
| DIXMAAND | 3000 | 7 | 0.02 | 8 | 0.02 |
| DIXMAANE | 3000 | 222 | 0.33 | 218 | 0.33 |
| DIXMAANF | 3000 | 161 | 0.13 | 116 | 0.09 |
| DIXMAANG | 3000 | 157 | 0.12 | 173 | 0.14 |
| DIXMAANH | 3000 | 173 | 0.22 | 190 | 0.2 |
| DIXMAANI | 3000 | 3856 | 4.25 | 3160 | 3.34 |
| DIXMAANJ | 3000 | 327 | 0.36 | 360 | 0.39 |
| DIXMAANK | 3000 | 283 | 0.28 | 416 | 0.36 |
| DIXMAANL | 3000 | 237 | 0.2 | 399 | 0.36 |
| DIXON3DQ | 10000 | 10000 | 19.12 | 10000 | 19.12 |
| DJTL | 2 | 82 | 0.02 | 75 | 0.02 |
| DQDRTIC | 5000 | 5 | 0.02 | 5 | 0.02 |
| DQRTIC | 5000 | 17 | 0.03 | 15 | 0.03 |
| EDENSCH | 2000 | 26 | 0.03 | 32 | 0.05 |
| EG2 | 1000 | 5 | 0.02 | 3 | 0.02 |
| EIGENALS | 2550 | 10083 | 178.36 | 7247 | 133.4 |
| EIGENBLS | 2550 | 15301 | 237 | 18846 | 290.3 |
| EIGENCLS | 2652 | 10136 | 174.19 | 11152 | 186.86 |
| ENGVAL1 | 5000 | 27 | 0.06 | 23 | 0.12 |
| ENGVAL2 | 3 | 26 | 0.02 | 26 | 0.02 |
| ERRINROS | 50 | 380 | 0.02 | 95504 | 2.36 |
| EXPFIT | 2 | 13 | 0.02 | 9 | 0.02 |
| EXTROSNB | 1000 | 3808 | 1.25 | 2370 | 0.87 |
| FLETCBV2 | 5000 | 1 | 0.02 | 1 | 0.02 |
| FLETCHCR | 1000 | 152 | 0.05 | 84 | 0.05 |
| FMINSRF2 | 5625 | 346 | 1.09 | 485 | 1.4 |
| FMINSURF | 5625 | 473 | 1.51 | 542 | 1.64 |
| FREUROTH | 5000 | 25 | 0.11 | 29 | 0.19 |
| GENROSE | 500 | 1078 | 0.17 | 2098 | 0.45 |
| GROWTHLS | 3 | 156 | 0.02 | 109 | 0.02 |
| GULF | 3 | 37 | 0.02 | 33 | 0.02 |
| HAIRY | 2 | 36 | 0.02 | 17 | 0.02 |
| HATFLDD | 3 | 20 | 0.02 | 17 | 0.02 |
| HATFLDE | 3 | 30 | 0.02 | 13 | 0.02 |
| HATFLDFL | 3 | 39 | 0.02 | 21 | 0.02 |
| HEART6LS | 6 | 684 | 0.02 | 375 | 0.02 |
| HEART8LS | 8 | 249 | 0.02 | 253 | 0.02 |
| HELIX | 3 | 23 | 0.02 | 23 | 0.02 |
| HIELOW | 3 | 14 | 0.02 | 13 | 0.05 |
| HILBERTA | 2 | 2 | 0.02 | 2 | 0.02 |
| HILBERTB | 10 | 4 | 0.02 | 4 | 0.02 |
| HIMMELBB | 2 | 10 | 0.02 | 4 | 0.02 |
| HIMMELBF | 4 | 26 | 0.02 | 23 | 0.02 |
| HIMMELBG | 2 | 8 | 0.02 | 7 | 0.02 |
| HIMMELBH | 2 | 7 | 0.02 | 5 | 0.02 |
| HUMPS | 2 | 52 | 0.02 | 45 | 0.02 |
| JENSMP | 2 | 15 | 0.02 | 12 | 0.02 |
| JIMACK | 35449 | 8314 | 1182.25 | 7297 | 1030.3 |
| KOWOSB | 4 | 17 | 0.02 | 16 | 0 |
| LIARWHD | 5000 | 21 | 0.03 | 15 | 0.05 |
| LOGHAIRY | 2 | 27 | 0.02 | 26 | 0.02 |
| MANCINO | 100 | 11 | 0.08 | 11 | 0.08 |
| MARATOSB | 2 | 1145 | 0.02 | 589 | 0.02 |
| MEXHAT | 2 | 20 | 0.02 | 14 | 0.02 |
| MOREBV | 5000 | 161 | 0.41 | 161 | 0.38 |
| MSQRTALS | 1024 | 2905 | 8.64 | 2788 | 9.08 |
| MSQRTBLS | 1024 | 2280 | 6.91 | 2181 | 6.84 |
| NCB20B | 500 | 2035 | 46.36 | 4181 | 70.16 |
| NCB20 | 5010 | 879 | 11.83 | 959 | 13 |
| NONCVXU2 | 5000 | 6610 | 15.89 | 6379 | 15.92 |
| NONDIA | 5000 | 7 | 0.03 | 7 | 0.03 |
| NONDQUAR | 5000 | 1942 | 2.45 | 3058 | 3.88 |
| OSBORNEA | 5 | 94 | 0.02 | 82 | 0.02 |
| OSBORNEB | 11 | 62 | 0.02 | 57 | 0.02 |
| PALMER1C | 8 | 11 | 0.02 | 12 | 0.02 |
| PALMER1D | 7 | 11 | 0.02 | 10 | 0.02 |
| PALMER2C | 8 | 11 | 0.02 | 11 | 0.02 |
| PALMER3C | 8 | 11 | 0.02 | 11 | 0.02 |
| PALMER4C | 8 | 11 | 0.02 | 11 | 0.02 |
| PALMER5C | 6 | 6 | 0.02 | 6 | 0.02 |
| PALMER6C | 8 | 11 | 0.02 | 11 | 0.02 |
| PALMER7C | 8 | 11 | 0.02 | 11 | 0.02 |
| PALMER8C | 8 | 11 | 0.02 | 11 | 0.02 |
| PARKCH | 15 | 672 | 29.45 | 823 | 39.39 |
| PENALTY1 | 1000 | 28 | 0.02 | 41 | 0.02 |
| PENALTY2 | 200 | 191 | 0.05 | 200 | 0.03 |
| PENALTY3 | 200 | 99 | 1.78 | 88 | 1.98 |
| POWELLSG | 5000 | 26 | 0.02 | 27 | 0.05 |
| POWER | 10000 | 372 | 0.76 | 543 | 1.2 |
| QUARTC | 5000 | 17 | 0.03 | 15 | 0.02 |
| ROSENBR | 2 | 34 | 0.02 | 28 | 0.02 |
| S308 | 2 | 8 | 0.02 | 7 | 0.02 |
| SCHMVETT | 5000 | 43 | 0.23 | 40 | 0.27 |
| SENSORS | 100 | 21 | 0.25 | 50 | 0.8 |
| SINEVAL | 2 | 64 | 0.02 | 46 | 0.02 |
| SINQUAD | 5000 | 14 | 0.09 | 15 | 0.08 |
| SISSER | 2 | 6 | 0.02 | 5 | 0.02 |
| SNAIL | 2 | 100 | 0.02 | 61 | 0.02 |
| SPARSINE | 5000 | 18358 | 73 | 21328 | 83 |
| SPARSQUR | 10000 | 28 | 0.31 | 35 | 0.98 |
| SPMSRTLS | 4999 | 203 | 0.59 | 219 | 0.61 |
| SROSENBR | 5000 | 11 | 0.02 | 9 | 0.03 |
| STRATEC | 10 | 462 | 19.98 | 170 | 6.23 |
| TESTQUAD | 5000 | 1577 | 1.52 | 1573 | 1.42 |
| TOINTGOR | 50 | 135 | 0.02 | 120 | 0.02 |
| TOINTGSS | 5000 | 4 | 0.02 | 5 | 0.02 |
| TOINTPSP | 50 | 143 | 0.02 | 157 | 0.02 |
| TOINTQOR | 50 | 29 | 0.02 | 29 | 0.02 |
| TQUARTIC | 5000 | 14 | 0.03 | 11 | 0.03 |
| TRIDIA | 5000 | 782 | 0.84 | 783 | 1.11 |
| VAREIGVL | | 23 | 0.02 | 24 | 0.02 |
| VIBRBEAM | 50 | 138 | 0.02 | 98 | 0.02 |
| WATSON | 8 | 49 | 0.02 | 61 | 0.02 |
| WOODS | 12 | 22 | 0.06 | 22 | 0.03 |
| YFITU | 4000 | 84 | 0.02 | 68 | 0.02 |
| ZANGWIL2 | 3 | 1 | 0.02 | 1 | 0.02 |

The CG_Descent 5.3 results are obtained by running CG_Descent 6.8 with the memory parameter set to zero. The host computer is an AMD A4-7210 with 4 GB of RAM. The results are shown in Figures

Performance measure based on the number of iterations.

Performance measure based on the CPU time.
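Performance figures of this kind are typically Dolan–Moré performance profiles: for each solver and problem, the cost (iterations or CPU time) is divided by the best cost any solver achieved on that problem, and the profile plots the fraction of problems a solver handles within a factor tau of the best. A minimal sketch, with hypothetical data (not the paper's results):

```python
# Dolan-More performance profile: for each solver s, rho_s(tau) is the
# fraction of problems whose cost ratio (solver cost / best cost on
# that problem) is at most tau.

def performance_profile(costs, taus):
    """costs: dict solver -> list of per-problem costs (same problem order).
    Returns dict solver -> list of rho_s(tau) values."""
    n = len(next(iter(costs.values())))
    best = [min(c[p] for c in costs.values()) for p in range(n)]
    ratios = {s: [c[p] / best[p] for p in range(n)] for s, c in costs.items()}
    return {s: [sum(r <= tau for r in rs) / n for tau in taus]
            for s, rs in ratios.items()}

# usage with two hypothetical solvers on three problems
profiles = performance_profile(
    {"new_cg": [10, 20, 30], "cg_descent": [12, 18, 60]},
    taus=[1.0, 1.5, 2.0],
)
```

A solver whose curve lies above another's for all tau is the more efficient and more robust of the two on that test set.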

In this section, we present the six-hump camel-back function, a multimodal function used to test the efficiency of optimization algorithms. The function is defined as follows:
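The six-hump camel-back function is the standard two-variable test function below; it has six local minima, two of which are global, located at approximately (0.0898, -0.7126) and (-0.0898, 0.7126) with minimum value about -1.0316:

```python
# Six-hump camel-back function:
# f(x, y) = (4 - 2.1 x^2 + x^4 / 3) x^2 + x y + (-4 + 4 y^2) y^2

def six_hump_camel(x, y):
    return (4 - 2.1 * x**2 + x**4 / 3) * x**2 + x * y + (-4 + 4 * y**2) * y**2

# evaluate near one of the two global minimizers
value = six_hump_camel(0.0898, -0.7126)
```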

The number of variables (

Six-hump camel back function in 3D.

Finally, note that the CG method can be applied to image restoration, neural network training, and other problems. For more information, the reader can refer to [

In this study, a modified version of the CG algorithm (A) is suggested and its performance is investigated. The modified formula is restarted based on the value of the Lipschitz constant. Global convergence is established using the strong Wolfe–Powell (SWP) line search. Our numerical results show that the new coefficient produces efficient, competitive results compared with other methods, such as CG_Descent 5.3. In the future, the new version of the CG method will be combined with a feed-forward neural network trained by the back-propagation (BP) algorithm, to improve the training process and produce a fast multilayer training algorithm. This will help reduce the time needed to train neural networks when the training samples are massive.

The data used to support the findings of this study are included within the article.

The authors declare that they have no conflicts of interest regarding the publication of this paper.

The authors would like to thank Universiti Malaysia Terengganu for supporting this work.