One-parameter families of Newton's iterative method for solving nonlinear equations, together with their extension to unconstrained optimization problems, are presented in this paper. These methods are derived by approximating the function in the vicinity of the root by a straight line and by a parabolic curve. The variants presented are found to perform better than Newton's method and, in addition, to overcome its limitations.

Newton’s method is one of the most fundamental tools in computational mathematics, operations research, optimization, and control theory. It has many applications in management science, industrial and financial research, chaos and fractals, dynamical systems, variational inequalities and equilibrium-type problems, stability analysis, data mining, and even random operator equations. Its role in optimization theory can hardly be overstated, as the method is the basis for the most effective procedures in linear and nonlinear programming. For a more detailed survey, one can refer to [

Let

A large number of iterative methods have been developed for finding the solution of single-variable nonlinear equations as well as for the solution of systems of nonlinear equations. One important reason for this abundance is that no single method works for all types of problems. For a more detailed survey of the most important of these methods, many excellent textbooks are available in the literature [

Newton’s method is probably the simplest, most flexible, best-known, and most widely used numerical method. However, as is well known, a major difficulty in its application is the selection of the initial guess, which must be chosen sufficiently close to the true solution in order to guarantee convergence. Finding a criterion for choosing the initial guess is quite cumbersome, and the method may fail miserably if, at any stage of the computation, the derivative of the function
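
This sensitivity is easy to reproduce. The sketch below is an illustrative implementation (not code from the paper); the test function f(x) = x·e^(-x), whose derivative vanishes at x = 1, is a standard example of the difficulty: a guess left of 1 converges to the root x = 0, while guesses at or beyond 1 make the Newton step blow up or diverge.

```python
import math

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Classical Newton iteration: x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for n in range(max_iter):
        d = df(x)
        if d == 0.0:
            raise ZeroDivisionError("f'(x) = 0: Newton step undefined")
        step = f(x) / d
        x -= step
        if abs(step) < tol:
            return x, n + 1
    raise RuntimeError("no convergence within max_iter")

# f(x) = x*exp(-x) has its only root at x = 0, but f'(x) = (1 - x)*exp(-x)
# vanishes at x = 1: x0 = 0.5 converges, while x0 = 2 sends the iterates
# off to +infinity.
f = lambda x: x * math.exp(-x)
df = lambda x: (1.0 - x) * math.exp(-x)
root, iters = newton(f, df, 0.5)
```

From x0 = 2 the Newton step is x/(1 - x), so the iterates grow without bound and the loop exhausts max_iter.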

Also for solving nonlinear, univariate, and unconstrained optimization problems, Newton’s method [

The purpose of this paper is to eliminate the defects of Newton’s method through a simple modification of the iteration process. Numerical results indicate that the proposed iterative formulae are effective and comparable to the well-known Newton’s method. Furthermore, unlike Newton’s method, the presented techniques have guaranteed convergence, while remaining just as simple.

In this section, we shall derive two families by applying approximation via a straight line and via a parabolic curve.

Consider the equation of a straight line having slope equal to

The general formula for successive approximations is given by

Consider a parabola in the form

In (

Exploiting the main idea of Mamta et al. [

If we let
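
The closed-form expressions for these families are elided in this excerpt. As a hedged sketch only (an assumption about the general shape of such families, not the paper's exact formulae), the code below implements a representative one-parameter Newton-type family in the spirit of Mamta et al., x_{n+1} = x_n - f(x_n)/(f'(x_n) - p·f(x_n)); p = 0 recovers classical Newton, while a suitably signed p keeps the denominator away from zero where f' vanishes.

```python
import math

def modified_newton(f, df, x0, p=1.0, tol=1e-12, max_iter=100):
    """Illustrative one-parameter Newton-type family (see lead-in):
    x_{n+1} = x_n - f(x_n) / (f'(x_n) - p*f(x_n)).
    p = 0 recovers the classical Newton iteration."""
    x = x0
    for n in range(max_iter):
        denom = df(x) - p * f(x)
        if denom == 0.0:
            raise ZeroDivisionError("denominator vanished; adjust p")
        step = f(x) / denom
        x -= step
        if abs(step) < tol:
            return x, n + 1
    raise RuntimeError("no convergence within max_iter")

# f(x) = x*exp(-x): classical Newton (p = 0) diverges from x0 = 2, but with
# p = -1 the denominator reduces to exp(-x) and the iteration reaches x = 0.
f = lambda x: x * math.exp(-x)
df = lambda x: (1.0 - x) * math.exp(-x)
root, iters = modified_newton(f, df, 2.0, p=-1.0)
```

Here f'(x) + f(x) = (1 - x)e^(-x) + x·e^(-x) = e^(-x), which is positive everywhere, so the step is defined at every iterate even though f' vanishes at x = 1.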

In this section, we shall extend the formulae (

Consider the nonlinear optimization problem: minimize

Assume that

Similarly, it is possible to construct a quadratic function

Adopting the same procedure as for the exponential iteration formulae, we can also derive exponential, quadratically convergent iterative formulae for unconstrained optimization. Recently, Kahya [
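
For intuition, Newton's iteration for one-dimensional minimization replaces f by f' and f' by f'': x_{n+1} = x_n - f'(x_n)/f''(x_n). The sketch below is an illustrative implementation of that standard form, with a one-parameter denominator modification included as an assumption mirroring the root-finding case (it is not the paper's exact formula); the test problem, minimize f(x) = x - ln x with minimizer x* = 1, is also an illustrative choice.

```python
def newton_min(grad, hess, x0, p=0.0, tol=1e-12, max_iter=100):
    """Newton-type iteration for one-dimensional unconstrained minimization:
    x_{n+1} = x_n - f'(x_n) / (f''(x_n) - p*f'(x_n)).
    p = 0 is the classical Newton step for optimization; nonzero p is the
    illustrative one-parameter modification described in the lead-in."""
    x = x0
    for n in range(max_iter):
        denom = hess(x) - p * grad(x)
        if denom == 0.0:
            raise ZeroDivisionError("denominator vanished; adjust p")
        step = grad(x) / denom
        x -= step
        if abs(step) < tol:
            return x, n + 1
    raise RuntimeError("no convergence within max_iter")

# Minimize f(x) = x - ln(x) on (0, inf); the unique minimizer is x* = 1.
grad = lambda x: 1.0 - 1.0 / x   # f'(x)
hess = lambda x: 1.0 / x ** 2    # f''(x)
xstar, iters = newton_min(grad, hess, 0.5)
```

Note that the iteration solves f'(x) = 0, so it needs the second derivative in the denominator; a stationary point found this way should still be checked for being a minimum.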

Here, we shall present the mathematical proof of the order of convergence of the iterative formulae (

Let

Since

Let

Using (
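
Beyond the analytic proof, the order can be checked numerically with the standard computational order of convergence, rho ≈ ln(|e_{n+1}|/|e_n|) / ln(|e_n|/|e_{n-1}|), computed from three consecutive errors e_n = |x_n - x*|. A minimal check follows; the test equation x^2 - 2 = 0 and the classical Newton iteration are illustrative choices, used only to show the estimator recovering the expected quadratic order.

```python
import math

def newton_trace(f, df, x0, n_steps):
    """Run classical Newton iteration and record all iterates."""
    xs = [x0]
    for _ in range(n_steps):
        xs.append(xs[-1] - f(xs[-1]) / df(xs[-1]))
    return xs

def coc(xs, root):
    """Computational order of convergence from the last three usable errors
    (errors at machine-precision level are discarded)."""
    e = [abs(x - root) for x in xs if abs(x - root) > 1e-14]
    return math.log(e[-1] / e[-2]) / math.log(e[-2] / e[-3])

# x^2 - 2 = 0 has the simple root sqrt(2); the estimate should be close to 2.
f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x
xs = newton_trace(f, df, 1.5, 5)
order = coc(xs, math.sqrt(2.0))
```

For a second-order method the estimate settles near 2 once the iterates enter the region of fast convergence.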

Here we consider some examples to compare the number of iterations required by the traditional Newton’s method and by its modifications, namely, (
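
The iteration counts reported in the tables below depend on a stopping rule; a common choice, and the one sketched here (an assumption, since the precise criterion is not reproduced in this excerpt), is to stop when successive iterates differ by less than a fixed tolerance.

```python
import math

def count_iterations(step, x0, tol=1e-12, max_iter=100):
    """Apply x <- step(x) until successive iterates differ by less than tol;
    return the final iterate and the number of iterations used."""
    x = x0
    for n in range(1, max_iter + 1):
        x_new = step(x)
        if abs(x_new - x) < tol:
            return x_new, n
        x = x_new
    return x, max_iter  # did not converge within max_iter

# Illustrative equation f(x) = x**2 - 2 (the paper's test functions are not
# reproduced in this excerpt); count classical Newton iterations from x0 = 1.
f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x
newton_step = lambda x: x - f(x) / df(x)
root_nm, n_newton = count_iterations(newton_step, 1.0)
```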

Test problem (nonlinear equations).

No. | Examples | Initial guesses | Root
---|---|---|---
 | | 0.8 | 0.111832559108734
 | | 1.0 |
 | | 1.51 | 0.000000000000000
 | | 1.52 |
 | | 5.0 | 6.285049438476562
 | | 2.0 | 3.000000000000000
 | | 2.8 |
 | | 1.0 |
 | | 1.5 | 2.000000000000000
 | | 2.5 |

Comparison table for nonlinear equations.

Examples | NM | Method ( | Method (
---|---|---|---
 | 9 | 5 | 5
 | Fails | 5 | 5
 | Converges to undesired root | 7 | 5
 | Converges to undesired root | 7 | 5
 | Converges to undesired root (9.424696922302246) | 6 | 4
 | 3 | 5 | 3
 | Divergent | 2 | 2
 | 11 | 11 | 11
 | Fails | 1 | 1
 | 14 | 7 | 7
 | 6 | 7 | 6

Unconstrained optimization problems.

No. | Examples | Initial guesses | Optimum point
---|---|---|---
 | | 2 | 3
 | | 3.5 |
 | | 32 | 40.777259826660156
 | | 45 |
 | | 1 | 2.3542480166260
 | | 3 |
 | | −1 | 0.204481452703476
 | | 1 |
 | | 0.5 | 0.860541462898254
 | | 2.0 |

Comparison table for unconstrained optimization problems.

Examples | NM | Method ( | Method ( | Optimum value
---|---|---|---|---
 | Fails | 1 | 1 | −8
 | 4 | 5 | 4 |
 | 4 | 15 | 13 | 3.599765349958602
 | 4 | 10 | 8 |
 | 4 | 6 | 4 | −0.580237420585759
 | 3 | 5 | 4 |
 | 4 | 6 | 5 | 1.1014507066670358
 | 4 | 5 | 4 |
 | 5 | 5 | 5 | 15.804002928482971
 | 4 | 6 | 5 |

In the following problems, we are to find the root of equations in the given interval

This study has presented several second-order iterative formulae for solving scalar nonlinear equations and unconstrained optimization problems. The numerical examples considered in Table

The authors would like to thank the reviewers and the academic editor for many valuable comments and suggestions.