On a New Faster Implicit Fixed Point Iterative Scheme in Convex Metric Spaces

The purpose of this paper is to introduce a new implicit iteration and study its strong convergence, stability, and data dependence. It is shown through numerical examples that the newly introduced iteration converges faster than the well-known implicit Mann and implicit Ishikawa iterations, and that implicit iterations converge faster than the corresponding explicit iterations. Applications of implicit iterations to the analysis of recurrent neural networks (RNNs) are also presented.


Introduction
In recent years, numerous papers have been published on explicit iterations in various spaces [1][2][3][4][5][6], but there are very few works on implicit iterations (regarding convergence rate and data dependence) [7][8][9][10][11][12][13]. Implicit iterations have an advantage over explicit iterations for nonlinear problems, as they provide better approximations of fixed points and are widely used in applications where explicit iterations are inefficient. Approximating fixed points with implicit iterations in computer-oriented programs can reduce the computational cost of fixed point problems. The study of the stability of iterations enjoys a celebrated place in applied sciences and engineering due to the chaotic behavior of functions in discrete dynamics and other numerical computations. Data dependence of fixed points is a related and newer issue which has been studied by many authors; see [4, 14] and the references therein. In computational mathematics, it is of theoretical and practical importance to compare the convergence rates of iterations and to find out, if possible, which of them converges more rapidly to the fixed point. Recent works in this direction are [1, 3, 4, 15-17]. Concretely, a fixed point iteration is valuable from a numerical point of view and useful for applications if it satisfies the following requirements: (a) it converges to a fixed point of an operator; (b) it is T-stable; (c) it is faster than other iterations existing in the literature; (d) it admits data dependence results.
Motivated by the fact that three-step iterations give better approximations than one-step and two-step iterations [18], we define a new and more general three-step implicit iteration (IN) that satisfies the above requirements.

For x_0 ∈ C, the explicit Noor iteration in a convex metric space (X, d) with convex structure W is defined by

x_{n+1} = W(x_n, T y_n, α_n),
y_n = W(x_n, T z_n, β_n),
z_n = W(x_n, T x_n, γ_n), n ≥ 0. (N)

Putting γ_n = 1 in (N), we get the well-known Ishikawa iteration [20, 21] in convex metric spaces:

x_{n+1} = W(x_n, T y_n, α_n), y_n = W(x_n, T x_n, β_n), n ≥ 0.

The new implicit iteration is defined, for x_0 ∈ C, by

x_n = W(x_{n-1}, T y_n, α_n),
y_n = W(x_n, T z_n, β_n),
z_n = W(x_n, T x_n, γ_n), n ≥ 1. (IN)

The equivalent form of iteration (IN) in a linear space can be written as

x_n = α_n x_{n-1} + (1 − α_n) T y_n,
y_n = β_n x_n + (1 − β_n) T z_n,
z_n = γ_n x_n + (1 − γ_n) T x_n, n ≥ 1.

Putting γ_n = 1 in (IN), we get the well-known implicit Ishikawa iteration [23]:

x_n = W(x_{n-1}, T y_n, α_n), y_n = W(x_n, T x_n, β_n), n ≥ 1. (II)

Putting β_n = γ_n = 1 in (IN), we get the well-known implicit Mann iteration [2, 6, 11, 24]:

x_n = W(x_{n-1}, T x_n, α_n), n ≥ 1. (IM)

Zamfirescu operators [25] are among the most general contractive-type operators studied by several authors; T is a Zamfirescu operator if, for each pair of points x, y in X, at least one of the following is true:

(z1) d(Tx, Ty) ≤ a d(x, y);
(z2) d(Tx, Ty) ≤ b [d(x, Tx) + d(y, Ty)];
(z3) d(Tx, Ty) ≤ c [d(x, Ty) + d(y, Tx)];

where a, b, and c are nonnegative constants satisfying 0 ≤ a < 1 and 0 ≤ b, c < 1/2.
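To make the schemes concrete, here is a small numerical sketch (ours, not from the paper) comparing the explicit and implicit Mann iterations on the hypothetical test map T(x) = cos x on [0, 2], whose unique fixed point is p ≈ 0.739085; the scalar implicit step is resolved by bisection.

```python
import math

def T(x):
    # Hypothetical test operator (a contraction on [0, 2]); not from the paper.
    return math.cos(x)

def explicit_mann(x0, alpha, steps):
    # Explicit Mann: x_{n+1} = alpha*x_n + (1 - alpha)*T(x_n)
    x = x0
    for _ in range(steps):
        x = alpha * x + (1 - alpha) * T(x)
    return x

def implicit_mann(x0, alpha, steps):
    # Implicit Mann: x_n solves x = alpha*x_{n-1} + (1 - alpha)*T(x).
    # The scalar implicit equation is monotone in x, so bisection on [0, 2] works.
    x = x0
    for _ in range(steps):
        lo, hi = 0.0, 2.0
        for _ in range(80):  # h(m) = m - alpha*x_prev - (1-alpha)*T(m)
            mid = (lo + hi) / 2
            if mid - alpha * x - (1 - alpha) * T(mid) < 0:
                lo = mid
            else:
                hi = mid
        x = (lo + hi) / 2
    return x

P = 0.7390851332151607  # fixed point of cos x
x_exp = explicit_mann(1.0, 0.1, 30)
x_imp = implicit_mann(1.0, 0.1, 30)
```

For this map with α_n ≡ 0.1 the implicit step damps the error far more strongly per iteration than the explicit step, in the spirit of the comparison results proved below; the map and the parameter value are our illustrative choices.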
Also, we use the following definitions and lemmas to achieve our main results.
Definition 1 (see [23]). A map W : X × X × [0, 1] → X is a convex structure on a metric space (X, d) if, for all x, y, u ∈ X and λ ∈ [0, 1], d(u, W(x, y, λ)) ≤ λ d(u, x) + (1 − λ) d(u, y); a metric space equipped with a convex structure is called a convex metric space. All normed spaces and their convex subsets are examples of convex metric spaces, but there are many examples of convex metric spaces which cannot be embedded in any normed space (see [23, 28]). Several authors have since extended this concept in many ways; one such convex structure is the hyperbolic space introduced by Kohlenbach [29].
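As a quick sanity check (ours, not from the paper), the defining inequality of a convex structure can be verified numerically for the standard choice W(x, y, λ) = λx + (1 − λ)y on the Euclidean plane; the sample points and λ-values below are arbitrary.

```python
import itertools
import math

def d(p, q):
    # Euclidean metric on R^2.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def W(x, y, lam):
    # Standard convex structure on a normed space: the convex combination.
    return (lam * x[0] + (1 - lam) * y[0], lam * x[1] + (1 - lam) * y[1])

pts = [(0.0, 0.0), (1.0, 2.0), (-3.0, 0.5), (2.0, -1.0)]
lams = [0.0, 0.25, 0.5, 0.75, 1.0]

# Check d(u, W(x, y, lam)) <= lam*d(u, x) + (1 - lam)*d(u, y) on all samples.
ok = all(
    d(u, W(x, y, lam)) <= lam * d(u, x) + (1 - lam) * d(u, y) + 1e-12
    for x, y, u in itertools.product(pts, repeat=3)
    for lam in lams
)
```

The inequality holds exactly here because the Euclidean norm is convex; the small tolerance only absorbs floating-point rounding.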
The stability of explicit as well as implicit iterations has been studied extensively by various authors [4, 7, 21, 27, 30-32] due to its increasing importance in computational mathematics, especially in view of the revolution in computer programming. The concept of T-stability in the convex metric space setting was given by Olatinwo [21].
Let {x_n}_{n=0}^∞ ⊂ X be the sequence generated by an iterative scheme involving T defined by

x_{n+1} = f(T, x_n), n ≥ 0, (9)

where x_0 ∈ X is the initial approximation. Let {y_n}_{n=0}^∞ ⊂ X be an arbitrary sequence and set ε_n = d(y_{n+1}, f(T, y_n)). Then the iteration (9) is said to be T-stable, or stable with respect to T, if and only if lim_{n→∞} ε_n = 0 implies lim_{n→∞} y_n = p, where p is the fixed point of T.

Lemma 4 (see [4, 15]). If δ is a real number such that 0 ≤ δ < 1 and {ε_n}_{n=0}^∞ is a sequence of positive numbers such that lim_{n→∞} ε_n = 0, then for any sequence of positive numbers {u_n}_{n=0}^∞ satisfying u_{n+1} ≤ δ u_n + ε_n, n ≥ 0, one has lim_{n→∞} u_n = 0.
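The stability notion can be illustrated numerically (our toy example, not from the paper): take the hypothetical affine contraction T(x) = x/2 + 1 with fixed point p = 2 and the Mann-type scheme f(T, x) = αx + (1 − α)T(x); perturbing each step by ε_n = 1/n² → 0 still yields a sequence approaching p, as T-stability requires.

```python
def T(x):
    # Hypothetical affine contraction; fixed point p = 2.
    return x / 2 + 1

def f(x, alpha=0.5):
    # Explicit Mann step: f(T, x) = alpha*x + (1 - alpha)*T(x).
    return alpha * x + (1 - alpha) * T(x)

y = 0.0
for n in range(1, 500):
    # Perturbed sequence: y_{n+1} = f(T, y_n) + eps_n with eps_n = 1/n^2 -> 0.
    y = f(y) + 1.0 / n**2

residual = abs(y - 2.0)  # distance from the fixed point p = 2
```

Since ε_n → 0 and the underlying step is a contraction (factor 3/4 here), the perturbed sequence is dragged back to p; the residual shrinks like the tail of the perturbations.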
Definition 5 (see [15]). Suppose {a_n} and {b_n} are two real convergent sequences with limits a and b, respectively. Then {a_n} is said to converge faster than {b_n} if lim_{n→∞} |a_n − a| / |b_n − b| = 0.

Definition 6 (see [15]). Let {u_n} and {v_n} be two fixed point iterations that converge to the same fixed point p on a normed space X such that the error estimates ‖u_n − p‖ ≤ a_n and ‖v_n − p‖ ≤ b_n are available, where {a_n} and {b_n} are two sequences of positive numbers (converging to zero). If {a_n} converges faster than {b_n}, then one says that {u_n} converges faster than {v_n} to p.
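Definition 5 can be checked on a toy pair of sequences (our illustration): a_n = 2^{-n} converges faster to 0 than b_n = 1/(n+1), since the ratio |a_n|/|b_n| = (n+1)/2^n tends to 0.

```python
# Ratio |a_n - 0| / |b_n - 0| for a_n = 2^-n, b_n = 1/(n+1), n = 1..59.
ratios = [(0.5 ** n) / (1.0 / (n + 1)) for n in range(1, 60)]

# (n+1)/2^n decreases monotonically to 0, witnessing Definition 5.
final_ratio = ratios[-1]
```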
Lemma 8 (see [4, 14]). Let {a_n}_{n=0}^∞ be a nonnegative sequence for which there exists n_0 ∈ ℕ such that, for all n ≥ n_0, one has the inequality a_{n+1} ≤ (1 − μ_n) a_n + μ_n η_n, where μ_n ∈ (0, 1) for all n ∈ ℕ, ∑_{n=1}^∞ μ_n = ∞, and η_n ≥ 0 for all n ∈ ℕ. Then 0 ≤ lim sup_{n→∞} a_n ≤ lim sup_{n→∞} η_n.

Having introduced the implicit Noor iteration (IN), we use it to prove results concerning convergence, stability, and convergence rate under the contractive condition (7) in convex metric spaces. A data dependence result for the same iteration is proved in hyperbolic spaces. Moreover, applications of implicit iterations to RNN analysis are discussed in the last section.
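The flavor of Lemma 8 and of the data dependence results can be seen numerically with two hypothetical affine contractions that differ uniformly by ε (our example, not from the paper): the fixed points, both computed with the implicit Mann iteration, differ by O(ε).

```python
EPS = 1e-3  # uniform distance between the operator and its approximation

def T(x):
    # Hypothetical contraction with fixed point p = 0.
    return x / 2

def T_tilde(x):
    # Approximate operator: sup |T(x) - T_tilde(x)| = EPS; fixed point 2*EPS.
    return x / 2 + EPS

def implicit_mann(op, x0, alpha, steps):
    # Implicit Mann x_n = alpha*x_{n-1} + (1-alpha)*op(x_n); since op is
    # affine with slope 1/2, the implicit step has the closed form below.
    x = x0
    for _ in range(steps):
        x = (alpha * x + (1 - alpha) * op(0.0)) / (1 - (1 - alpha) * 0.5)
    return x

p = implicit_mann(T, 1.0, 0.5, 80)
p_tilde = implicit_mann(T_tilde, 1.0, 0.5, 80)
gap = abs(p - p_tilde)  # distance between the two approximated fixed points
```

Here the gap is exactly 2ε, comfortably within an O(ε/(1 − δ)) band for the contraction constant δ = 1/2, which is the kind of closeness Lemma 8 is used to certify.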

Convergence and Stability Results of New Implicit Iteration in Convex Metric Spaces
Theorem 9. Let C be a nonempty closed convex subset of a convex metric space X and let T : C → C be a quasi-contractive operator satisfying (7) with F(T) ≠ ∅. Then, for x_0 ∈ C, the sequence {x_n} defined by (IN), with ∑(1 − α_n) = ∞, converges to the fixed point of T.
Therefore, the iteration (IN) is T-stable.
Remark 11. As the contractive condition (7) is more general than conditions (2)-(6), the convergence and stability results for the implicit iteration (IN) under the contractive conditions (2)-(6) can be obtained as special cases.
Remark 12. As the implicit Mann iteration (IM) and the implicit Ishikawa iteration (II) are special cases of the implicit iteration (IN), results similar to Theorems 9 and 10 hold for these iterations as well.

Convergence Rate of Implicit Iterations
Theorem 13. Let C be a nonempty closed convex subset of a convex metric space X and let T : C → C be a quasi-contractive operator satisfying (7) with F(T) ≠ ∅. Then, for x_0 ∈ C, the sequence {x_n} defined by (IN), with ∑(1 − α_n) = ∞ and α_n ≤ α < 1, converges faster than the implicit Mann (IM) and implicit Ishikawa (II) iterations to the fixed point of T. Moreover, the implicit iterations converge faster than the corresponding explicit iterations.
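The ordering asserted in Theorem 13 can be observed numerically (our illustration; the test map T(x) = x/2 and all parameters are hypothetical): with α_n = β_n = γ_n = 0.5, the per-step contraction factors of the implicit Mann, Ishikawa, and Noor schemes work out to about 0.667, 0.615, and 0.604, so the three-step scheme wins.

```python
def T(x):
    # Hypothetical contraction with fixed point p = 0.
    return x / 2

def implicit_step(x_prev, scheme, a=0.5, b=0.5, c=0.5, inner=200):
    # Solve the implicit equation x_n = a*x_{n-1} + (1-a)*T(y_n) by an inner
    # fixed-point loop; the right-hand side is itself a contraction in x_n.
    x = x_prev
    for _ in range(inner):
        if scheme == "IM":      # implicit Mann: y_n = x_n
            y = x
        elif scheme == "II":    # implicit Ishikawa: y_n = b*x_n + (1-b)*T(x_n)
            y = b * x + (1 - b) * T(x)
        else:                   # "IN": z_n first, then y_n
            z = c * x + (1 - c) * T(x)
            y = b * x + (1 - b) * T(z)
        x = a * x_prev + (1 - a) * T(y)
    return x

def run(scheme, x0=1.0, steps=15):
    x = x0
    for _ in range(steps):
        x = implicit_step(x, scheme)
    return abs(x)  # error: the fixed point is 0

e_im, e_ii, e_in = run("IM"), run("II"), run("IN")
```

After 15 steps the errors line up as e_in < e_ii < e_im, matching the convergence-rate comparison of Table 1 in spirit (our parameters, not the paper's table data).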

Data Dependence of Implicit Iterations in Hyperbolic Spaces
Theorem 15. Let T : X → X be a mapping satisfying (7).

Implicit Iterations in RNN (Recurrent Neural Networks)
Analysis. Neural networks are a class of nonlinear function approximators, and a stable state is achieved in recurrent autoassociative neural networks by iteration. Here we analyze the convergence speed of implicit iterations in a recurrent network; several important results are studied for decreasing and increasing functions. The results obtained have multifaceted real-life applications and, in particular, can be helpful for designing the inner-product kernel of a support vector machine with a faster convergence rate. (For details about RNNs and SVMs, please refer to [33].)
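As a minimal sketch of the RNN connection (our construction, not the paper's network): the stable state of a single recurrent unit s = tanh(w·s + b) is the fixed point of a contraction whenever |w| < 1, and it can be approximated with an implicit Mann step resolved by bisection; the weight and bias below are arbitrary.

```python
import math

W_REC, BIAS = 0.5, 0.3  # hypothetical recurrent weight (|w| < 1) and bias

def activation(s):
    # One recurrent autoassociative update; a contraction since |W_REC| < 1.
    return math.tanh(W_REC * s + BIAS)

def implicit_mann_state(s0, alpha=0.5, steps=40):
    # s_n solves s = alpha*s_{n-1} + (1 - alpha)*tanh(W_REC*s + BIAS);
    # the scalar equation is monotone in s, so bisection on [-1, 1] works
    # (tanh maps into (-1, 1)).
    s = s0
    for _ in range(steps):
        lo, hi = -1.0, 1.0
        for _ in range(80):
            mid = (lo + hi) / 2
            if mid - alpha * s - (1 - alpha) * activation(mid) < 0:
                lo = mid
            else:
                hi = mid
        s = (lo + hi) / 2
    return s

s_star = implicit_mann_state(0.0)
residual = abs(s_star - activation(s_star))  # how "stable" the state is
```

The residual measures how close the computed state is to a genuine stable state of the unit; driving it to zero is exactly the fixed point problem the implicit iterations of this paper address.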

Table 1 :
Convergence rate comparison of various iterations.

Table 2 :
Convergence of implicit iterations to the fixed point of a decreasing function.

Table 3 :
Convergence of implicit iterations to the fixed point of an increasing function.

Remark 16. Putting β_n = γ_n = 1 in (IN) and (62), the data dependence result for the implicit Mann (IM) iteration can be proved along the same lines as in Theorem 15.

Remark 17. Putting γ_n = 1 in (IN) and (62), the data dependence result for the implicit Ishikawa (II) iteration can be proved along the same lines as in Theorem 15.