Mean-Square Exponential Stability Analysis of Stochastic Neural Networks with Time-Varying Delays via Fixed Point Method

This work addresses the stability analysis of stochastic cellular neural networks with time-varying delays. By applying the fixed point theory as a new research technique, we obtain new and concise sufficient conditions ensuring the existence and uniqueness of the solution as well as its mean-square global exponential stability. The presented algebraic stability criteria are easily checked and do not require the differentiability of delays. The paper ends with an example demonstrating the effectiveness of the obtained results.


Introduction
Cellular neural networks (CNNs), first proposed by Chua and Yang in 1988 [1,2], have become a research focus owing to their numerous successful applications in fields such as optimization, linear and nonlinear programming, associative memory, pattern recognition, and computer vision. Taking into account the finite switching speed of amplifiers in the implementation of neural networks, time delays are inevitable, and an important new model, delayed cellular neural networks (DCNNs), was therefore put forward.
On the other hand, it is noteworthy that, besides delay effects, stochastic, impulsive, and diffusion effects are also likely to exist in neural networks. Up to now, there has been a large body of work [3-12] on the dynamic behaviors of complex CNNs such as impulsive delayed reaction-diffusion CNNs and stochastic delayed reaction-diffusion CNNs.
Referring to the current publications on complex CNNs, we note that Lyapunov theory has always been the primary method for stability analysis. However, many difficulties remain in applying the corresponding results to specific problems, so new methods are needed to resolve them.
Encouragingly, the fixed point theory has been successfully applied by Burton and other authors to investigate the stability of deterministic systems, and several valid conclusions have been presented; see, for example, the monograph [13] and the papers [14-25]. Furthermore, this idea has been developed to discuss the stability of stochastic (delayed) differential equations, turning out to be effective for the stability analysis of dynamical systems with delays and stochastic effects; see [26-32]. Specifically, in [27-29], Luo used the fixed point theory to study the exponential stability of mild solutions for stochastic partial differential equations with bounded delays and with infinite delays. In [30, 31, 33-35], Sakthivel et al. used the fixed point theory to investigate the asymptotic stability in $p$th moment of mild solutions to nonlinear impulsive stochastic partial differential equations with bounded delays and with infinite delays. In [32], Luo used the fixed point theory to study the exponential stability of stochastic Volterra-Levin equations.
The motivation of this paper is to discuss the feasibility of using the fixed point theory to tackle the stability analysis of complex CNNs, thereby enlarging the applications of the fixed point theory and enriching the stability theory of complex CNNs. In detail, via the Banach contraction mapping principle, this paper studies the mean-square global exponential stability of stochastic cellular neural networks with time-varying delays.

Preliminaries
Let $\{\Omega, \mathcal{F}, P\}$ be a complete probability space equipped with a filtration $\{\mathcal{F}_t\}_{t \ge 0}$ satisfying the usual conditions; that is, the filtration is right continuous and $\mathcal{F}_0$ contains all $P$-null sets. Let $\{w(t), t \ge 0\}$ denote a standard Brownian motion defined on $\{\Omega, \mathcal{F}, P\}$. $\mathbb{R}^n$ stands for the $n$-dimensional Euclidean space and $\|\cdot\|$ represents the Euclidean norm. $\mathcal{N} \triangleq \{1, 2, \ldots, n\}$ and $\mathbb{R}_+ = [0, \infty)$. $C(X, Y)$ denotes the space of continuous mappings from the topological space $X$ to the topological space $Y$.
The consideration of this paper is based on the following fixed point theorem.

Theorem 5 (see [37]). Let $\Upsilon$ be a contraction operator on a complete metric space $\Theta$; then there exists a unique point $\vartheta \in \Theta$ for which $\Upsilon(\vartheta) = \vartheta$.
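To illustrate Theorem 5 concretely, the sketch below iterates a contraction map and converges to its unique fixed point. The map (cosine on $[0,1]$) and the helper `fixed_point` are illustrative choices of ours, not taken from the paper.

```python
import math

def fixed_point(op, x0, tol=1e-12, max_iter=10_000):
    """Iterate x_{k+1} = op(x_k) until successive iterates agree within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = op(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("iteration did not converge")

# cos maps [0, 1] into itself and |cos'(x)| = |sin x| <= sin 1 < 1 there,
# so cos is a contraction and the iteration converges to the unique
# fixed point x* solving cos(x*) = x*, from any starting point in [0, 1].
x_star = fixed_point(math.cos, 0.5)
assert abs(math.cos(x_star) - x_star) < 1e-10
```

The same Picard-type iteration underlies the existence-and-uniqueness argument in the paper: the solution operator is shown to be a contraction on a complete metric space, so its unique fixed point is the solution.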

Main Results
In this section, we discuss, by means of the contraction mapping principle stated in Theorem 5, the existence and uniqueness as well as the global exponential stability of the solution to (1)-(2) in the mean-square sense. Before proceeding, we introduce some assumptions as follows.
Step 2. We need to prove that the operator is contractive. For $x = (x_1(t), \ldots, x_n(t)) \in \mathcal{H}$ and $y = (y_1(t), \ldots, y_n(t)) \in \mathcal{H}$, successive estimates of the difference of their images yield a contraction bound of sum-of-square-roots type. Since this sum of square roots is strictly less than 1, the operator is a contraction mapping, and hence there exists a unique fixed point $x(\cdot)$ of the operator in $\mathcal{H}$, which means $x(\cdot)$ is the solution of (1)-(2).

Proof. Lemma 8 is a direct corollary of Theorem 6, obtained by choosing the two parameters as 1/3 and 1/2.

Remark 9. The obtained algebraic stability criteria are easily checked and do not even require the differentiability of delays, let alone the monotone decrease of delays, which is required in some related works.
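A criterion of this sum-of-square-roots type is easy to verify numerically once the constants have been assembled from the network parameters. The helper below is a hypothetical sketch of ours; the list `q` stands for those per-neuron constants (the name is not the paper's).

```python
import math

def is_contraction(q):
    """Check a sum-of-square-roots contraction condition:
    sum_i sqrt(q_i) < 1, where each q_i >= 0 is a constant
    assembled from the network parameters (hypothetical name)."""
    if any(qi < 0 for qi in q):
        raise ValueError("constants must be nonnegative")
    return sum(math.sqrt(qi) for qi in q) < 1.0

# With two neurons and equal constants q, the condition reads
# 2*sqrt(q) < 1, i.e. q < 1/4, matching the two-dimensional example below.
assert is_contraction([0.04, 0.04])      # 2*sqrt(0.04) = 0.4 < 1
assert not is_contraction([0.3, 0.3])    # 2*sqrt(0.3) ~ 1.095 >= 1
```

Because the criterion is a plain algebraic inequality in the network constants, no delay derivative information enters the check, which is the point of Remark 9.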

Example
Consider the following two-dimensional stochastic cellular neural network with time-varying delays, whose coefficients yield a sum of two square-root terms strictly less than 1. From Lemma 8, we conclude that this two-dimensional stochastic cellular neural network with time-varying delays is mean-square globally exponentially stable.

Conclusions
The main contribution of this work is to confirm the feasibility of utilizing the fixed point theory to address the stability analysis of complex CNNs, thereby enlarging the applications of the fixed point theory and enriching the stability theory of complex CNNs. Specifically, by the Banach contraction mapping principle, with no need for Lyapunov functions, we prove the existence and uniqueness as well as the global exponential stability of the solution to stochastic delayed neural networks simultaneously, which the Lyapunov method cannot do. The derived algebraic stability criteria are novel, easily checked, and do not require the differentiability of delays. As is well known, the fixed point theory has various forms, for example, Krasnoselskii's fixed point theorem. Considering that many mathematical models can be decomposed into a linear part and nonlinear parts, our future work will explore the application of Krasnoselskii's fixed point theorem to the stability analysis of complex CNNs.
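For readability, the network described below can be written in the standard stochastic DCNN form; the following display is a reconstruction consistent with the stated terms, not the paper's verbatim equation:

$$
dx_i(t) = \Big[ -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j\big(x_j(t)\big) + \sum_{j=1}^{n} b_{ij} g_j\big(x_j(t - \tau_j(t))\big) \Big]\, dt + \sigma_i\big(t, x_i(t), x_i(t - \tau_i(t))\big)\, dw(t), \quad t \ge 0,
$$

with initial condition $x_i(s) = \varphi_i(s)$ for $s \in [-\tau, 0]$, $i \in \mathcal{N}$.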
where $i \in \mathcal{N}$ and $n$ is the number of neurons in the neural network. $x_i(t)$ corresponds to the state of the $i$th neuron at time $t$. $f_j(\cdot), g_j(\cdot) \in C(\mathbb{R}, \mathbb{R})$; moreover, $f_j(x_j(t))$ is the activation function of the $j$th neuron at time $t$ and $g_j(x_j(t - \tau_j(t)))$ is the activation function of the $j$th neuron at time $t - \tau_j(t)$. The constant $a_{ij}$ represents the connection weight of the $j$th neuron on the $i$th neuron at time $t$, and the constant $c_i > 0$ represents the rate with which the $i$th neuron resets its potential to the resting state when disconnected from the network and external inputs. The constant $b_{ij}$ represents the connection strength of the $j$th neuron on the $i$th neuron at time $t - \tau_j(t)$, where $\tau_j(t)$ corresponds to the transmission delay along the axon of the $j$th neuron and satisfies $0 \le \tau_j(t) \le \tau$. $\sigma_i(t, x_i(t), x_i(t - \tau_i(t))) \in C(\mathbb{R}_+ \times \mathbb{R} \times \mathbb{R}, \mathbb{R})$ denotes the diffusion coefficient. $x(t) = (x_1(t), \ldots, x_n(t))^T \in \mathbb{R}^n$ and $\varphi_i(t) \in C([-\tau, 0], \mathbb{R})$.
The main idea of this proof is based on the fixed point theory rather than the Lyapunov method. By using the Banach contraction mapping principle, with no need for Lyapunov functions, we simultaneously establish the existence and uniqueness as well as the global exponential stability of the solution to (1)-(2) in the mean-square sense, which the Lyapunov method fails to do.