Exponential Stability and Numerical Methods of Stochastic Recurrent Neural Networks with Delays

Abstract and Applied Analysis


Introduction
It is well known that neural networks have a wide range of applications in many fields, such as signal processing, pattern recognition, associative memory, and optimization problems. Stability is one of the main properties of neural networks and a precondition in their design and application. Time delays are unavoidable in neural network systems and are frequently an important source of poor performance or instability. Thus, the stability analysis of neural networks with various delays has been extensively investigated; see [1–10].
In real nervous systems, synaptic transmission is a noisy process brought on by random fluctuations in the release of neurotransmitters and other probabilistic causes [11]. Hence, noise should be taken into consideration in modeling. Recently, some sufficient conditions for the exponential stability of stochastic delay neural networks have been presented in [12–18]. As with general stochastic delay differential equations, most stochastic delay neural networks do not have explicit solutions. Most existing research on the stability of the equilibrium point has focused on constructing an appropriate Lyapunov function or functional; however, there is no generally effective method for finding one. It is therefore very useful to establish numerical methods for studying the properties of stochastic delay neural networks. Many papers are concerned with the stability of numerical solutions of stochastic delay differential equations ([19–29] and the references therein), but there has been little literature on the exponential stability of numerical methods for stochastic delay neural networks. To the best of the authors' knowledge, only [30–32] studied the exponential stability of numerical methods for stochastic delay Hopfield neural networks. The stability of numerical methods for stochastic delay recurrent neural networks remains open, which motivates this paper. The main aim of the paper is to investigate the mean-square stability (MS-stability) of the Euler-Maruyama (EM) method and the split-step backward Euler (SSBE) method for stochastic delay recurrent neural networks.
The remainder of the paper is organized as follows. Some notations and the conditions for the stability of the analytical solution are given in Section 2. The MS-stability of the EM method and the SSBE method is proved in Sections 3 and 4, respectively. In Section 5, an example is provided to illustrate the effectiveness of our theory.

Model Description and Analysis of Analytical Solution
Throughout the paper, unless otherwise specified, we will employ the following notations. Let (Ω, F, {F_t}_{t≥0}, P) be a complete probability space with a filtration {F_t}_{t≥0} satisfying the usual conditions (i.e., it is increasing and right continuous, and F_0 contains all P-null sets), and let E[·] be the expectation operator with respect to the probability measure. Let | · | denote the Euclidean norm of a vector or the spectral norm of a matrix. Let τ > 0 and let C([−τ, 0]; ℝ^n) denote the family of continuous functions φ from [−τ, 0] to ℝ^n with the norm ‖φ‖ = sup{|φ(θ)| : −τ ≤ θ ≤ 0}. Denote by C^b_{F_0}([−τ, 0]; ℝ^n) the family of all bounded, F_0-measurable, C([−τ, 0]; ℝ^n)-valued random variables. We assume w(t) to be a standard Brownian motion defined on the probability space.
To obtain our results, we impose the following standing hypotheses.
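The model equation (1) is not reproduced in this extract. For orientation, a typical stochastic delayed recurrent neural network of the kind studied in [12, 17, 18] takes the following form; this is an assumed illustrative form, not necessarily the exact system (1):

```latex
\mathrm{d}x(t) = \bigl[-C x(t) + A f\bigl(x(t)\bigr) + B g\bigl(x(t-\tau)\bigr)\bigr]\,\mathrm{d}t
  + \sigma\bigl(x(t), x(t-\tau)\bigr)\,\mathrm{d}w(t), \qquad t \ge 0,
```

where x(t) ∈ ℝ^n is the state, C = diag(c_1, …, c_n) with c_i > 0, A and B are the connection weight matrices, f and g are activation functions, σ is the diffusion coefficient, and the initial data x(θ) = ξ(θ), θ ∈ [−τ, 0], belong to C^b_{F_0}([−τ, 0]; ℝ^n).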
Definition 1. The trivial solution of system (1) or system (2) is said to be exponentially stable in mean square if there exists a pair of positive constants λ and M such that

E|x(t; ξ)|² ≤ M‖ξ‖² e^{−λt},  t ≥ 0,

holds for any initial data ξ. In this case lim sup_{t→∞} (1/t) log(E|x(t; ξ)|²) ≤ −λ.

Using Itô's formula and the nonnegative semimartingale convergence theorem, [12, 14] discussed the exponential stability of stochastic delayed neural networks. Employing the method of variation of parameters and inequality techniques, several sufficient conditions ensuring the pth moment exponential stability of stochastic delayed recurrent neural networks were derived in [17]. With the help of a Lyapunov function and a Halanay-type inequality, a set of novel sufficient conditions for the mean-square exponential stability of stochastic recurrent neural networks with time-varying delays was established in [18]. In this paper, we will give a new sufficient condition guaranteeing the exponential stability in mean square of the stochastic delayed recurrent neural network (1) by using Itô's formula and inequality techniques.
Then (1) is exponentially stable in mean square.
Proof. By (H4), there exists a sufficiently small positive constant λ such that the corresponding stability inequality holds. Set V(t, x) = e^{λt}|x|²; applying Itô's formula to V(t, x) along (2), we obtain an estimate for EV(t, x(t)). Noticing that the expectation of the Itô integral term is zero, we obtain from the previous inequality a bound of the form E|x(t)|² ≤ M‖ξ‖² e^{−λt}, which implies lim sup_{t→∞} (1/t) log(E|x(t)|²) ≤ −λ. Then (1) is exponentially stable in mean square.
Now we analyze the stability of the EM numerical solution.
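As a minimal sketch of the EM scheme, assuming a model of the form dx = [−Cx + Af(x) + Bg(x(t−τ))]dt + σ(x(t), x(t−τ))dw with diagonal noise (which may differ in detail from system (1)), the method with step size h = τ/m replaces the delayed argument by the iterate m steps back; all names below are illustrative:

```python
import numpy as np

def em_sdnn(C, A, B, f, g, sigma, xi, tau, h, T, rng):
    """Euler-Maruyama scheme (an illustrative sketch, not the paper's
    exact formulation) for the stochastic delay recurrent neural network
    dx = [-C x + A f(x) + B g(x(t - tau))] dt + sigma(x, x_tau) dw
    with diagonal noise.  h must divide tau: h = tau / m."""
    m = int(round(tau / h))            # delay expressed in steps
    n_steps = int(round(T / h))
    d = C.shape[0]
    # X[k] approximates x((k - m) * h); indices 0..m hold the history
    X = np.zeros((n_steps + m + 1, d))
    for k in range(m + 1):
        X[k] = xi(k * h - tau)         # initial function on [-tau, 0]
    for n in range(m, n_steps + m):
        x, x_tau = X[n], X[n - m]
        drift = -C @ x + A @ f(x) + B @ g(x_tau)
        dW = rng.normal(0.0, np.sqrt(h), size=d)   # Brownian increment
        X[n + 1] = x + drift * h + sigma(x, x_tau) * dW
    return X[m:]                        # approximation on [0, T]
```

Averaging |X_n|² over many independent paths gives a Monte Carlo estimate of E|x(t_n)|², whose decay can be checked against the MS-stability step-size restriction.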
The proof is similar to that of Theorem 7 in [20].

Stability of SSBE Numerical Solution
In this section, we will construct the SSBE scheme for (1) and analyze the stability of the numerical solution. The adaptation of the SSBE method to (1) leads to a numerical process of the following type: the notations are the same as those defined in (16). Now we present another main result of this paper.
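Since scheme (16) is not reproduced in this extract, the sketch below uses one common SSBE formulation, under the same assumed model form as before: each step first solves an implicit drift equation y = x_n + h·F(y, x_{n−m}) (here by fixed-point iteration, which converges for h small enough), then adds the diffusion increment explicitly. All names are illustrative:

```python
import numpy as np

def ssbe_sdnn(C, A, B, f, g, sigma, xi, tau, h, T, rng, iters=50):
    """Split-step backward Euler (one common form; the paper's scheme (16)
    may differ in detail) for
    dx = [-C x + A f(x) + B g(x(t - tau))] dt + sigma(x, x_tau) dw."""
    m = int(round(tau / h))            # delay expressed in steps
    n_steps = int(round(T / h))
    d = C.shape[0]
    X = np.zeros((n_steps + m + 1, d))
    for k in range(m + 1):
        X[k] = xi(k * h - tau)         # initial function on [-tau, 0]
    for n in range(m, n_steps + m):
        x, x_tau = X[n], X[n - m]
        # implicit drift step: solve y = x + h*[-C y + A f(y) + B g(x_tau)]
        # by fixed-point iteration (contractive when h*(|C| + |A| L_f) < 1)
        y = x.copy()
        for _ in range(iters):
            y = x + h * (-C @ y + A @ f(y) + B @ g(x_tau))
        # explicit diffusion step with diagonal noise
        dW = rng.normal(0.0, np.sqrt(h), size=d)
        X[n + 1] = y + sigma(y, x_tau) * dW
    return X[m:]                        # approximation on [0, T]
```

The implicit drift step is what typically lets the SSBE method tolerate larger step sizes than the EM method while remaining MS-stable.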

Conclusions
The model of a stochastic neural network can be viewed as a special kind of stochastic differential equation whose solution is hard to express explicitly. It not only has the characteristics of general stochastic differential equations but also has its own features: its stability is connected with the activation functions and the connection weight matrices. So it is necessary to discuss the stability of stochastic neural networks. Differently from previous works on the exponential stability of stochastic neural networks, both the Lyapunov function method and two numerical methods are used here to study the stability of stochastic delay recurrent neural networks. Under the conditions which guarantee the stability of the analytical solution, the EM method and the SSBE method are proved to be MS-stable if the step size satisfies a certain restriction. Other numerical methods for different types of stochastic delay neural networks can be analyzed in future work.