The exponential stability in mean square of stochastic delay recurrent neural networks is investigated in detail. By using Itô's formula and inequality techniques, sufficient conditions guaranteeing the exponential stability in mean square of an equilibrium are given. Under the conditions that guarantee the stability of the analytical solution, the Euler-Maruyama scheme and the split-step backward Euler scheme are proved to be mean-square stable. Finally, an example is given to demonstrate our results.
1. Introduction
It is well known that neural networks have a wide range of applications in many fields, such as signal processing, pattern recognition, associative memory, and optimization problems. Stability is one of the main properties of neural networks and a precondition in their design and application. Time delays are unavoidable in neural network systems and are frequently an important source of poor performance or instability. Thus, the stability analysis of neural networks with various delays has been extensively investigated; see [1–10].
In real nervous systems, synaptic transmission is a noisy process brought about by random fluctuations in the release of neurotransmitters and other probabilistic causes [11]. Hence, noise should be taken into consideration in modeling. Recently, some sufficient conditions for the exponential stability of stochastic delay neural networks have been presented in [12–18]. As with stochastic delay differential equations, most stochastic delay neural networks do not have explicit solutions. Most existing research on the stability of the equilibrium point relies on an appropriate Lyapunov function or functional; however, there is no generally effective method for finding such a function or functional. It is therefore very useful to establish numerical methods for studying the properties of stochastic delay neural networks. Many papers are concerned with the stability of numerical solutions of stochastic delay differential equations ([19–29] and the references therein), but the literature on the exponential stability of numerical methods for stochastic delay neural networks is scarce. To the best of the authors' knowledge, only [30–32] studied the exponential stability of numerical methods for stochastic delay Hopfield neural networks. The stability of numerical methods for stochastic delay recurrent neural networks remains open, which motivates this paper. The main aim of the paper is to investigate the mean-square stability (MS stability) of the Euler-Maruyama (EM) method and the split-step backward Euler (SSBE) method for stochastic delay recurrent neural networks.
The remainder of the paper is organized as follows. Some notation and the stability conditions for the analytical solution are given in Section 2. The MS stability of the EM method and of the SSBE method is proved in Sections 3 and 4, respectively. In Section 5, an example is provided to illustrate the effectiveness of our theory.
2. Model Description and Analysis of Analytical Solution
Throughout the paper, unless otherwise specified, we employ the following notation. Let $(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},\mathcal{P})$ be a complete probability space with a filtration $\{\mathcal{F}_t\}_{t\ge 0}$ satisfying the usual conditions (i.e., it is increasing and right continuous, and $\mathcal{F}_0$ contains all $\mathcal{P}$-null sets), and let $\mathbb{E}[\cdot]$ denote the expectation operator with respect to the probability measure. Let $|\cdot|$ denote the Euclidean norm of a vector or the spectral norm of a matrix. Let $\tau>0$ and let $C([-\tau,0];\mathbb{R}^n)$ denote the family of continuous functions $\varphi$ from $[-\tau,0]$ to $\mathbb{R}^n$ with the norm $\|\varphi\|=\sup\{|\varphi(\theta)|:-\tau\le\theta\le 0\}$. Denote by $\mathcal{C}^b_{\mathcal{F}_0}([-\tau,0];\mathbb{R}^n)$ the family of all bounded, $\mathcal{F}_0$-measurable, $C([-\tau,0];\mathbb{R}^n)$-valued random variables. We assume $W(t)$ to be a standard Brownian motion defined on this probability space.
Consider the stochastic delay recurrent neural networks of the form
$$
\begin{aligned}
dx_i(t)&=\Big[-c_ix_i(t)+\sum_{j=1}^na_{ij}f_j(x_j(t))+\sum_{j=1}^nb_{ij}g_j(x_j(t-\tau_j))\Big]dt+\sum_{j=1}^n\sigma_{ij}(t,x_j(t),x_j(t-\tau_j))\,dW_j(t),\quad t\ge 0,\\
x_i(t)&=\xi_i(t),\quad -\tau_i\le t\le 0.
\end{aligned}\tag{1}
$$
Model (1) can be rewritten in the following matrix-vector form:
$$
\begin{aligned}
dx(t)&=\big[-Cx(t)+Af(x(t))+Bg(x_\tau(t))\big]dt+\sigma(t,x(t),x_\tau(t))\,dW(t),\quad t\ge 0,\\
x(t)&=\xi(t),\quad -\bar\tau\le t\le 0,
\end{aligned}\tag{2}
$$
where $x(t)=(x_1(t),\dots,x_n(t))^T\in\mathbb{R}^n$ is the state vector associated with the neurons; $C=\operatorname{diag}(c_1,c_2,\dots,c_n)>0$ with $c_i>0$ represents the rate with which neuron $i$ resets its potential to the resting state in isolation when disconnected from the network and the external stochastic perturbation; $A=(a_{ij})_{n\times n}$ and $B=(b_{ij})_{n\times n}$ denote the connection weight matrix and the delayed connection weight matrix, respectively; $f_j$ and $g_j$ are activation functions, with $f(x(t))=(f_1(x_1(t)),f_2(x_2(t)),\dots,f_n(x_n(t)))^T\in\mathbb{R}^n$ and $g(x_\tau(t))=(g_1(x_1(t-\tau_1)),g_2(x_2(t-\tau_2)),\dots,g_n(x_n(t-\tau_n)))^T\in\mathbb{R}^n$, where $\tau_j>0$ is the transmission delay; $\bar\tau=\max_{1\le i\le n}\tau_i$ and $\xi(t)=(\xi_1(t),\dots,\xi_n(t))^T\in\mathcal{C}^b_{\mathcal{F}_0}([-\tau,0];\mathbb{R}^n)$. Moreover, $W(t)=(W_1(t),W_2(t),\dots,W_n(t))^T$ is an $n$-dimensional Brownian motion defined on the complete probability space $(\Omega,\mathcal{F},\{\mathcal{F}_t\}_{t\ge 0},\mathcal{P})$, and $\sigma:\mathbb{R}_+\times\mathbb{R}^n\times\mathbb{R}^n\to\mathbb{R}^{n\times n}$, $\sigma=(\sigma_{ij})_{n\times n}$, is the diffusion coefficient matrix.
To obtain our results, we impose the following standing hypotheses.
(H1) $f(0)\equiv 0$, $g(0)\equiv 0$, and $\sigma(t,0,0)\equiv 0$.
(H2) Both $f_i(x)$ and $g_i(x)$ satisfy the Lipschitz condition; that is, for each $i=1,2,\dots,n$ there exist constants $\alpha_i>0$ and $\beta_i>0$ such that
$$|f_i(x)-f_i(y)|\le\alpha_i|x-y|,\qquad |g_i(x)-g_i(y)|\le\beta_i|x-y|,\qquad\forall x,y\in\mathbb{R}. \tag{3}$$
(H3) $\sigma(t,x,y)$ satisfies the Lipschitz condition, and there are nonnegative constants $\mu_i$ and $\nu_i$ such that
$$\operatorname{trace}\big[\sigma^T(t,x,y)\sigma(t,x,y)\big]\le\sum_{i=1}^n\big(\mu_ix_i^2+\nu_iy_i^2\big),\qquad\forall(t,x,y)\in\mathbb{R}_+\times\mathbb{R}^n\times\mathbb{R}^n. \tag{4}$$
It follows from [33] that under assumptions (H1)–(H3), system (1), equivalently (2), has a unique strong solution $x(t;\xi)$, which is a measurable, sample-continuous, and $\mathcal{F}_t$-adapted process. Clearly, (2) admits the trivial solution $x(t;0)\equiv 0$.
Definition 1.
The trivial solution of system (1) (or system (2)) is said to be exponentially stable in mean square if there exists a pair of positive constants $\lambda$ and $K$ such that
$$\mathbb{E}|x(t,\xi)|^2\le K\,\mathbb{E}|\xi|^2e^{-\lambda t},\quad t\ge 0, \tag{5}$$
holds for any $\xi$. In this case,
$$\limsup_{t\to\infty}\frac{1}{t}\ln\mathbb{E}|x(t,\xi)|^2\le-\lambda. \tag{6}$$
Using Itô's formula and the nonnegative semimartingale convergence theorem, [12, 14] discussed the exponential stability of stochastic delayed neural networks. Employing the method of variation of parameters and inequality techniques, several sufficient conditions ensuring $p$th moment exponential stability of stochastic delayed recurrent neural networks were derived in [17]. With the help of a Lyapunov function and a Halanay-type inequality, a set of novel sufficient conditions for the mean-square exponential stability of stochastic recurrent neural networks with time-varying delays was established in [18]. In this paper, we give a new sufficient condition guaranteeing the exponential stability in mean square of the stochastic delayed recurrent neural network (1) by using Itô's formula and inequality techniques.
Theorem 2.
If (1) satisfies (H1)–(H3) and the following condition holds:
(H4) for $i=1,2,\dots,n$,
$$-2c_i+\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j+\sum_{j=1}^n|a_{ji}|\alpha_i+\sum_{j=1}^n|b_{ji}|\beta_i+\mu_i+\nu_i<0, \tag{7}$$
then (1) is exponentially stable in mean square.
Proof.
By (H4), there exists a sufficiently small positive constant $\lambda$ such that
$$\lambda-2c_i+\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j+\sum_{j=1}^n|a_{ji}|\alpha_i+e^{\lambda\bar\tau}\sum_{j=1}^n|b_{ji}|\beta_i+\mu_i+e^{\lambda\bar\tau}\nu_i\le 0. \tag{8}$$
Set $V(x,t)=e^{\lambda t}|x|^2$; applying Itô's formula to $V(x,t)$ along (2), we obtain
$$
\begin{aligned}
V(x,t)={}&V(x(0),0)+\int_0^t\lambda V(x(s),s)\,ds+M(t)+\int_0^te^{\lambda s}\operatorname{trace}\big[\sigma^T(s,x(s),x_\tau(s))\sigma(s,x(s),x_\tau(s))\big]\,ds\\
&+\int_0^t2e^{\lambda s}x^T(s)\big[-Cx(s)+Af(x(s))+Bg(x_\tau(s))\big]\,ds\\
\le{}&V(x(0),0)+\int_0^t\lambda V(x(s),s)\,ds+M(t)\\
&+2\int_0^te^{\lambda s}\sum_{i=1}^n\Big[-c_ix_i^2(s)+\sum_{j=1}^n|a_{ij}|\alpha_j|x_i(s)||x_j(s)|+\sum_{j=1}^n|b_{ij}|\beta_j|x_i(s)||x_j(s-\tau_j)|\Big]ds\\
&+\int_0^te^{\lambda s}\sum_{j=1}^n\big[\mu_jx_j^2(s)+\nu_jx_j^2(s-\tau_j)\big]ds\\
\le{}&V(x(0),0)+\int_0^t\lambda V(x(s),s)\,ds+M(t)\\
&+\int_0^te^{\lambda s}\Big\{\sum_{i=1}^n\Big[-2c_i+\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j+\sum_{j=1}^n|a_{ji}|\alpha_i+\mu_i\Big]x_i^2(s)+\sum_{j=1}^n\Big[\sum_{i=1}^n|b_{ij}|\beta_j+\nu_j\Big]x_j^2(s-\tau_j)\Big\}ds,
\end{aligned}\tag{9}
$$
where
$$M(t)=\int_0^t2e^{\lambda s}x^T(s)\sigma(s,x(s),x_\tau(s))\,dW(s). \tag{10}$$
Notice that
$$
\begin{aligned}
\int_{t-\tau_j}^te^{\lambda s}|x_j(s)|^2\,ds&=\int_{-\tau_j}^te^{\lambda s}|x_j(s)|^2\,ds-\int_{-\tau_j}^{t-\tau_j}e^{\lambda s}|x_j(s)|^2\,ds\\
&=\int_{-\tau_j}^te^{\lambda s}|x_j(s)|^2\,ds-e^{-\lambda\tau_j}\int_0^te^{\lambda s}|x_j(s-\tau_j)|^2\,ds\\
&\le\int_{-\tau_j}^te^{\lambda s}|x_j(s)|^2\,ds-e^{-\lambda\bar\tau}\int_0^te^{\lambda s}|x_j(s-\tau_j)|^2\,ds.
\end{aligned}\tag{11}
$$
Therefore, we have
$$
\begin{aligned}
V(x,t)\le{}&V(x(0),0)+\int_0^t\lambda V(x(s),s)\,ds+M(t)+e^{\lambda\bar\tau}\int_{-\bar\tau}^0e^{\lambda s}\sum_{j=1}^n\Big[\sum_{i=1}^n|b_{ij}|\beta_j+\nu_j\Big]|x_j(s)|^2\,ds\\
&+\int_0^te^{\lambda s}\sum_{i=1}^n\Big[-2c_i+\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j+\sum_{j=1}^n|a_{ji}|\alpha_i+\mu_i\Big]|x_i(s)|^2\,ds\\
&+e^{\lambda\bar\tau}\int_0^te^{\lambda s}\sum_{j=1}^n\Big[\sum_{i=1}^n|b_{ij}|\beta_j+\nu_j\Big]|x_j(s)|^2\,ds\\
\le{}&V(x(0),0)+M(t)+e^{\lambda\bar\tau}\int_{-\bar\tau}^0e^{\lambda s}\sum_{j=1}^n\Big[\sum_{i=1}^n|b_{ij}|\beta_j+\nu_j\Big]|x_j(s)|^2\,ds\\
&+\int_0^te^{\lambda s}\sum_{i=1}^n|x_i(s)|^2\Big[\lambda-2c_i+\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j+\sum_{j=1}^n|a_{ji}|\alpha_i+\mu_i+e^{\lambda\bar\tau}\sum_{j=1}^n|b_{ji}|\beta_i+e^{\lambda\bar\tau}\nu_i\Big]ds\\
\le{}&V(x(0),0)+M(t)+e^{\lambda\bar\tau}\int_{-\bar\tau}^0e^{\lambda s}\sum_{j=1}^n\Big[\sum_{i=1}^n|b_{ij}|\beta_j+\nu_j\Big]|x_j(s)|^2\,ds.
\end{aligned}\tag{12}
$$
Noting that $\mathbb{E}M(t)=0$, we obtain from the previous inequality
$$\mathbb{E}\big[e^{\lambda t}|x(t)|^2\big]\le\mathbb{E}|x(0)|^2+e^{\lambda\bar\tau}\sum_{j=1}^n\Big[\sum_{i=1}^n|b_{ij}|\beta_j+\nu_j\Big]\int_{-\bar\tau}^0e^{\lambda s}\mathbb{E}|x(s)|^2\,ds, \tag{13}$$
which implies
$$\limsup_{t\to\infty}\frac{1}{t}\ln\mathbb{E}|x(t)|^2\le-\lambda. \tag{14}$$
The proof is completed.
Corollary 3.
If (1) satisfies (H1)–(H3) and the following condition holds:
(H5) for $i=1,2,\dots,n$,
$$-2c_i+\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j+\sum_{j=1}^n|a_{ji}|\alpha_i+\sum_{j=1}^n|b_{ji}|\beta_i+\sum_{j=1}^n(\mu_j+\nu_j)<0, \tag{15}$$
then (1) is exponentially stable in mean square.
Proof.
Since $\mu_j,\nu_j$ ($j=1,\dots,n$) are nonnegative constants, condition (H5) implies condition (H4). Corollary 3 therefore follows directly from Theorem 2.
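Conditions (7) and (15) are simple algebraic tests on the network data, so they can be checked numerically. The following sketch (our own illustration, assuming NumPy; the names check_H4 and check_H5 are hypothetical) evaluates the left-hand sides of (7) and (15) componentwise:

```python
import numpy as np

def check_H4(C, A, B, alpha, beta, mu, nu):
    """Left-hand side of condition (7) for each i; (H4) holds iff all entries < 0.

    C, alpha, beta, mu, nu are length-n vectors; A, B are n-by-n matrices.
    """
    absA, absB = np.abs(A), np.abs(B)
    return (-2.0 * C
            + absA @ alpha               # sum_j |a_ij| alpha_j
            + absB @ beta                # sum_j |b_ij| beta_j
            + alpha * absA.sum(axis=0)   # sum_j |a_ji| alpha_i
            + beta * absB.sum(axis=0)    # sum_j |b_ji| beta_i
            + mu + nu)

def check_H5(C, A, B, alpha, beta, mu, nu):
    """Left-hand side of condition (15); (H5) replaces mu_i + nu_i in (7)
    by the full sum over j of (mu_j + nu_j)."""
    return check_H4(C, A, B, alpha, beta, mu, nu) - (mu + nu) + (mu + nu).sum()
```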
3. Stability of EM Numerical Solution
Let $h=t_{k+1}-t_k$ and $\Delta W_i^k=W_i(t_{k+1})-W_i(t_k)$ denote the increments of the time and of the Brownian motion, respectively. For system (1), the discrete EM approximate solution is defined by
$$y_i^{k+1}=y_i^k+\Big[-c_iy_i^k+\sum_{j=1}^na_{ij}f_j(y_j^k)+\sum_{j=1}^nb_{ij}g_j(y_j^{k-m_j})\Big]h+\sum_{j=1}^n\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k, \tag{16}$$
where $i=1,2,\dots,n$; $h$ ($0<h<1$) is a stepsize satisfying $\tau_j=m_jh$ for a positive integer $m_j$; $t_k=kh$; and $y_i^k$ is an approximation to $x_i(t_k)$, with $y_i^k=\xi_i(t_k)$ if $t_k\le 0$. We assume that $y_i^k$ is $\mathcal{F}_{t_k}$-measurable at the mesh points $t_k$.
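To make scheme (16) concrete, the following sketch (our own illustration, assuming NumPy; the function em_path and its arguments are hypothetical, with the activations f, g, the diffusion sigma, and the initial function xi supplied by the user) simulates one sample path of the EM approximation:

```python
import numpy as np

def em_path(C, A, B, f, g, sigma, xi, m, h, n_steps, rng):
    """One sample path of the EM scheme (16).

    C: length-n vector of c_i; A, B: n-by-n weight matrices;
    f, g: vectorized activations; sigma(k, y, y_delay) -> n-by-n matrix;
    xi(t) -> length-n initial value for t in [-tau_bar, 0];
    m: length-n integer vector with tau_j = m_j * h.
    """
    n = len(C)
    m_max = int(np.max(m))
    y = np.zeros((n_steps + m_max + 1, n))   # rows 0..m_max hold the initial segment
    for k in range(m_max + 1):
        y[k] = xi((k - m_max) * h)
    idx = np.arange(n)
    for k in range(m_max, m_max + n_steps):
        y_del = y[k - m, idx]                      # y_j^{k-m_j}, componentwise delays
        dW = rng.normal(0.0, np.sqrt(h), size=n)   # Brownian increments Delta W_j^k
        drift = -C * y[k] + A @ f(y[k]) + B @ g(y_del)
        y[k + 1] = y[k] + drift * h + sigma(k, y[k], y_del) @ dW
    return y
```

For instance, em_path(C, A, B, np.sin, np.arctan, sigma, xi, m, h, n_steps, np.random.default_rng(0)) would produce one trajectory for data of the kind used in Example 1 below.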
Suppose further that the following condition is satisfied:
(H6) for $i=1,2,\dots,n$,
$$\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j=\sum_{j=1}^n|a_{ji}|\alpha_i+\sum_{j=1}^n|b_{ji}|\beta_i.$$
Definition 4.
A numerical method is said to be mean-square stable (MS stable) if there exists an $h_0>0$ such that any application of the method to (1) generates numerical approximations $y_i^k$ which satisfy
$$\lim_{k\to\infty}\mathbb{E}|y_i^k|^2=0,\quad i=1,2,\dots,n, \tag{17}$$
for all $h\in(0,h_0)$ with $h=\tau_j/m_j$.
Now we analyze the stability of the EM numerical solution.
Theorem 5.
Under conditions (H1)–(H3) and (H5)-(H6), the EM method applied to (1) is MS stable for $h\in(0,h_0)$, where $h_0=\min_{1\le i\le n}\{1,h_i\}$ and
$$h_i=\min\Bigg\{\frac{1}{c_i},\;\frac{2c_i-2\sum_{j=1}^n|a_{ij}|\alpha_j-2\sum_{j=1}^n|b_{ij}|\beta_j-\sum_{j=1}^n(\mu_j+\nu_j)}{\big(c_i-\sum_{j=1}^n|a_{ij}|\alpha_j-\sum_{j=1}^n|b_{ij}|\beta_j\big)^2}\Bigg\}. \tag{18}$$
Proof.
From (16), we have
$$y_i^{k+1}=(1-c_ih)y_i^k+h\sum_{j=1}^na_{ij}f_j(y_j^k)+h\sum_{j=1}^nb_{ij}g_j(y_j^{k-m_j})+\sum_{j=1}^n\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k. \tag{19}$$
Squaring both sides of the previous equality, we obtain
$$
\begin{aligned}
(y_i^{k+1})^2={}&(1-c_ih)^2(y_i^k)^2+h^2\Big[\sum_{j=1}^na_{ij}f_j(y_j^k)\Big]^2+h^2\Big[\sum_{j=1}^nb_{ij}g_j(y_j^{k-m_j})\Big]^2\\
&+\Big[\sum_{j=1}^n\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k\Big]^2+2h(1-c_ih)y_i^k\Big[\sum_{j=1}^na_{ij}f_j(y_j^k)\Big]\\
&+2h(1-c_ih)y_i^k\Big[\sum_{j=1}^nb_{ij}g_j(y_j^{k-m_j})\Big]+2(1-c_ih)y_i^k\Big[\sum_{j=1}^n\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k\Big]\\
&+2h\sum_{j=1}^na_{ij}f_j(y_j^k)\sum_{j=1}^n\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k+2h^2\sum_{j=1}^na_{ij}f_j(y_j^k)\sum_{j=1}^nb_{ij}g_j(y_j^{k-m_j})\\
&+2h\sum_{j=1}^nb_{ij}g_j(y_j^{k-m_j})\sum_{j=1}^n\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k.
\end{aligned}\tag{20}
$$
Noting that $\mathbb{E}(\Delta W_i^k)=0$, $\mathbb{E}(\Delta W_i^k\Delta W_j^k)=0$ ($i\ne j$), $\mathbb{E}(\Delta W_i^k)^2=h$, and that $f_j(y_j^k)$, $g_j(y_j^{k-m_j})$, and $\sigma_{ij}(k,y_j^k,y_j^{k-m_j})$, $j=1,2,\dots,n$, are $\mathcal{F}_{t_k}$-measurable, we have
$$
\begin{aligned}
\mathbb{E}\Big[\sum_{j=1}^n\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k\Big]^2&=\mathbb{E}\Big[\sum_{j=1}^n\sigma_{ij}^2(k,y_j^k,y_j^{k-m_j})\,\mathbb{E}\big((\Delta W_j^k)^2\mid\mathcal{F}_{t_k}\big)\Big]=h\sum_{j=1}^n\mathbb{E}\big[\sigma_{ij}^2(k,y_j^k,y_j^{k-m_j})\big],\\
\mathbb{E}\big[\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k\big]&=\mathbb{E}\big[\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\,\mathbb{E}\big(\Delta W_j^k\mid\mathcal{F}_{t_k}\big)\big]=0,\\
\mathbb{E}\big[f_j(y_j^k)\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k\big]&=\mathbb{E}\big[f_j(y_j^k)\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\,\mathbb{E}\big(\Delta W_j^k\mid\mathcal{F}_{t_k}\big)\big]=0,\\
\mathbb{E}\big[g_j(y_j^{k-m_j})\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k\big]&=\mathbb{E}\big[g_j(y_j^{k-m_j})\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\,\mathbb{E}\big(\Delta W_j^k\mid\mathcal{F}_{t_k}\big)\big]=0.
\end{aligned}\tag{21}
$$
Let $Y_i^k=\mathbb{E}(y_i^k)^2$. Applying the inequality $2abxy\le|ab|(x^2+y^2)$ and conditions (H2) and (H3), we obtain from (20) and (21)
$$
\begin{aligned}
Y_i^{k+1}\le{}&(1-c_ih)^2Y_i^k+h^2\sum_{j=1}^n|a_{ij}|\alpha_j\sum_{r=1}^n|a_{ir}|\alpha_rY_j^k+h^2\sum_{j=1}^n|b_{ij}|\beta_j\sum_{r=1}^n|b_{ir}|\beta_rY_j^{k-m_j}\\
&+h\sum_{j=1}^n\big(\mu_jY_j^k+\nu_jY_j^{k-m_j}\big)+h\sum_{j=1}^n|(1-c_ih)a_{ij}|\alpha_j\big(Y_i^k+Y_j^k\big)\\
&+h\sum_{j=1}^n|(1-c_ih)b_{ij}|\beta_j\big(Y_i^k+Y_j^{k-m_j}\big)+h^2\sum_{j=1}^n|a_{ij}|\alpha_j\sum_{r=1}^n|b_{ir}|\beta_r\big(Y_j^k+Y_j^{k-m_j}\big).
\end{aligned}\tag{22}
$$
Thus
$$Y_i^{k+1}\le P(h)Y_i^k+\sum_{j=1}^nQ_j(h)Y_j^k+\sum_{j=1}^nR_j(h)Y_j^{k-m_j}, \tag{23}$$
where
$$
\begin{aligned}
P(h)&=(1-c_ih)^2+h\sum_{j=1}^n|(1-c_ih)a_{ij}|\alpha_j+h\sum_{j=1}^n|(1-c_ih)b_{ij}|\beta_j,\\
Q_j(h)&=h^2|a_{ij}|\alpha_j\sum_{r=1}^n|a_{ir}|\alpha_r+h\mu_j+h|(1-c_ih)a_{ij}|\alpha_j+h^2|a_{ij}|\alpha_j\sum_{r=1}^n|b_{ir}|\beta_r,\\
R_j(h)&=h^2|b_{ij}|\beta_j\sum_{r=1}^n|b_{ir}|\beta_r+h\nu_j+h|(1-c_ih)b_{ij}|\beta_j+h^2|a_{ij}|\alpha_j\sum_{r=1}^n|b_{ir}|\beta_r.
\end{aligned}\tag{24}
$$
Then
$$Y_i^{k+1}\le\Big(P(h)+\sum_{j=1}^nQ_j(h)+\sum_{j=1}^nR_j(h)\Big)\max_{1\le j\le n}\big\{Y_i^k,\,Y_j^k,\,Y_j^{k-m_j}\big\}. \tag{25}$$
By recursion we conclude that $Y_i^k\to 0$ as $k\to\infty$ if
$$P(h)+\sum_{j=1}^nQ_j(h)+\sum_{j=1}^nR_j(h)<1, \tag{26}$$
which is equivalent to
$$
\begin{aligned}
&\Big[c_i^2+\Big(\sum_{j=1}^n|a_{ij}|\alpha_j\Big)^2+\Big(\sum_{j=1}^n|b_{ij}|\beta_j\Big)^2+2\sum_{j=1}^n|a_{ij}|\alpha_j\sum_{j=1}^n|b_{ij}|\beta_j\Big]h^2\\
&\quad+\Big[-2c_i+\sum_{j=1}^n(\mu_j+\nu_j)+2\sum_{j=1}^n|(1-c_ih)a_{ij}|\alpha_j+2\sum_{j=1}^n|(1-c_ih)b_{ij}|\beta_j\Big]h<0.
\end{aligned}\tag{27}
$$
If $0<h<1/c_i$, (27) reduces to
$$\Big(c_i-\sum_{j=1}^n|a_{ij}|\alpha_j-\sum_{j=1}^n|b_{ij}|\beta_j\Big)^2h<2c_i-2\sum_{j=1}^n|a_{ij}|\alpha_j-2\sum_{j=1}^n|b_{ij}|\beta_j-\sum_{j=1}^n(\mu_j+\nu_j). \tag{28}$$
By conditions (H5) and (H6), we know that $h_i>0$. Thus, (28) holds for $h\in(0,h_i)$. Letting $h_0=\min_{1\le i\le n}\{1,h_i\}$, we have $\lim_{k\to\infty}\mathbb{E}(y_i^k)^2=0$; that is, the EM method applied to (1) is MS stable. The proof is completed.
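The stepsize bound (18) is a direct computation from the network data. A minimal sketch (our own; em_max_stepsize is a hypothetical name) reads:

```python
import numpy as np

def em_max_stepsize(C, A, B, alpha, beta, mu, nu):
    """h0 = min_i {1, h_i} with h_i from (18); assumes (H5)-(H6) hold,
    so the numerator below is positive."""
    absA, absB = np.abs(A), np.abs(B)
    row_a = absA @ alpha          # sum_j |a_ij| alpha_j
    row_b = absB @ beta           # sum_j |b_ij| beta_j
    s = (mu + nu).sum()           # sum_j (mu_j + nu_j)
    numer = 2.0 * C - 2.0 * row_a - 2.0 * row_b - s
    denom = (C - row_a - row_b) ** 2
    h_i = np.minimum(1.0 / C, numer / denom)
    return min(1.0, h_i.min())
```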
Theorem 6.
For $i=1,2,\dots,n$ and $k=1,2,\dots$, there exists a positive constant $C_1$ such that
$$\mathbb{E}\big|x_i(kh)-y_i^k\big|^2\le C_1h, \tag{29}$$
where $C_1$ depends on $c_i$, $a_{ij}$, $b_{ij}$, and so on, but not on $h$, and $y_i^k$ is defined by (16).
The proof is similar to that of Theorem 7 in [20].
4. Stability of SSBE Numerical Solution
In this section, we construct the SSBE scheme for (1) and analyze the stability of its numerical solution. Adapting the SSBE method to (1) leads to a numerical process of the following type:
$$
\begin{aligned}
\bar y_i^k&=y_i^k+\Big[-c_i\bar y_i^k+\sum_{j=1}^na_{ij}f_j(y_j^k)+\sum_{j=1}^nb_{ij}g_j(y_j^{k-m_j+1})\Big]h,\\
y_i^{k+1}&=\bar y_i^k+\sum_{j=1}^n\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k.
\end{aligned}\tag{30}
$$
The notation is the same as in (16). Now we present another main result of this paper.
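Because the implicit variable $\bar y_i^k$ enters the first stage of (30) linearly, the implicit equation can be solved in closed form, which is precisely identity (32) below. One SSBE step may therefore be sketched as follows (our own illustration, assuming NumPy; the name ssbe_step and the layout of the history array y_hist are hypothetical):

```python
import numpy as np

def ssbe_step(k, y_hist, C, A, B, f, g, sigma, m, h, rng):
    """One step of the SSBE scheme (30): y_hist[k] holds y^k, and the
    implicit stage is solved in closed form as in (32):
        ybar = (y^k + h*(A f(y^k) + B g(y^{k-m+1}))) / (1 + C h)  (componentwise).
    """
    n = len(C)
    idx = np.arange(n)
    y_k = y_hist[k]
    y_del1 = y_hist[k - m + 1, idx]   # y_j^{k-m_j+1}
    y_del = y_hist[k - m, idx]        # y_j^{k-m_j}
    ybar = (y_k + h * (A @ f(y_k) + B @ g(y_del1))) / (1.0 + C * h)
    dW = rng.normal(0.0, np.sqrt(h), size=n)
    return ybar + sigma(k, y_k, y_del) @ dW
```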
Theorem 7.
Assume that (H1)–(H3) and (H5)-(H6) hold, and define
$$
\begin{aligned}
\mathcal{A}_i&=c_i^2\sum_{j=1}^n(\mu_j+\nu_j),\\
\mathcal{B}_i&=\Big(\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j\Big)^2+2c_i\sum_{j=1}^n(\mu_j+\nu_j)-c_i^2,\\
\mathcal{C}_i&=2\Big(\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j-c_i\Big)+\sum_{j=1}^n(\mu_j+\nu_j).
\end{aligned}\tag{31}
$$
Then the SSBE method applied to (1) is MS stable for $h\in(0,h_0)$, where $h_0=\min_{1\le i\le n}\{1,h_i\}$ and $h_i=\big(-\mathcal{B}_i+\sqrt{\mathcal{B}_i^2-4\mathcal{A}_i\mathcal{C}_i}\,\big)/(2\mathcal{A}_i)$.
Proof.
From (30), we have
$$(1+c_ih)\bar y_i^k=y_i^k+h\sum_{j=1}^na_{ij}f_j(y_j^k)+h\sum_{j=1}^nb_{ij}g_j(y_j^{k-m_j+1}). \tag{32}$$
Squaring both sides of (32), we obtain
$$
\begin{aligned}
(1+c_ih)^2(\bar y_i^k)^2={}&(y_i^k)^2+h^2\Big[\sum_{j=1}^na_{ij}f_j(y_j^k)\Big]^2+h^2\Big[\sum_{j=1}^nb_{ij}g_j(y_j^{k-m_j+1})\Big]^2\\
&+2hy_i^k\sum_{j=1}^na_{ij}f_j(y_j^k)+2hy_i^k\sum_{j=1}^nb_{ij}g_j(y_j^{k-m_j+1})\\
&+2h^2\sum_{j=1}^na_{ij}f_j(y_j^k)\sum_{j=1}^nb_{ij}g_j(y_j^{k-m_j+1}).
\end{aligned}\tag{33}
$$
It follows from the inequality $2abxy\le|ab|(x^2+y^2)$ and (H2) that
$$
\begin{aligned}
(1+c_ih)^2(\bar y_i^k)^2\le{}&(y_i^k)^2+h^2\sum_{j=1}^n|a_{ij}|\alpha_j\sum_{r=1}^n|a_{ir}|\alpha_r(y_j^k)^2+h^2\sum_{j=1}^n|b_{ij}|\beta_j\sum_{r=1}^n|b_{ir}|\beta_r(y_j^{k-m_j+1})^2\\
&+h\sum_{j=1}^n|a_{ij}|\alpha_j\big[(y_i^k)^2+(y_j^k)^2\big]+h\sum_{j=1}^n|b_{ij}|\beta_j\big[(y_i^k)^2+(y_j^{k-m_j+1})^2\big]\\
&+h^2\sum_{j=1}^n|a_{ij}|\alpha_j\sum_{r=1}^n|b_{ir}|\beta_r\big[(y_j^k)^2+(y_j^{k-m_j+1})^2\big].
\end{aligned}\tag{34}
$$
Letting $Y_i^k=\mathbb{E}(y_i^k)^2$ and $\bar Y_i^k=\mathbb{E}(\bar y_i^k)^2$, we have
$$
\begin{aligned}
(1+c_ih)^2\bar Y_i^k\le{}&Y_i^k+h^2\sum_{j=1}^n|a_{ij}|\alpha_j\sum_{r=1}^n|a_{ir}|\alpha_rY_j^k+h^2\sum_{j=1}^n|b_{ij}|\beta_j\sum_{r=1}^n|b_{ir}|\beta_rY_j^{k-m_j+1}\\
&+h\sum_{j=1}^n|a_{ij}|\alpha_j\big(Y_i^k+Y_j^k\big)+h\sum_{j=1}^n|b_{ij}|\beta_j\big(Y_i^k+Y_j^{k-m_j+1}\big)\\
&+h^2\sum_{j=1}^n|a_{ij}|\alpha_j\sum_{r=1}^n|b_{ir}|\beta_r\big(Y_j^k+Y_j^{k-m_j+1}\big).
\end{aligned}\tag{35}
$$
On the other hand, from (30), we obtain
$$(y_i^{k+1})^2=(\bar y_i^k)^2+\Big[\sum_{j=1}^n\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k\Big]^2+2\bar y_i^k\sum_{j=1}^n\sigma_{ij}(k,y_j^k,y_j^{k-m_j})\Delta W_j^k. \tag{36}$$
Noting that $\mathbb{E}(\Delta W_i^k)=0$, $\mathbb{E}(\Delta W_i^k\Delta W_j^k)=0$ ($i\ne j$), and $\mathbb{E}(\Delta W_i^k)^2=h$, from (36) and (H3) we have
$$Y_i^{k+1}\le\bar Y_i^k+h\sum_{j=1}^n\big(\mu_jY_j^k+\nu_jY_j^{k-m_j}\big). \tag{37}$$
Substituting (35) into (37), we obtain
$$Y_i^{k+1}\le P(h)Y_i^k+\sum_{j=1}^nQ_j(h)Y_j^k+\sum_{j=1}^nR_j(h)Y_j^{k-m_j}+\sum_{j=1}^nS_j(h)Y_j^{k-m_j+1}, \tag{38}$$
where
$$
\begin{aligned}
P(h)&=\frac{1}{(1+c_ih)^2}\Big(1+h\sum_{j=1}^n|a_{ij}|\alpha_j+h\sum_{j=1}^n|b_{ij}|\beta_j\Big),\\
Q_j(h)&=\frac{1}{(1+c_ih)^2}\Big(h^2|a_{ij}|\alpha_j\sum_{r=1}^n|a_{ir}|\alpha_r+h|a_{ij}|\alpha_j+h^2|a_{ij}|\alpha_j\sum_{r=1}^n|b_{ir}|\beta_r\Big)+h\mu_j,\\
R_j(h)&=h\nu_j,\\
S_j(h)&=\frac{1}{(1+c_ih)^2}\Big(h^2|b_{ij}|\beta_j\sum_{r=1}^n|b_{ir}|\beta_r+h|b_{ij}|\beta_j+h^2|a_{ij}|\alpha_j\sum_{r=1}^n|b_{ir}|\beta_r\Big).
\end{aligned}\tag{39}
$$
Then
$$Y_i^{k+1}\le\Big(P(h)+\sum_{j=1}^nQ_j(h)+\sum_{j=1}^nR_j(h)+\sum_{j=1}^nS_j(h)\Big)\max_{1\le j\le n}\big\{Y_i^k,\,Y_j^k,\,Y_j^{k-m_j},\,Y_j^{k-m_j+1}\big\}. \tag{40}$$
By recursion we conclude that $Y_i^k\to 0$ as $k\to\infty$ if
$$P(h)+\sum_{j=1}^nQ_j(h)+\sum_{j=1}^nR_j(h)+\sum_{j=1}^nS_j(h)<1, \tag{41}$$
which is equivalent to $\mathcal{A}_ih^2+\mathcal{B}_ih+\mathcal{C}_i<0$, with $\mathcal{A}_i$, $\mathcal{B}_i$, and $\mathcal{C}_i$ defined in (31). Since $\mathcal{A}_i>0$ and $\mathcal{C}_i<0$ by (H3), (H5), and (H6), we have $\mathcal{B}_i^2-4\mathcal{A}_i\mathcal{C}_i>0$. This implies that
$$h_i=\frac{-\mathcal{B}_i+\sqrt{\mathcal{B}_i^2-4\mathcal{A}_i\mathcal{C}_i}}{2\mathcal{A}_i}>0. \tag{43}$$
Thus, (41) holds for $h\in(0,h_i)$. Letting $h_0=\min_{1\le i\le n}\{1,h_i\}$, we have $\lim_{k\to\infty}\mathbb{E}(y_i^k)^2=0$; that is, the SSBE method applied to (1) is MS stable. The proof of the theorem is completed.
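The SSBE stepsize bound of Theorem 7 is likewise easy to evaluate (our own sketch; ssbe_max_stepsize is a hypothetical name):

```python
import numpy as np

def ssbe_max_stepsize(C, A, B, alpha, beta, mu, nu):
    """h0 = min_i {1, h_i}, where h_i is the positive root (43) of
    A_i h^2 + B_i h + C_i = 0 with coefficients from (31).
    Assumes (H3), (H5)-(H6) and sum_j (mu_j + nu_j) > 0, so A_i > 0 and C_i < 0.
    """
    absA, absB = np.abs(A), np.abs(B)
    row = absA @ alpha + absB @ beta   # sum_j |a_ij| alpha_j + sum_j |b_ij| beta_j
    s = (mu + nu).sum()
    Ai = C**2 * s
    Bi = row**2 + 2.0 * C * s - C**2
    Ci = 2.0 * (row - C) + s
    h_i = (-Bi + np.sqrt(Bi**2 - 4.0 * Ai * Ci)) / (2.0 * Ai)
    return min(1.0, h_i.min())
```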
Theorem 8.
For $i=1,2,\dots,n$ and $k=1,2,\dots$, there exists a positive constant $C_2$ such that
$$\mathbb{E}\big|x_i(kh)-y_i^k\big|^2\le C_2h, \tag{44}$$
where $C_2$ depends on $c_i$, $a_{ij}$, $b_{ij}$, and so on, but not on $h$, and $y_i^k$ is defined by (30).
The proof is similar to that of Theorem 3.2 in [26].
5. Example
In this section, we discuss an example to illustrate our theory and to compare the stepsize restrictions of the stable SSBE method with those of the EM method.
Example 1.
Let $W(t)$ be a two-dimensional Brownian motion. Consider the following stochastic delay recurrent neural network:
$$
d\begin{pmatrix}x_1(t)\\x_2(t)\end{pmatrix}=-C\begin{pmatrix}x_1(t)\\x_2(t)\end{pmatrix}dt+A\begin{pmatrix}f(x_1(t))\\f(x_2(t))\end{pmatrix}dt+B\begin{pmatrix}g(x_1(t-1))\\g(x_2(t-2))\end{pmatrix}dt+\sigma\begin{pmatrix}x_1(t)&x_1(t-1)\\x_2(t)&x_2(t-2)\end{pmatrix}dW(t). \tag{45}
$$
Let $f(x)=\sin x$, $g(x)=\arctan x$, and
$$C=\begin{pmatrix}10&0\\0&7\end{pmatrix},\quad A=\begin{pmatrix}2&0.4\\0.6&1\end{pmatrix},\quad B=\begin{pmatrix}4&-0.3\\0.1&2\end{pmatrix},\quad \sigma=\begin{pmatrix}1&0\\0&2\end{pmatrix}. \tag{46}$$
It is obvious that $\alpha_i=\beta_i=1$ ($i=1,2$), $\mu_1=\nu_1=1$, and $\mu_2=\nu_2=2$, so (H1)–(H3) are satisfied. By computation,
$$
\begin{aligned}
-2c_i+\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j+\sum_{j=1}^n|a_{ji}|\alpha_i+\sum_{j=1}^n|b_{ji}|\beta_i+\mu_i+\nu_i&=\begin{cases}-4.6,&i=1,\\-2.6,&i=2,\end{cases}\\
-2c_i+\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j+\sum_{j=1}^n|a_{ji}|\alpha_i+\sum_{j=1}^n|b_{ji}|\beta_i+\sum_{j=1}^n(\mu_j+\nu_j)&=-0.6,\quad i=1,2,\\
\sum_{j=1}^n|a_{ij}|\alpha_j+\sum_{j=1}^n|b_{ij}|\beta_j=\sum_{j=1}^n|a_{ji}|\alpha_i+\sum_{j=1}^n|b_{ji}|\beta_i&=\begin{cases}6.7,&i=1,\\3.7,&i=2.\end{cases}
\end{aligned}\tag{47}
$$
Therefore conditions (H4)–(H6) also hold. By Theorem 2, system (45) is exponentially stable in mean square, and by Theorems 5 and 7 the EM scheme and the SSBE scheme applied to (45) are MS stable.
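As a numerical cross-check, the hypothetical helper functions sketched in Sections 2–4 can be applied to the data of this example:

```python
import numpy as np

# Parameters of Example 1, see (46); alpha_i = beta_i = 1, mu = nu = (1, 2).
C = np.array([10.0, 7.0])
A = np.array([[2.0, 0.4], [0.6, 1.0]])
B = np.array([[4.0, -0.3], [0.1, 2.0]])
alpha = beta = np.array([1.0, 1.0])
mu = nu = np.array([1.0, 2.0])

print(check_H4(C, A, B, alpha, beta, mu, nu))   # [-4.6, -2.6]: (H4) holds
print(check_H5(C, A, B, alpha, beta, mu, nu))   # [-0.6, -0.6]: (H5) holds
# Sufficient stepsize bounds of Theorems 5 and 7; being sufficient conditions
# only, they are conservative relative to the stepsizes used in Figures 1-3.
print(em_max_stepsize(C, A, B, alpha, beta, mu, nu))
print(ssbe_max_stepsize(C, A, B, alpha, beta, mu, nu))
```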
From Figure 1 we conclude that the EM method and the SSBE method applied to (45) are MS stable with $h=0.125$, which verifies Theorems 5 and 7. Figure 2 shows that with $h=0.1925$ the EM method is unstable while the SSBE method remains MS stable, which indicates that the SSBE method has better stability properties than the EM method. Figure 3 illustrates that the SSBE method is unstable with $h=0.25$.
MS stability of the numerical solutions to (45) with h=0.125; (a) EM, (b) SSBE.
Instability of EM numerical solutions and MS stability of SSBE numerical solutions to (45) with h=0.1925; (a) EM, (b) SSBE.
Instability of SSBE numerical solutions of system (45) with h=0.25.
6. Conclusions
The model of a stochastic neural network can be viewed as a special kind of stochastic differential equation whose solution is hard to express explicitly. It not only has the characteristics of general stochastic differential equations but also its own features: its stability is connected with the activation functions and the connection weight matrices. It is therefore necessary to discuss the stability of stochastic neural networks. In contrast to previous works on the exponential stability of stochastic neural networks, both the Lyapunov function method and two numerical methods are used here to study the stability of stochastic delay recurrent neural networks. Under the conditions which guarantee the stability of the analytical solution, the EM method and the SSBE method are proved to be MS stable provided the stepsize satisfies a certain bound. Other numerical methods for different types of stochastic delay neural networks can be analyzed in future work.
Acknowledgments
This work was supported by the National Natural Science Foundation of China (nos. 60904032, 61273126), the Natural Science Foundation of Guangdong Province (no. 10251064101000008), and the Fundamental Research Funds for the Central Universities (no. 2012ZM0059).
References
[1] C. M. Marcus and R. M. Westervelt, "Stability of analog neural networks with delay," 1989, 39(1), 347–359. doi:10.1103/PhysRevA.39.347
[2] J. Wu, vol. 6, Walter de Gruyter, Berlin, Germany, 2001. doi:10.1515/9783110879971
[3] X. Li, L. Huang, and J. Wu, "Further results on the stability of delayed cellular neural networks," 2003, 50(9), 1239–1242. doi:10.1109/TCSI.2003.813982
[4] X. M. Li, L. H. Huang, and H. Zhu, "Global stability of cellular neural networks with constant and variable delays," 2003, 53(3-4), 319–333. doi:10.1016/S0362-546X(02)00176-1
[5] S. Arik, "Stability analysis of delayed neural networks," 2000, 47(7), 1089–1092. doi:10.1109/81.855465
[6] S. Arik, "Global asymptotic stability analysis of bidirectional associative memory neural networks with time delays," 2005, 16(3), 580–586.
[7] T. Liao and F. Wang, "Global stability for cellular neural networks with time delay," 2000, 11(6), 1481–1484.
[8] X. Liao, Q. Liu, and W. Zhang, "Delay-dependent asymptotic stability for neural networks with distributed delays," 2006, 7(5), 1178–1192. doi:10.1016/j.nonrwa.2005.11.001
[9] S. Kuang, F. Deng, and X. Li, "Stability and Hopf bifurcation of a BAM neural network with delayed self-feedback," in Proceedings of the 7th International Symposium on Neural Networks, Lecture Notes in Computer Science, vol. 6063, 2010, 493–503.
[10] O. Faydasicok and S. Arik, "Further analysis of global robust stability of neural networks with multiple time delays," 2012, 349(3), 813–825. doi:10.1016/j.jfranklin.2011.11.007
[11] S. Haykin, Prentice-Hall, NJ, USA, 1994.
[12] S. Blythe, X. Mao, and X. Liao, "Stability of stochastic delay neural networks," 2001, 338(4), 481–495. doi:10.1016/S0016-0032(01)00016-3
[13] L. Wan and J. Sun, "Mean square exponential stability of stochastic delayed Hopfield neural networks," 2005, 343(4), 306–318.
[14] Q. Zhou and L. Wan, "Exponential stability of stochastic delayed Hopfield neural networks," 2008, 199(1), 84–89. doi:10.1016/j.amc.2007.09.025
[15] Y. Zhang, D. Yue, and E. Tian, "Robust delay-distribution-dependent stability of discrete-time stochastic neural networks with time-varying delay," 2009, 72(4), 1265–1273.
[16] Z. Wang, Y. Liu, M. Li, and X. Liu, "Stability analysis for stochastic Cohen-Grossberg neural networks with mixed time delays," 2006, 17(3), 814–820.
[17] Y. Sun and J. Cao, "pth moment exponential stability of stochastic recurrent neural networks with time-varying delays," 2007, 8(4), 1171–1185. doi:10.1016/j.nonrwa.2006.06.009
[18] C. Huang, Y. He, and H. Wang, "Mean square exponential stability of stochastic recurrent neural networks with time-varying delays," 2008, 56(7), 1773–1778. doi:10.1016/j.camwa.2008.04.004
[19] U. Küchler and E. Platen, "Strong discrete time approximation of stochastic differential equations with time delay," 2000, 54(1–3), 189–250. doi:10.1016/S0378-4754(00)00224-X
[20] E. Buckwar, "Introduction to the numerical analysis of stochastic delay differential equations," 2000, 125(1-2), 297–307. doi:10.1016/S0377-0427(00)00475-1
[21] D. J. Higham and P. E. Kloeden, "Convergence and stability of implicit methods for jump-diffusion systems," 2006, 3(2), 125–140.
[22] X. Mao and S. Sabanis, "Numerical solutions of stochastic differential delay equations under local Lipschitz condition," 2003, 151(1), 215–227. doi:10.1016/S0377-0427(02)00750-1
[23] M. Liu, W. Cao, and Z. Fan, "Convergence and stability of the semi-implicit Euler method for a linear stochastic differential delay equation," 2004, 170(2), 255–268. doi:10.1016/j.cam.2004.01.040
[24] P. Hu and C. Huang, "Stability of stochastic θ-methods for stochastic delay integro-differential equations," 2011, 88(7), 1417–1429. doi:10.1080/00207160.2010.509430
[25] A. Rathinasamy and K. Balachandran, "T-stability of the split-step θ-methods for linear stochastic delay integro-differential equations," 2011, 5(4), 639–646. doi:10.1016/j.nahs.2011.05.003
[26] H. Zhang, S. Gan, and L. Hu, "The split-step backward Euler method for linear stochastic delay differential equations," 2009, 225(2), 558–568. doi:10.1016/j.cam.2008.08.032
[27] M. Song and H. Yu, "Numerical solutions of stochastic differential delay equations with Poisson random measure under the generalized Khasminskii-type conditions," 2012, vol. 2012, Article ID 127397, 24 pages.
[28] Q. Li and S. Gan, "Stability of analytical and numerical solutions for nonlinear stochastic delay differential equations with jumps," 2012, vol. 2012, Article ID 831082, 13 pages. doi:10.1155/2012/831082
[29] X. Ding, K. Wu, and M. Liu, "Convergence and stability of the semi-implicit Euler method for linear stochastic delay integro-differential equations," 2006, 83(10), 753–761. doi:10.1080/00207160601073680
[30] R. Li, W. Pang, and P. Leung, "Exponential stability of numerical solutions to stochastic delay Hopfield neural networks," 2010, 73(4–6), 920–926.
[31] F. Jiang and Y. Shen, "Stability in the numerical simulation of stochastic delayed Hopfield neural networks," 2013, 22(7-8), 1493–1498.
[32] A. Rathinasamy, "The split-step θ-methods for stochastic delay Hopfield neural networks," 2012, 36(8), 3477–3485. doi:10.1016/j.apm.2011.10.020
[33] X. Mao, 2nd edition, Horwood, Chichester, UK, 2008.