Stability Analysis of Stochastic Reaction-Diffusion Cohen-Grossberg Neural Networks with Time-Varying Delays

This paper is concerned with pth moment exponential stability of stochastic reaction-diffusion Cohen-Grossberg neural networks with time-varying delays. With the help of the Lyapunov method, stochastic analysis, and inequality techniques, a set of new sufficient conditions for pth moment exponential stability of the considered system is presented. The proposed results generalize and improve some earlier publications.


Introduction
Since the seminal work on Cohen-Grossberg neural networks by Cohen and Grossberg [1], theoretical understanding of neural network dynamics has advanced greatly. The model can be described by a system of ordinary differential equations

\dot{x}_i(t) = -\alpha_i(x_i(t)) \Big[ \beta_i(x_i(t)) - \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) \Big], \quad i = 1, \ldots, n, \tag{1.1}

where t \geq 0 and n \geq 2; n corresponds to the number of units in the neural network; x_i(t) denotes the potential (or voltage) of cell i at time t; f_j(\cdot) denotes a nonlinear output function between cells i and j; \alpha_i(\cdot) > 0 represents an amplification function; \beta_i(\cdot) represents an appropriately behaved function; the n \times n connection matrix A = (a_{ij})_{n \times n} denotes the strengths of connectivity between cells, and if the output from neuron j excites (resp., inhibits) neuron i, then a_{ij} \geq 0 (resp., a_{ij} \leq 0). During hardware implementation, time delays arise from the finite switching speed of the amplifiers and from communication time, so delays should be incorporated into the model equations of the network. For model (1.1), Ye et al. [2] introduced delays by considering the following delay differential equations:

\dot{x}_i(t) = -\alpha_i(x_i(t)) \Big[ \beta_i(x_i(t)) - \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) - \sum_{j=1}^{n} b_{ij} f_j(x_j(t - \tau_j(t))) \Big], \quad i = 1, \ldots, n. \tag{1.2}

Some other, more detailed justifications for introducing delays into the model equations of neural networks can be found in [3-13] and the references therein. It is seen that (1.2) is quite general: it includes several well-known neural network models as special cases, such as Hopfield neural networks, cellular neural networks, and bidirectional associative memory neural networks (see, e.g., [14-18]). In addition to delay effects, stochastic effects constitute another source of disturbances or uncertainties in real systems [19]. Many dynamical systems have variable structures subject to stochastic abrupt changes, which may result from abrupt phenomena such as stochastic failures and repairs of components, changes in the interconnections of subsystems, and sudden environmental changes. In recent years, the stability of stochastic neural networks has attracted many investigators, and a large number of stability criteria for these systems have been reported [20-30]. The stochastic model can be described by a system of stochastic differential equations

dx_i(t) = -\alpha_i(x_i(t)) \Big[ \beta_i(x_i(t)) - \sum_{j=1}^{n} a_{ij} g_j(x_j(t)) - \sum_{j=1}^{n} b_{ij} g_j(x_j(t - \tau_j(t))) \Big] dt + \sum_{j=1}^{n} \sigma_{ij}(t, x_i(t), x_j(t - \tau_j(t))) \, d\omega_j(t), \quad i = 1, \ldots, n. \tag{1.3}
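To make model (1.3) concrete, the following sketch integrates a small network by the Euler-Maruyama scheme. All structural choices here are illustrative assumptions, not values from the paper: alpha_i(u) = 1, beta_i(u) = u, g_j = tanh, a single constant delay tau, and a diagonal noise term that is linear in the delayed state.

```python
import numpy as np

def simulate_cg(a, b, sigma, tau, dt=1e-3, T=5.0, seed=0):
    """Euler-Maruyama sketch of the delayed stochastic model (1.3) with
    the illustrative choices alpha_i(u) = 1, beta_i(u) = u, g_j = tanh,
    a constant delay tau, and sigma_ij(t, u, v) diagonal and linear in
    the delayed state (none of these specifics come from the paper)."""
    rng = np.random.default_rng(seed)
    n = a.shape[0]
    steps, lag = int(T / dt), int(tau / dt)
    x = np.zeros((steps + lag + 1, n))
    x[: lag + 1] = 0.5                        # constant initial history
    for k in range(lag, steps + lag):
        xt, xd = x[k], x[k - lag]             # x(t) and x(t - tau)
        drift = -(xt - a @ np.tanh(xt) - b @ np.tanh(xd))
        dw = rng.normal(0.0, np.sqrt(dt), n)  # Brownian increments
        x[k + 1] = xt + drift * dt + sigma * xd * dw
    return x[lag:]
```

With weak coupling and small noise intensity, sample trajectories decay toward the trivial solution, which is the qualitative behavior the stability criteria of this paper address.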
However, besides delay and stochastic effects, the diffusion effect cannot be avoided in neural networks when electrons are moving in asymmetric electromagnetic fields [31], so we must consider activations that vary in space as well as in time. In [32-36], the authors considered the stability of reaction-diffusion neural networks with constant or time-varying delays, expressed by partial differential equations. To the best of our knowledge, few authors have considered the problem of pth moment stability for stochastic Cohen-Grossberg neural networks with both time-varying delays and reaction-diffusion terms. Motivated by the above discussion, in this paper we consider the stochastic reaction-diffusion Cohen-Grossberg neural networks with time-varying delays described by the following stochastic partial differential equations:

dy_i(x, t) = \Big[ \sum_{k=1}^{m} \frac{\partial}{\partial x_k} \Big( D_{ik} \frac{\partial y_i(x, t)}{\partial x_k} \Big) - \alpha_i(y_i(x, t)) \Big( \beta_i(y_i(x, t)) - \sum_{j=1}^{n} a_{ij}(t) g_j(y_j(x, t)) - \sum_{j=1}^{n} b_{ij}(t) g_j(y_j(x, t - \tau_j(t))) \Big) \Big] dt + \sum_{j=1}^{n} \sigma_{ij}(t, y_i(x, t), y_j(x, t - \tau_j(t))) \, d\omega_j(t), \tag{1.4}

where

(1.5)

In the above model, n \geq 2 corresponds to the number of units in the neural network, x = (x_1, \ldots, x_m)^T is the space variable, and y_i(x, t) denotes the state variable of cell i at time t and position x; the smooth function D_{ik} = D_{ik}(x, y, t) \geq 0 is a diffusion operator; \Omega is a compact set with smooth boundary \partial\Omega and measure \operatorname{mes} \Omega > 0 in R^m; the boundary condition

\frac{\partial y_i}{\partial n} \Big|_{\partial \Omega} = 0 \tag{1.6}

and \varphi_i(x, s) are the boundary value and initial value, respectively; a_{ij}(t) and b_{ij}(t) denote the strengths of connectivity between cells i and j at time t; \tau_j(t) is the time delay and satisfies 0 \leq \tau_j(t) \leq \tau; \sigma = (\sigma_{ij}(t, y_i(x, t), y_j(x, t - \tau_j(t))))_{n \times n} is the diffusion coefficient matrix, and \omega(t) = (\omega_1(t), \ldots, \omega_n(t))^T is an n-dimensional Brownian motion defined on a complete probability space (\Omega, \mathcal{F}, P) with the natural filtration \{\mathcal{F}_t\}_{t \geq 0} generated by the standard Brownian motion \{\omega(s) : 0 \leq s \leq t\}. As a standing hypothesis, we assume that g_j(\cdot) and \sigma(t, \cdot, \cdot) satisfy the Lipschitz condition and the linear growth condition and that (1.4) has a solution on t \geq 0 for the given initial conditions.
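A minimal numerical sketch of the reaction-diffusion model (1.4) on a one-dimensional domain, using a method-of-lines discretization with the zero-flux (Neumann) boundary condition (1.6). The coefficient choices below are simplifying assumptions made only for illustration: constant diffusion D, alpha_i(u) = 1, beta_i(u) = u, g_j = tanh, and diagonal noise linear in the delayed state, applied uniformly in space.

```python
import numpy as np

def simulate_rd_cg(D, a, b, sigma, tau, L=1.0, M=50, dt=1e-4, T=1.0, seed=0):
    """Method-of-lines sketch of (1.4) on [0, L] with zero-flux boundaries
    as in (1.6).  Illustrative assumptions: constant D, alpha_i(u) = 1,
    beta_i(u) = u, g_j = tanh, diagonal delayed-state noise, spatially
    uniform Brownian increments."""
    rng = np.random.default_rng(seed)
    n = a.shape[0]
    dx = L / (M - 1)
    steps, lag = int(T / dt), int(tau / dt)
    y = np.zeros((steps + lag + 1, n, M))
    y[: lag + 1] = 0.5                                  # constant initial history
    for k in range(lag, steps + lag):
        u, ud = y[k], y[k - lag]                        # current and delayed states
        up = np.pad(u, ((0, 0), (1, 1)), mode="edge")   # edge replication = zero flux
        lap = (up[:, 2:] - 2.0 * u + up[:, :-2]) / dx**2
        drift = D * lap - (u - a @ np.tanh(u) - b @ np.tanh(ud))
        dw = rng.normal(0.0, np.sqrt(dt), (n, 1))       # one increment per neuron
        y[k + 1] = u + drift * dt + sigma * ud * dw
    return y[lag:]
```

Note the usual explicit-scheme restriction D*dt/dx**2 <~ 1/2; the defaults above respect it for moderate D.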
The remainder of this paper is organized as follows. In Section 2, the basic notations and assumptions are introduced. In Section 3, criteria are proposed to determine pth moment exponential stability for the stochastic Cohen-Grossberg neural networks with time-varying delays and reaction-diffusion terms. An illustrative example is given to show the effectiveness of the obtained results in Section 4. We conclude this paper in Section 5.

Preliminaries
For any y(x, t) = (y_1(x, t), \ldots, y_n(x, t))^T \in R^n, we define

\|y(t)\|_p = \Big( \sum_{i=1}^{n} \int_{\Omega} |y_i(x, t)|^p \, dx \Big)^{1/p}. \tag{2.1}
As usual, we will also assume that the following conditions are satisfied.
(H1) There exist positive constants \underline{\alpha}_i, \overline{\alpha}_i such that

\underline{\alpha}_i \leq \alpha_i(u) \leq \overline{\alpha}_i, \quad u \in R. \tag{2.2}

(H2) For each i \in \{1, \ldots, n\}, there exists a positive constant G_i such that

|g_i(u) - g_i(v)| \leq G_i |u - v|, \quad u, v \in R. \tag{2.3}
(H3) There exist positive functions \gamma_j(t) such that

y_j(x, t) \beta_j(y_j(x, t)) \geq \gamma_j(t) y_j^2(x, t). \tag{2.4}

(H3') There exist positive constants \gamma_j such that

y_j(x, t) \beta_j(y_j(x, t)) \geq \gamma_j y_j^2(x, t). \tag{2.5}
(H4) There are nonnegative functions c_{ij}^0(t), c_{ij}^1(t) such that, for all t and u, v \in R,

\sigma_{ij}^2(t, u, v) \leq c_{ij}^0(t) u^2 + c_{ij}^1(t) v^2. \tag{2.6}

(H4') There are nonnegative constants c_{ij}^0, c_{ij}^1 such that, for all t and u, v \in R,

\sigma_{ij}^2(t, u, v) \leq c_{ij}^0 u^2 + c_{ij}^1 v^2. \tag{2.7}

(H5) The following inequality holds:

(2.8)

Definition 2.1. The trivial solution of (1.4) is said to be pth moment exponentially stable if there is a pair of positive constants \lambda and G such that

E\|y(\varphi, t)\|^p < G E\|\varphi\|^p e^{-\lambda (t - t_0)}, \quad t \geq t_0, \tag{2.9}

for any \varphi, where \lambda is also called the convergence rate. When p = 2, the system is usually said to be exponentially stable in mean square.
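Definition 2.1 can be illustrated numerically on a simple test problem. For the scalar linear SDE dx = -lam*x dt + sigma*x dw (an illustrative example, not a system from the paper), a standard computation gives E|x(t)|^p = |x(0)|^p * exp(p*(-lam + (p-1)*sigma**2/2)*t), so the pth moment decays exponentially whenever lam > (p-1)*sigma**2/2. The sketch below estimates that decay by Monte Carlo.

```python
import numpy as np

def pth_moment(p, lam=1.0, sigma=0.2, dt=1e-3, T=3.0, paths=2000, seed=0):
    """Monte Carlo estimate of E|x(t)|^p for the scalar linear test SDE
    dx = -lam*x dt + sigma*x dw, whose pth moment decays exponentially
    when lam > (p-1)*sigma**2/2.  Used here only to illustrate the decay
    required by Definition 2.1; all parameters are illustrative."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    x = np.full(paths, 1.0)                     # x(0) = 1 on every path
    moments = [np.mean(np.abs(x) ** p)]
    for _ in range(steps):
        dw = rng.normal(0.0, np.sqrt(dt), paths)
        x = x + (-lam * x) * dt + sigma * x * dw
        moments.append(np.mean(np.abs(x) ** p))
    return np.array(moments)
```

For p = 2, lam = 1, sigma = 0.2, the predicted decay rate is 2*(1 - 0.02) = 1.96, and the estimated second moment at T = 3 is close to exp(-5.88).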

Definition 2.2. Let h : R \to R be a continuous function; the upper right Dini derivative d^+ h of h is defined as

d^+ h(t) = \limsup_{\delta \to 0^+} \frac{h(t + \delta) - h(t)}{\delta}.
The following lemmas are important in our approach.

2.13
Lemma 2.4 (generalized Halanay inequality [37]). For two positive-valued functions a(t) and b(t) defined on [t_0, \infty), assume there exists a constant 0 \leq \mu < 1 satisfying 0 < a_0 \leq a(t) and 0 < b(t) \leq \mu a(t) for all t \geq t_0; y(t) is a nonnegative continuous function on [t_0 - \tau, \infty) and satisfies the following inequality:

d^+ y(t) \leq -a(t) y(t) + b(t) \bar{y}(t), \quad t \geq t_0, \tag{2.14}

where \bar{y}(t) = \sup_{t - \tau \leq s \leq t} y(s) and \tau \geq 0 is a constant. Then one has

y(t) \leq \bar{y}(t_0) e^{-\lambda^* (t - t_0)}, \quad t \geq t_0, \tag{2.15}

where \lambda^* > 0 is defined as

\lambda^* = \inf_{t \geq t_0} \{ \lambda(t) : \lambda(t) - a(t) + b(t) e^{\lambda(t) \tau} = 0 \}. \tag{2.16}
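In the constant-coefficient case a(t) = a, b(t) = b with 0 < b < a, the rate in Lemma 2.4 reduces to the unique positive root of lambda = a - b*exp(lambda*tau), and the conclusion can be checked numerically. The sketch below finds the root by bisection and integrates the worst-case delay inequality (with equality) to confirm it stays under the exponential envelope; this covers only the constant-coefficient special case.

```python
import numpy as np

def halanay_rate(a, b, tau, tol=1e-10):
    """Bisection for the unique lambda > 0 with lambda = a - b*exp(lambda*tau),
    assuming 0 < b < a (constant-coefficient case of Lemma 2.4)."""
    lo, hi = 0.0, a                     # the root lies in (0, a)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid + b * np.exp(mid * tau) < a:
            lo = mid
        else:
            hi = mid
    return lo

def verify_bound(a, b, tau, dt=1e-3, T=10.0):
    """Integrate y'(t) = -a*y(t) + b*sup_{[t-tau, t]} y(s) with constant
    history y = 1, and return it with the Halanay envelope exp(-lambda* t)."""
    lam = halanay_rate(a, b, tau)
    steps, lag = int(T / dt), int(tau / dt)
    y = np.ones(steps + lag + 1)
    for k in range(lag, steps + lag):
        ybar = y[k - lag : k + 1].max()            # running sup over the delay window
        y[k + 1] = y[k] + (-a * y[k] + b * ybar) * dt
    t = np.arange(steps + 1) * dt
    return y[lag:], np.exp(-lam * t)
```

For a = 2, b = 1, tau = 0.5 the computed rate is roughly 0.63, and the integrated worst-case solution remains below the envelope.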

Main Results
Theorem 3.1. Under assumptions ... where ... a_{ij}(s) g_j(y_j(x, s)) ... \sigma_{ij}(s, y_i(x, s), y_j(x, s - \tau_j(s))) \, dx \, d\omega_j(s) ... y_j^2(x, s - \tau_j(s)) \, dx \, ds.

3.4
From the boundary condition, we get ... y_j(x, s) ... y_j^2(x, s - \tau_j(s)) ... \frac{p(p-1)}{2} |y_j(x, s - \tau_j(s))|^p.

3.9
It follows from (3.4), (3.5), (3.7), (3.8), and (3.9) that ... q_i |y_i(x, s)|^{p-1} \sigma_{ij}(s, y_i(x, s), y_j(x, s - \tau_j(s))) \, dx \, d\omega_j(s) ... 0.

3.11
Therefore, taking expectations on both sides of (3.10), the preceding result leads directly to

E V(t + \delta, y(t + \delta)) - E V(t, y(t))

3.12
By the mean value theorem for integrals, we have ... Therefore, the trivial solution of system (1.4) is pth moment exponentially stable. Furthermore, just as discussed in [39, pp. 173-180], the trivial solution of (1.4) is also almost surely exponentially stable. Theorem 3.1 also shows that the reaction-diffusion term has no influence on the stability of system (1.4).
In Theorem 3.1, if we take where

3.19
then for all \xi \in L^p_{\mathcal{F}_0}([-\tau, 0], R^n), the trivial solution of system (1.4) is pth moment exponentially stable, where p \geq 2 is a constant.
When a_{ij}(t) \equiv a_{ij} and b_{ij}(t) \equiv b_{ij}, model (1.4) reduces to the following stochastic reaction-diffusion Cohen-Grossberg neural network with time-varying delays:

dy_i(x, t) = \Big[ \sum_{k=1}^{m} \frac{\partial}{\partial x_k} \Big( D_{ik} \frac{\partial y_i(x, t)}{\partial x_k} \Big) - \alpha_i(y_i(x, t)) \Big( \beta_i(y_i(x, t)) - \sum_{j=1}^{n} a_{ij} g_j(y_j(x, t)) - \sum_{j=1}^{n} b_{ij} g_j(y_j(x, t - \tau_j(t))) \Big) \Big] dt + \sum_{j=1}^{n} \sigma_{ij}(t, y_i(x, t), y_j(x, t - \tau_j(t))) \, d\omega_j(t), \tag{3.20}

where ... then the trivial solution of system (3.20) is pth moment exponentially stable, where p \geq 2 is a constant. where

3.25
then for all \xi \in L^p_{\mathcal{F}_0}([-\tau, 0], R^n), the trivial solution of system (3.20) is pth moment exponentially stable.

Remark 3.6. When D_{ik} = 0 (i = 1, \ldots, n, k = 1, \ldots, m), system (3.20) reduces to the stochastic Cohen-Grossberg neural network (1.3), which has been studied in [24, 28]. Unfortunately, the assumed condition (A2) in [28] is not correct, and a defect appears in the main result of [28]: when p = 2k + 1, k \in Z, and x(t) < 0, one can see from the constructed Lyapunov function that the term "x^{p/2}(t)" is a blemish. The constructed Lyapunov function should be replaced with V(t, x(t)) = \sum_{i=1}^{n} q_i |x_i(t)|^p. Noticing that \partial |x_i(t)|^p / \partial x_i = p |x_i|^{p-1} \operatorname{sgn}(x_i) = p |x_i|^{p-2} x_i, we have (\partial |x_i(t)|^p / \partial x_i) \beta_i(x_i(t)) = p |x_i|^{p-2} x_i \beta_i(x_i(t)), so the assumed condition (A2) should be revised as (H3). On the other hand, there is an error concerning (1.3): in the coefficient of \lambda_1, the term "(p-1)(p-2)" should be replaced with (p-1)(p-2)/2. Therefore, the main results obtained in [28] contain errors. Obviously, Theorem 3.1 in our paper modifies and generalizes the main results in [28] greatly. Choosing p = 2, one can easily obtain a corollary which also generalizes the main results in [24].
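The correction in Remark 3.6 hinges on the identity d|x|^p/dx = p*|x|**(p-1)*sgn(x) = p*|x|**(p-2)*x, valid away from x = 0. As a quick sanity check (not part of the paper's argument), a central finite difference confirms it numerically:

```python
def dpow(x, p, h=1e-6):
    """Central-difference approximation of d|x|^p/dx, used to check the
    identity d|x|^p/dx = p*|x|**(p-2)*x from Remark 3.6 (away from x = 0)."""
    return (abs(x + h) ** p - abs(x - h) ** p) / (2 * h)
```

For example, at x = -1.3 and p = 3 both sides evaluate to -5.07, including the sign that the flawed Lyapunov function in [28] mishandles for odd p and negative states.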
Remark 3.7. When D_{ik} = 0 (i = 1, \ldots, n, k = 1, \ldots, m) and \sigma_{ij}(t, \cdot, \cdot) = 0, system (1.4) reduces to a deterministic Cohen-Grossberg neural network with time-varying delays. Choosing some special parameters and using Theorem 3.1, one can easily obtain a corollary which also generalizes some corresponding results in [4].

An Illustrative Example
In this section, a numerical example is presented to illustrate the correctness of our main result.
Example 4.1. Consider the following two-dimensional stochastic reaction-diffusion Cohen-Grossberg neural network with time-varying delays:

(4.2)

In this example, let p = 4, let D_{ik} be a positive constant, and take c_{ij}^0 = c_{ij}^1 = q_i \equiv 1. By simple computation, we get ... According to Corollary 3.2, one can get that ... delays are constants, the delay functions appearing in [29] are differentiable and their derivatives are simultaneously required to be not greater than 1, and the activation functions appearing in [22, 26] are bounded. Obviously, we have dropped these basic assumptions in this paper.
It is obvious that the results in [20-30] and the references therein are not applicable to system (4.1), even if we remove the reaction-diffusion terms from the system, since the connection matrix and delays considered in this example are time-varying. This implies that the results of this paper are essentially new. Just choosing x \equiv constant, these conclusions can be verified by the numerical simulations shown in Figure 1.
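A Monte Carlo sketch in the spirit of Example 4.1 is given below: a two-neuron network with time-varying connection weights, a constant delay, and p = 4. The specific coefficients (weights 0.1*sin(t), 0.1*cos(t), tanh activations, noise intensity 0.1, delay 0.5) are hypothetical stand-ins, since the example's full coefficient data is not reproduced here; the sketch only illustrates the fourth-moment decay that Corollary 3.2 asserts.

```python
import numpy as np

def example_moment(p=4, dt=1e-3, T=4.0, tau=0.5, paths=500, seed=0):
    """Monte Carlo estimate of E|y(t)|^p for a two-neuron delayed network
    with time-varying weights, in the spirit of Example 4.1.  All
    coefficients are illustrative stand-ins, not the paper's data."""
    rng = np.random.default_rng(seed)
    steps, lag = int(T / dt), int(tau / dt)
    hist = np.full((lag + 1, paths, 2), 0.5)        # constant initial history
    y = np.concatenate([hist, np.zeros((steps, paths, 2))])
    out = []
    for k in range(lag, steps + lag):
        t = (k - lag) * dt
        a = 0.1 * np.array([[np.sin(t), 1.0], [1.0, np.cos(t)]])  # time-varying weights
        b = 0.1 * np.eye(2)
        u, ud = y[k], y[k - lag]                    # current and delayed states
        drift = -(u - np.tanh(u) @ a.T - np.tanh(ud) @ b.T)
        dw = rng.normal(0.0, np.sqrt(dt), (paths, 2))
        y[k + 1] = u + drift * dt + 0.1 * ud * dw
        out.append(np.mean(np.sum(u**2, axis=1) ** (p / 2)))
    return np.array(out)
```

The estimated fourth moment decays by several orders of magnitude over the simulated horizon, consistent with pth moment exponential stability.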

Conclusions
In this paper, stochastic Cohen-Grossberg neural networks with time-varying delays and reaction-diffusion terms have been investigated. All features of stochastic systems, especially time-varying connection matrices and delays and reaction-diffusion effects, have been taken into account in the neural networks. Without requiring differentiability or monotonicity of the activation functions or symmetry of the connection matrices, a set of new sufficient conditions for checking pth moment exponential stability of the trivial solution of the considered system is presented by means of a Lyapunov function, stochastic analysis techniques, and the generalized Halanay inequality. The proposed results generalize and improve some earlier published results greatly. The results obtained in this paper are independent of the magnitude of the delays and of the diffusion effect, which implies that strong self-regulation is dominant in the networks. In addition, the methods used in this paper are also applicable to other neural networks, such as stochastic Hopfield neural networks with time-varying delays and reaction-diffusion terms and stochastic bidirectional associative memory (BAM) neural networks with time-varying delays and reaction-diffusion terms. If we remove the noise and reaction-diffusion terms from the system, the derived stability conditions for general deterministic neural networks can be viewed as byproducts of our results.

Remark 3.5.
Model (3.20) has been studied in [40], and the main results in [40, Theorem 1, Corollaries 1 and 2] are direct consequences of Corollary 3.4 in our paper when we choose p = 2.
the trivial solution of system 1.4 is pth moment exponentially stable, where p ≥ 2 is a constant.
we have the following result. Under assumptions (H1), (H2), (H3), (H4), (H5), if there exists a positive diagonal matrix ... , y_i(x, t), y_j(x, t - \tau_j(t)) \, d\omega_j(t), ... Under assumptions (H1), (H2), (H3), (H4), (H5), if there exists a positive diagonal matrix ... we have the following result. Under assumptions (H1), (H2), (H3), (H4), (H5), if there exists a positive diagonal matrix ... One can find that the models considered in [20, 22-29] are special cases of model (1.4). To the best of our knowledge, few authors have considered the pth moment exponential stability for stochastic reaction-diffusion neural networks with time-varying connection matrices and delays. It is assumed in [22, 23, 25, 26] that