Stochastic Dynamics of Nonautonomous Cohen-Grossberg Neural Networks

Abstract and Applied Analysis


Introduction
For decades, the study of neural networks has attracted considerable multidisciplinary research interest. Neural networks have seen a large number of successful applications in many fields, ranging from signal processing, pattern recognition, and programming problems to static image processing [1-7]. These applications rely crucially on the analysis of the dynamical behavior of the models [8-16]. Most existing theoretical studies of neural networks are concerned with deterministic differential equations.
Recently, studies have focused intensively on stochastic models [17-24]; it has been realized that synaptic transmission is a noisy process, brought on by random fluctuations in the release of neurotransmitters and other probabilistic causes, and it is of great significance to consider stochastic effects on the stability of neural networks described by stochastic functional differential equations; see [25-34]. In [17], Liao and Mao studied mean square exponential stability and instability of cellular neural networks (CNNs). In [18, 26], the authors continued this line of research and discussed almost sure exponential stability for a class of stochastic neural networks with discrete delays by using the nonnegative semimartingale convergence theorem. In [25], exponential stability of stochastic Cohen-Grossberg neural networks (CGNNs) with time-varying delays was investigated via a Razumikhin-type technique. In [19], Wan and Sun investigated mean square exponential stability of stochastic delayed Hopfield neural networks (HNNs) by using the method of variation of constants. Also with the help of the method of variation of constants, Sun and Cao in [29] investigated pth moment exponential stability of stochastic recurrent neural networks with time-varying delays.
However, to the best of our knowledge, few authors have considered pth moment exponential stability and almost sure exponential stability of stochastic nonautonomous Cohen-Grossberg neural networks. In fact, in electronic circuit applications, assuming constant connection matrices and delays is unrealistic. In this sense, time-varying connection matrices and delays are better candidates for modeling neural information processing.
Motivated by the above discussion, in this paper we consider the stochastic Cohen-Grossberg neural network (SCGNN) with time-varying connection matrices and delays described by the following nonautonomous stochastic functional differential equations:
$$dx_i(t) = -h_i(x_i(t))\Big[c_i(x_i(t)) - \sum_{j=1}^n a_{ij}(t)f_j(x_j(t)) - \sum_{j=1}^n b_{ij}(t)g_j\big(x_j(t-\tau_j(t))\big)\Big]dt + \sum_{j=1}^n \sigma_{ij}(x_j(t))\,d\omega_j(t), \quad i = 1, \dots, n, \tag{1.1}$$
or, in vector form,
$$dx(t) = -H(x(t))\big[C(x(t)) - A(t)F(x(t)) - B(t)G(x_\tau(t))\big]dt + \sigma(x(t))\,d\omega(t), \tag{1.2}$$
where x(t) = (x_1(t), ..., x_n(t))^T, H(x(t)) = diag(h_1(x_1(t)), ..., h_n(x_n(t))), A(t) = (a_ij(t))_{n×n}, B(t) = (b_ij(t))_{n×n}, F(x(t)) = (f_1(x_1(t)), ..., f_n(x_n(t)))^T, G(x_τ(t)) = (g_1(x_1(t−τ_1(t))), ..., g_n(x_n(t−τ_n(t))))^T, and σ(x(t)) = (σ_ij(x_j(t)))_{n×n}. Here x_i(t) denotes the state variable associated with the ith neuron at time t; h_i(·) represents an amplification function; c_i(·) is an appropriately behaved function; f_j(·) and g_j(·) are activation functions; A(t) and B(t) represent the strengths of the neuron interconnections within the network; τ_j(t) corresponds to the time delay required in processing, with 0 ≤ τ_j(t) ≤ τ; (σ_ij(·))_{n×n} is the diffusion coefficient matrix; and ω(t) = (ω_1(t), ..., ω_n(t))^T is an n-dimensional Brownian motion defined on a complete probability space (Ω, F, P) with a natural filtration {F_t}_{t≥0} (i.e., F_t = σ{ω(s) : 0 ≤ s ≤ t}).
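Model (1.1) can be explored numerically with a simple Euler-Maruyama scheme. The sketch below is ours, with hypothetical coefficients (the amplification h_i, the choice c_i(x) = 2x, the tanh activations, the weight matrices, the constant delay, and the diagonal noise σ_ii(x) = 0.1 x_i are all illustrative, not the paper's):

```python
import numpy as np

def simulate_scgnn(T=10.0, dt=1e-3, seed=0):
    """Euler-Maruyama sketch of system (1.1) with illustrative coefficients:
    h_i(x) = 1 + 0.5/(1+x^2) in [1, 1.5], c_i(x) = 2x, tanh activations,
    small time-varying weights, a constant bounded delay, and noise 0.1*x_i."""
    rng = np.random.default_rng(seed)
    n = 2
    steps = int(T / dt)
    x = np.array([1.0, -0.8])
    tau = 0.5                          # constant delay (bounded, as required)
    lag = int(tau / dt)
    hist = np.zeros((steps + 1, n))
    hist[0] = x
    for k in range(steps):
        t = k * dt
        A = 0.1 * np.array([[np.sin(t), 0.2], [0.1, np.cos(t)]])
        B = 0.1 * np.array([[0.1, -0.2], [0.3, 0.1]])
        xd = hist[max(k - lag, 0)]     # delayed state x(t - tau)
        h = 1.0 + 0.5 / (1.0 + x**2)   # amplification, stays in [1, 1.5]
        drift = -h * (2.0 * x - A @ np.tanh(x) - B @ np.tanh(xd))
        dw = rng.normal(0.0, np.sqrt(dt), n)
        x = x + drift * dt + 0.1 * x * dw
        hist[k + 1] = x
    return hist

traj = simulate_scgnn()
```

With these choices, (H1)-(H4) hold (e.g., lower/upper amplification bounds 1 and 1.5, α_i(t) ≡ 2, β_j = γ_j = 1), and the simulated path shrinks toward the trivial solution.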
Obviously, model (1.1) (or (1.2)) is quite general, and it includes several well-known neural network models as special cases, such as Hopfield neural networks, cellular neural networks, and bidirectional associative memory neural networks [10, 16, 27, 28]. There are at least three different types of stochastic stability describing the limiting behavior of stochastic differential equations: stability in probability, moment stability, and almost sure stability; see [35]. When designing an associative memory neural network, we should make the convergence speed as high as possible to ensure quick convergence of the network operation. Therefore, pth moment (p ≥ 2) exponential stability and almost sure exponential stability are the most useful concepts, as they imply that solutions tend to the trivial solution exponentially fast. This motivates us to study pth moment exponential stability and almost sure exponential stability for system (1.1) in this paper.
The remainder of this paper is organized as follows. In Section 2, the basic assumptions and preliminaries are introduced. In Section 3, criteria for pth moment (p ≥ 2) exponential stability and almost sure exponential stability of system (1.1) are established by using the Lyapunov function method, the Burkholder-Davis-Gundy inequality, and the Borel-Cantelli lemma. An illustrative example and its simulations are given in Section 4.

Preliminaries
Throughout this article, we let (Ω, F, {F_t}_{t≥0}, P) be a complete probability space with a filtration {F_t}_{t≥0} satisfying the usual conditions (i.e., it is right continuous and F_0 contains all P-null sets). Let C = C((−∞, 0], R^n) be the Banach space of continuous functions mapping (−∞, 0] into R^n with the topology of uniform convergence. For any x(t) = (x_1(t), ..., x_n(t))^T ∈ R^n, we define
$$\|x(t)\|_p = \sum_{i=1}^n |x_i(t)|^p, \quad 1 \le p < \infty.$$
The initial condition for system (1.1) is x(s) = φ(s), −τ ≤ s ≤ 0, with φ ∈ L^p_{F_0}([−τ, 0], R^n); here L^p_{F_0}([−τ, 0], R^n) is the family of R^n-valued stochastic processes φ(s), −τ ≤ s ≤ 0, such that φ(s) is F_0-measurable and ∫_{−τ}^0 E|φ(s)|^p ds < ∞.

For the sake of convenience, throughout this paper we assume f_j(0) = g_j(0) = σ_ij(0) = 0, which implies that system (1.1) admits the equilibrium solution x(t) ≡ 0.

If V ∈ C^{2,1}([−τ, ∞) × R^n; R), then, according to the Itô formula, we define an operator LV associated with (1.2) as
$$LV(t,x) = V_t + V_x\big\{-H(x(t))\big[C(x(t)) - A(t)F(x(t)) - B(t)G(x_\tau(t))\big]\big\} + \frac{1}{2}\,\mathrm{trace}\big[\sigma^T V_{xx}\sigma\big], \tag{2.1}$$
where V_t = ∂V(t, x)/∂t, V_x = (∂V(t, x)/∂x_1, ..., ∂V(t, x)/∂x_n), and V_xx = (∂²V(t, x)/∂x_i∂x_j)_{n×n}.

To establish the main results for the model given in (1.1), some standing assumptions are formulated as follows:

(H1) there exist positive constants h̲_i, h̄_i such that 0 < h̲_i ≤ h_i(x) ≤ h̄_i < ∞ for all x ∈ R, i = 1, 2, ..., n; (2.2)

(H2) for each i = 1, 2, ..., n, there exists a positive function α_i(t) > 0 such that x_i(t)c_i(x_i(t)) ≥ α_i(t)x_i²(t); (2.3)

(H3) there exist positive constants β_j, γ_j, j = 1, 2, ..., n, such that |f_j(u) − f_j(v)| ≤ β_j|u − v| and |g_j(u) − g_j(v)| ≤ γ_j|u − v|; (2.4)

(H4) each σ_ij(x) satisfies the Lipschitz condition, and there exist positive constants μ_i, i = 1, 2, ..., n, such that trace(σ^T(x)σ(x)) ≤ Σ_{i=1}^n μ_i x_i². (2.5)

Remark 2.1. The activation functions are typically assumed to be continuous, differentiable, and monotonically increasing, such as functions of sigmoid type. These restrictive conditions are not needed in this paper; instead, only the Lipschitz condition is imposed in assumption (H3). Note that activation functions of the type in (H3) have already been used in numerous papers; see [5, 10] and the references therein.

Remark 2.2. We remark here that the nonautonomous conditions (H2)-(H4) replace the usual autonomous conditions, which is more useful for practical purposes; please refer to [4, 13] and the references therein.

Remark 2.3. The delay functions τ_j(t) considered in this paper need only be bounded; they can be time-varying, nondifferentiable functions. This generalizes some recently published results in [4, 13, 26-29]. Different from the models considered in [4, 13, 29], in this paper we have removed the following condition:

(H0) For each j = 1, 2, ..., n, τ_j(t) is a differentiable function; namely, there exists ξ such that τ̇_j(t) ≤ ξ < 1. (2.6)

Definition 2.4 (see [35]). The trivial solution of (1.1) is said to be pth moment exponentially stable if there is a pair of positive constants λ and C such that
$$E\|x(t, t_0, x_0)\|^p \le C\|x_0\|^p e^{-\lambda(t-t_0)}, \quad t \ge t_0, \ \forall x_0 \in R^n, \tag{2.7}$$
where p ≥ 2 is a constant; when p = 2, the trivial solution is usually said to be exponentially stable in mean square.

Definition 2.5 (see [35]). The trivial solution of (1.1) is said to be almost surely exponentially stable if for almost all sample paths of the solution x(t) we have
$$\limsup_{t\to\infty} \frac{1}{t}\log\|x(t)\| < 0. \tag{2.8}$$

Lemma 2.6 ([35], Burkholder-Davis-Gundy inequality). There exists a universal constant K_p for any 0 < p < ∞ such that for every continuous local martingale M vanishing at zero and any stopping time η,
$$E\Big(\sup_{0\le s\le\eta}|M_s|^p\Big) \le K_p\,E\big(\langle M, M\rangle_\eta\big)^{p/2}, \tag{2.9}$$
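To make the operator (2.1) concrete, here is a worked special case (ours, not from the paper): take n = 1 and V(t, x) = x², so that V_t = 0, V_x = 2x, and V_xx = 2. Then (2.1) reduces to
$$LV(t,x) = -2x\,h(x)\big[c(x) - a_{11}(t)f(x) - b_{11}(t)g(x_\tau)\big] + \sigma_{11}(x)^2,$$
and assumptions (H1)-(H4) (with f(0) = g(0) = σ_{11}(0) = 0) give the bound
$$LV(t,x) \le -2\underline{h}_1\alpha_1(t)x^2 + 2\bar{h}_1\big(|a_{11}(t)|\beta_1 x^2 + |b_{11}(t)|\gamma_1|x||x_\tau|\big) + \mu_1 x^2,$$
which is the scalar prototype of the estimates used in Section 3.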
where ⟨M, M⟩_η is the cross-variation of M. In particular, one may take K_p = (32/p)^{p/2} if 0 < p < 2 and K_2 = 4 if p = 2; these constants may not be optimal (for example, one could take K_1 = 4√2).

Lemma 2.7 ([35], Chebyshev's inequality). If c > 0, p > 0, and X ∈ L^p, then
$$P\{\omega : |X(\omega)| \ge c\} \le c^{-p}E|X|^p. \tag{2.10}$$
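Both lemmas are easy to sanity-check by Monte Carlo. The sketch below (our illustration) checks Chebyshev's inequality for a standard normal variable and the p = 2, K_2 = 4 Burkholder-Davis-Gundy bound E sup_{s≤1}|W_s|² ≤ 4·E⟨W, W⟩_1 = 4 for Brownian motion on [0, 1]:

```python
import numpy as np

rng = np.random.default_rng(1)

# Chebyshev (Lemma 2.7): P(|X| >= c) <= c^{-p} E|X|^p, with X ~ N(0,1), c = 2, p = 2.
x = rng.normal(0.0, 1.0, 200_000)
c, p = 2.0, 2.0
empirical = np.mean(np.abs(x) >= c)            # roughly 0.046 for N(0,1)
cheb_bound = np.mean(np.abs(x) ** p) / c ** p  # roughly 0.25

# BDG (Lemma 2.6) with p = 2, K_2 = 4: simulate m Brownian paths on [0,1]
# and compare E sup_{s<=1} |W_s|^2 with the bound 4 * E<W,W>_1 = 4.
m, steps = 2000, 1000
w = np.cumsum(rng.normal(0.0, np.sqrt(1.0 / steps), (m, steps)), axis=1)
bdg_lhs = np.mean(np.max(np.abs(w), axis=1) ** 2)
```

Here the sample means stand in for the expectations; both inequalities hold with a comfortable margin.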
Lemma 2.8 ([36], Borel-Cantelli lemma). Let {A_n, n ≥ 1} be a sequence of events in some probability space. Then:

(i) if Σ_{n=1}^∞ P(A_n) < ∞, then P(A_n, i.o.) = 0;

(ii) moreover, if {A_n, n ≥ 1} are independent of each other, then Σ_{n=1}^∞ P(A_n) = ∞ implies P(A_n, i.o.) = 1, (2.11)

where {A_n, i.o.} denotes the event that the A_n occur infinitely often within {A_n, n ≥ 1}, that is, {A_n, i.o.} = ∩_{k=1}^∞ ∪_{n=k}^∞ A_n; "i.o." is the abbreviation of "infinitely often".
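Part (i) of the lemma can be illustrated numerically (our sketch): with independent U_n ~ Uniform(0, 1) and A_n = {U_n < 1/n²}, the series Σ P(A_n) = π²/6 converges, so along almost every sample path only finitely many A_n occur:

```python
import numpy as np

rng = np.random.default_rng(2)
n_events = 100_000
u = rng.uniform(size=n_events)
n = np.arange(1, n_events + 1)
occurred = np.nonzero(u < 1.0 / n**2)[0]  # indices of events that occur
# A_1 always occurs (P(A_1) = 1), and the expected total count is only
# sum 1/n^2 ~ 1.64, so the list of occurrences stays very short.
```

Despite drawing 100,000 events, only a handful occur, all with small index, exactly as the lemma predicts.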

Main Results
Theorem 3.1. Under assumptions (H1)-(H4), if there are a positive diagonal matrix M = diag(m_1, ..., m_n) and two constants N_2 > 0 and 0 ≤ μ < 1 such that
$$0 < N_2 \le N_2(t) \le \mu N_1(t), \quad \text{for } t \ge t_0, \tag{3.1}$$
then the trivial solution of system (1.1) is pth moment exponentially stable, where p ≥ 2 is a constant. When p = 2, the trivial solution of system (1.1) is exponentially stable in mean square.
Proof. Consider an appropriate Lyapunov function. As p ≥ 2 is a positive constant, we can use the following elementary inequalities: if a and b are nonnegative real numbers, then
$$pa^{p-1}b \le (p-1)a^p + b^p, \qquad a^{p-2}b^2 \le \frac{p-2}{p}a^p + \frac{2}{p}b^p.$$
Using these inequalities, the operator associated with system (1.1) can be bounded as in (3.5).
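The two elementary inequalities invoked in the proof are instances of Young's inequality (with exponent pairs (p/(p−1), p) and (p/(p−2), p/2), respectively) and can be checked numerically; this sketch is ours:

```python
import numpy as np

# Check p*a^(p-1)*b <= (p-1)*a^p + b^p and
# a^(p-2)*b^2 <= ((p-2)/p)*a^p + (2/p)*b^p
# over random nonnegative a, b and several exponents p >= 2.
rng = np.random.default_rng(3)
a = rng.uniform(0.0, 10.0, 10_000)
b = rng.uniform(0.0, 10.0, 10_000)
ok = all(
    np.all(p * a**(p - 1) * b <= (p - 1) * a**p + b**p + 1e-9)
    and np.all(a**(p - 2) * b**2 <= ((p - 2) / p) * a**p + (2 / p) * b**p + 1e-9)
    for p in (2.0, 3.0, 4.5)
)
```

At p = 2 the second inequality degenerates to b² ≤ b², so equality is attained; the small tolerance absorbs floating-point rounding.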
The remaining part of the proof is similar to that of Theorem 3.3 in [33]; we omit it.
In Theorem 3.1, if we let M be the identity matrix, we can easily obtain the following corollary.

Corollary 3.2. Under assumptions (H1)-(H4), if there are two constants N_2 > 0 and 0 ≤ μ < 1 such that
$$0 < N_2 \le N_2(t) \le \mu N_1(t), \quad \text{for } t \ge t_0, \tag{3.7}$$
then the trivial solution of system (1.1) is pth moment exponentially stable.

Theorem 3.4. Suppose that system (1.1) satisfies assumptions (H1)-(H4) and inequality (3.1), and that a_ij(t), b_ij(t), and α_i(t) are bounded. Then the trivial solution of (1.1) is also almost surely exponentially stable.

Proof. Calculating the integral of (3.9) from N to t, we obtain (3.12).
From Theorem 3.1, there exists a pair of positive constants λ and δ such that
$$E\|x(t)\|^p \le \delta\|x_0\|^p e^{-\lambda(t-t_0)}, \quad t \ge t_0. \tag{3.13}$$
Furthermore, from (H1)-(H4) and inequality (3.13), we obtain the estimate (3.15).
Since R^n is finite dimensional, the two norms ‖·‖_2 and ‖·‖_{p−2} (1 < p < ∞) on it are equivalent: there exist two positive constants ζ_1, ζ_2 such that, for all x ∈ R^n,
$$\zeta_1\|x\|_2 \le \|x\|_{p-2} \le \zeta_2\|x\|_2. \tag{3.16}$$
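The equivalence-of-norms step can be made fully explicit for the standard vector q-norm with q ≥ 2 (q playing the role of p − 2): one admissible pair of constants is ζ_1 = n^{1/q − 1/2} and ζ_2 = 1. A quick numerical check (our sketch):

```python
import numpy as np

rng = np.random.default_rng(4)
n, q = 5, 3.0                                # e.g. p = 5, so q = p - 2 = 3
zeta1, zeta2 = n**(1.0 / q - 0.5), 1.0       # valid for any q >= 2
ok = True
for _ in range(1000):
    x = rng.normal(size=n)
    l2, lq = np.linalg.norm(x, 2), np.linalg.norm(x, q)
    ok = ok and (zeta1 * l2 <= lq + 1e-12) and (lq <= zeta2 * l2 + 1e-12)
```

The upper bound ‖x‖_q ≤ ‖x‖_2 holds because q-norms decrease in q; the lower bound follows from Hölder's inequality.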
Since ∫ σ_ij(x_j(s)) dω_j(s) is a continuous local martingale, it follows from Lemma 2.6, (H4), and (3.16) that the estimate (3.17) holds. According to (3.11), (3.13), (3.14), and (3.17), an application of Lemma 2.8 shows that (3.21) holds for all but finitely many N. Hence, there exists an N_0 = N_0(ω), for all ω ∈ Ω excluding a P-null set, such that (3.21) holds whenever N ≥ N_0. Consequently, for almost all ω ∈ Ω, lim sup_{t→∞} (1/t) log‖x(t)‖ < 0; therefore, the trivial solution of (1.1) is almost surely exponentially stable.
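Definition 2.5 can be seen at work on the scalar equation dx = −a x dt + s x dω, whose sample Lyapunov exponent is exactly −a − s²/2 (apply the Itô formula to log|x|). The sketch below (our illustration, not the paper's system) uses the exact solution to check that (1/t) log|x(t)| settles near that negative value:

```python
import numpy as np

rng = np.random.default_rng(5)
a, s, dt, T = 1.0, 0.5, 1e-3, 50.0
steps = int(T / dt)
# Exact solution: x(t) = x(0) * exp((-a - s^2/2) t + s w(t)); take x(0) = 1.
w = np.cumsum(rng.normal(0.0, np.sqrt(dt), steps))  # discretized Brownian path
t = dt * np.arange(1, steps + 1)
log_x = (-a - s**2 / 2.0) * t + s * w
rate = log_x[-1] / T        # estimate of (1/t) log|x(t)| at t = T
```

Since w(T)/T → 0 almost surely, the estimate converges to −a − s²/2 = −1.125 < 0 as T grows, exhibiting almost sure exponential stability in the sense of Definition 2.5.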
Remark 3.5. Compared with [26, 32], our method does not resort to the semimartingale convergence theorem. Since system (1.1) does not require the delays to be constant and, furthermore, the model is nonautonomous, it is clear that the results obtained in [19, 25-32, 34] are not applicable to system (1.1). This implies that the results of this paper are essentially new and complement some corresponding ones already known.
Remark 3.6. By Theorems 3.1 and 3.4, the stability of system (1.1) depends on the magnitude of the noise; therefore, stochastic noise fluctuation is one of the most important aspects in designing a stable network and should be considered adequately.
It should be noted that the boundedness assumptions on a_ij(t), b_ij(t), and α_i(t) in Theorem 3.4 are not necessary; we use them only to simplify the proof. In fact, in view of (3.15), (3.20), (3.21), and (3.22), and arguing similarly to the proof of Theorem 3.4, we have the following theorem.

Theorem 3.7. Suppose that system (1.1) satisfies assumptions (H1)-(H4) and that inequality (3.1) holds. If there exist positive constants ρ_ij, ρ̄_ij, ϑ_ij such that the corresponding growth conditions hold for any t, then the trivial solution of (1.1) is almost surely exponentially stable.
Remark 3.8. Furthermore, the derived stability conditions for the following stochastic delayed recurrent neural networks can be viewed as byproducts of our results. The significance of this paper is that it offers a wider selection of network parameters in order to achieve the necessary convergence in practice.
Remark 3.9. For system (1.1), when h_i(x_i(t)) = 1, c_i(x_i(t)) = c_i x_i(t) (c_i > 0), a_ij(t) = a_ij, and b_ij(t) = b_ij, it reduces to the following stochastic delayed recurrent neural network with time-varying delays:
$$dx_i(t) = \Big[-c_i x_i(t) + \sum_{j=1}^n a_{ij}f_j(x_j(t)) + \sum_{j=1}^n b_{ij}g_j\big(x_j(t-\tau_j(t))\big)\Big]dt + \sum_{j=1}^n \sigma_{ij}(x_j(t))\,d\omega_j(t). \tag{3.26}$$
Using Theorems 3.1 and 3.4, one can easily obtain a set of similar corollaries for checking the pth moment exponential stability and almost sure exponential stability of the trivial solution of this system.

An Illustrative Example
In this section, an example is presented to demonstrate the correctness and effectiveness of the main results. In the example, let p = 3; by simple computation, we can find that [29, Theorem 1] is not satisfied; therefore, that result fails to determine whether system (4.1) is pth moment exponentially stable, even when the delay functions are differentiable and their derivatives are simultaneously required to be not greater than 1. It is obvious that the results in [19, 25-32, 34] and the references therein are not applicable to system (4.1).
Here x(t) = (x_1(t), x_2(t))^T and τ_i(t) is any bounded positive function for i = 1, 2. Each σ_ij(x) satisfies the Lipschitz condition, and there exist positive constants μ_1 = μ_2 = 2 such that trace(σ^T(x)σ(x)) ≤ 2‖x‖_2². Choosing μ = 8/9, one can easily verify that
$$0 < N_2 \le N_2(t) \le \mu N_1(t), \quad \text{for } t \ge 0. \tag{4.4}$$
Thus, it follows from Theorem 3.7 that system (4.1) is third moment exponentially stable and also almost surely exponentially stable. These conclusions can be verified by the numerical simulations in Figures 1, 2, 3, and 4.
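A moment-decay experiment in the same spirit can be run in a few lines. The parameters below are hypothetical stand-ins (not the coefficients of system (4.1), which are not reproduced here); the Monte Carlo estimate of the third moment E‖x(t)‖³ = E Σ_i |x_i(t)|³ should collapse toward zero:

```python
import numpy as np

rng = np.random.default_rng(6)
n_paths, dt, T = 400, 1e-3, 4.0
steps = int(T / dt)
lag = int(0.3 / dt)                           # bounded delay tau = 0.3
A = np.array([[0.2, 0.1], [0.1, 0.2]])        # hypothetical weights
B = np.array([[0.1, -0.1], [0.2, 0.1]])
x = np.tile([1.0, -1.0], (n_paths, 1))        # 400 independent paths
hist = [x.copy()]
for k in range(steps):
    xd = hist[max(len(hist) - 1 - lag, 0)]    # delayed state
    drift = -(3.0 * x - np.tanh(x) @ A.T - np.tanh(xd) @ B.T)
    x = x + drift * dt + 0.2 * x * rng.normal(0.0, np.sqrt(dt), x.shape)
    hist.append(x.copy())

p_moment = lambda y: np.mean(np.sum(np.abs(y) ** 3, axis=1))  # E||x||^3
```

The strong decay c_i(x) = 3x dominates the small interconnection and noise terms, so the estimated third moment falls by several orders of magnitude over [0, T].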