Almost Sure Stability of Stochastic Neural Networks with Time Delays in the Leakage Terms

1Department of Mathematics and Computer Science, Tongling University, Tongling 244000, China 2School of Mathematical Sciences and Institute of Finance and Statistics, Nanjing Normal University, Nanjing 210023, China 3Department of Mathematics, University of Bielefeld, 33615 Bielefeld, Germany 4School of Mathematics and Information Technology, Nanjing Xiaozhuang University, Nanjing, Jiangsu 211171, China


Introduction
During the past decades, a great deal of attention has been paid to the dynamical behaviors of neural networks, such as stability, periodic and almost periodic oscillatory behavior, chaos, and bifurcation. In particular, the stability of neural networks is one of the most studied topics, since many important applications depend heavily on the stability of the equilibrium point. Consequently, a large number of works have appeared on the stability of the equilibrium point of various neural networks, such as Hopfield neural networks, cellular neural networks, recurrent neural networks, Cohen-Grossberg neural networks, and bidirectional associative memory (BAM) neural networks [1][2][3][4][5][6][7][8][9].
As is well known, time delay is one of the most significant phenomena occurring in many different fields, such as biology, chemistry, economics, and communication networks. Moreover, it is inevitably encountered in both neural processing and signal transmission due to the limited bandwidth of neurons and amplifiers. However, the existence of time delays may cause oscillation, divergence, chaos, instability, or other poor performance in neural networks, which is usually harmful to their applications. Therefore, the stability analysis of neural networks with time delays has attracted much attention in the literature. The existing works on the stability of neural networks with time delays can be roughly classified into four categories: constant delays, time-varying delays, distributed delays, and mixed time delays.
It should be mentioned that a new class of delays, called leakage delays (also named time delays in the "forgetting" or leakage terms), was initially introduced by Gopalsamy [10] in the study of neural networks. In [10], Gopalsamy pointed out that leakage delays often tend to destabilize neural networks and are very difficult to handle. Hence, investigating the stability of neural networks with leakage delays has been an interesting and challenging topic, and many interesting results on this subject have been reported in the literature [11][12][13][14][15][16][17]. For example, Liu [11] investigated the existence of a unique equilibrium and global exponential stability for a class of BAM neural networks with time-varying delays in the leakage terms by using the fixed point theorem and Lyapunov functional theory. By using a Lyapunov-Krasovskii functional with triple integral terms and a model transformation technique, Zhu et al. [12] obtained some novel delay-dependent sufficient conditions ensuring the global exponential stability in the mean square of impulsive bidirectional associative memory (BAM) neural networks with both Markovian jump parameters and leakage delays. In [13], Wang et al. discussed the stability of recurrent neural networks with time delays in the leakage terms under impulsive perturbations. By applying a new stability lemma, Itô's formula, a Lyapunov-Krasovskii functional, stochastic analysis theory, and matrix inequality techniques, Xie et al. [15] studied the exponential stability in the mean square for a class of stochastic neural networks with leakage delays and expectations in the coefficients.
On the other hand, noise disturbance is a major source of instability and poor performance in neural networks. Many real nervous systems are affected by external perturbations which in many cases are of great uncertainty and hence may be treated as random. Just as Haykin pointed out, synaptic transmission can be regarded as a noisy process introduced by random fluctuations from the release of neurotransmitters and other probabilistic causes. Therefore, the effect of noise disturbances should be considered when studying the stability of neural networks. Generally speaking, neural networks with noise disturbances are called stochastic neural networks. Recently, a large number of results have appeared on the stability of stochastic neural networks (see, e.g., [3-5, 7, 9, 13, 14]). Unfortunately, the criteria presented in [3-5, 7, 9, 13, 14] require the strict condition that the derivative of the considered Lyapunov-Krasovskii functional be negative; that is, LV(t, x(t)) < 0 for any x(t) ≠ 0, where L is a weak infinitesimal operator and V(t, x(t)) is a positive Lyapunov-Krasovskii functional. However, LV(t, x(t)) may fail to be negative in many real cases, in which case the criteria obtained in [3-5, 7, 9, 13, 14] do not apply.
Motivated by the above discussion, in this paper we study the stability problem for a class of stochastic neural networks with time delays in the leakage terms. Differently from the previous literature, we aim to remove the restriction that LV(t, x(t)) < 0 for any x(t) ≠ 0. By using the LaSalle invariant principle of stochastic delay differential equations, Itô's formula, and stochastic analysis theory, some novel sufficient conditions are derived to guarantee the almost sure stability of the equilibrium point. Moreover, two numerical examples and their simulations are provided to show the effectiveness of the theoretical results and to demonstrate that time delays in the leakage terms do contribute to the stability of stochastic neural networks.
The remainder of this paper is organized as follows. In Section 2, we introduce the model of a class of stochastic neural networks with time delays in the leakage terms and present the definition of almost sure stability as well as some necessary assumptions. By means of the LaSalle invariant principle of stochastic delay differential equations, Itô's formula, and stochastic analysis theory, our main results are established in Section 3. In Section 4, two numerical examples are given to show the effectiveness of the obtained results. Finally, in Section 5, the paper is concluded with some general remarks.
Notation 1. The notations used in this paper are quite standard. R^n and R^{n×m} denote the n-dimensional Euclidean space and the set of all n × m real matrices, respectively. The superscript "T" denotes the transpose of a matrix or vector, and the symbol "⋆" denotes the symmetric term of a matrix. Trace(⋅) denotes the trace of the corresponding matrix, and I denotes the identity matrix with compatible dimensions. For any matrix A, λ_max(A) (resp., λ_min(A)) denotes the largest (resp., smallest) eigenvalue of A. For square matrices M_1 and M_2, the notation M_1 > (≥, <, ≤) M_2 means that M_1 − M_2 is a positive definite (positive semidefinite, negative definite, negative semidefinite) matrix. Let W(t) = (W_1(t), . . . , W_m(t))^T be an m-dimensional Brownian motion defined on a complete probability space (Ω, F, P) with a natural filtration {F_t}_{t≥0}. Also
is a Borel measurable function, δ > 0 denotes the leakage delay, and τ_1 and τ_2 are constant delays.
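The eigenvalue and matrix-ordering notation above can be checked numerically; a minimal NumPy sketch (the matrices M1 and M2 below are illustrative, not from the paper):

```python
import numpy as np

def lambda_extremes(M):
    """Return (lambda_min, lambda_max) of a symmetric matrix M."""
    eigs = np.linalg.eigvalsh(M)  # eigenvalues in ascending order
    return eigs[0], eigs[-1]

def is_positive_definite(M):
    """Check M > 0 in the sense above: all eigenvalues strictly positive."""
    return bool(np.linalg.eigvalsh(M)[0] > 0)

M1 = np.array([[3.0, 1.0], [1.0, 2.0]])
M2 = np.eye(2)

lo, hi = lambda_extremes(M1)
# M1 > M2 in the partial order above means M1 - M2 is positive definite
print(lo, hi, is_positive_definite(M1 - M2))
```

For M1 above, the extreme eigenvalues are (5 ± √5)/2, and M1 − M2 has eigenvalues (3 ± √5)/2 > 0, so M1 > M2 holds in this ordering.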
Throughout this paper, the following assumptions are assumed to hold.
Now we give the concept of almost sure stability for system (1).
Definition 1. The equilibrium point of (1) is said to be almost surely stable if, for every initial condition ξ ∈ C^2_{F_0}([−τ, 0]; R^n), lim_{t→∞} x(t; ξ) = 0 a.s., where "a.s." denotes "almost surely."

The following lemma is needed to prove our main results.

Main Results and Proofs
In this section, the almost sure stability of the equilibrium point for system (1) is investigated under Assumptions H1-H3.
Theorem 3. Under Assumptions H1-H3, the equilibrium point of (1) is almost surely stable if there exist a positive scalar ε, positive diagonal matrices D_1, D_2, and D_3, and positive definite matrices P, Q, R, S, and W such that the following linear matrix inequalities (LMIs) hold: where

Proof. Fixing ξ ∈ C^2_{F_0}([−τ, 0]; R^n) arbitrarily and writing x(t; ξ) = x(t), we first define the infinitesimal generator L of the Markov process acting on V(t, x(t)) as follows: where C^{2,1}(R_+ × R^n; R_+) denotes the family of all nonnegative functions V(t, x) on R_+ × R^n that are continuously twice differentiable in x and once differentiable in t.
Under Assumptions H1, H2′, and H3′, we have the following result.

Theorem 5. Under Assumptions H1, H2′, and H3′, the equilibrium point of (24) is almost surely stable if there exist a positive scalar ε, positive diagonal matrices D_1, D_2, and D_3, and positive definite matrices P, Q, R, S, and W such that the following linear matrix inequalities (LMIs) hold: where

Proof. Consider the following Lyapunov-Krasovskii functional: Similar to the proof of Theorem 3, the desired result follows by a direct computation. The proof of Theorem 5 is completed.
Remark 6. Theorems 3 and 5 present novel sufficient conditions ascertaining the almost sure stability of the equilibrium point for a class of stochastic neural networks with or without time delays in the leakage terms, obtained by constructing different Lyapunov-Krasovskii functionals. These conditions are easy to verify and can be applied in practice, since they can be checked by recently developed algorithms for solving LMIs. It is worth pointing out that Theorem 3 depends on all the delay constants δ, τ_1, and τ_2, whereas Theorem 5 depends only on the delay constants τ_1 and τ_2. Therefore, Theorem 3 is less conservative than Theorem 5.
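As a hedged illustration of how such LMI-based conditions can be checked numerically (the matrices A, P, and Q below are illustrative placeholders, not the paper's actual LMIs), a Lyapunov-type inequality A^T P + P A + Q < 0 together with P > 0, Q > 0 can be verified by eigenvalue tests once a solver returns candidate matrices:

```python
import numpy as np

def is_pd(M):
    """Positive definiteness via eigenvalues of the symmetrized matrix."""
    return bool(np.linalg.eigvalsh((M + M.T) / 2)[0] > 0)

# Hypothetical data: a stable system matrix and candidate solver output.
A = np.array([[-2.0, 0.5], [0.3, -1.5]])
P = np.eye(2)
Q = 0.5 * np.eye(2)

# Feasibility check of the illustrative LMI: P > 0, Q > 0, and
# A^T P + P A + Q negative definite.
lmi = A.T @ P + P @ A + Q
feasible = is_pd(P) and is_pd(Q) and is_pd(-lmi)
print(feasible)
```

In practice the LMIs of Theorems 3 and 5 would be solved with a dedicated LMI solver (the paper uses the Matlab LMI toolbox); the eigenvalue check above only validates a returned candidate solution.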

Illustrative Examples
In this section, two numerical examples are given to illustrate the effectiveness of the obtained results.
Example 1. Consider a two-dimensional stochastic neural network with time delays in the leakage terms: where x(t) = (x_1(t), x_2(t))^T and W(t) is a two-dimensional Brownian motion. Let Take τ_1 = 0.8 and τ_2 = 0.9. Then system (31) satisfies Assumption H1 with (34) By using the Matlab LMI toolbox, we obtain the following feasible solution for LMIs (7): Therefore, it follows from Theorem 3 that network (31) is almost surely stable. By using the Euler-Maruyama numerical scheme, the simulation results are obtained with 160 iterations and step size 0.02.
Therefore, it follows from Theorem 5 that network (36) is almost surely stable.
By using the Euler-Maruyama numerical scheme, the simulation results are obtained with 160 iterations and step size 0.02. Figure 2 shows the state response of network (36) with the initial condition [−0.6, 0.8]^T for −1.2 ≤ t ≤ 0.

Remark 8. Examples 1 and 2 show that two-dimensional stochastic neural networks with and without time delays in the leakage terms, respectively, are both almost surely stable. However, we know from Figures 1 and 2 that the stochastic neural network with time delays in the leakage terms converges noticeably faster than the one without. This fact reveals that time delays in the leakage terms do contribute to the stability of stochastic neural networks.
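The Euler-Maruyama scheme used in both examples can be sketched as follows. The system below is an illustrative two-dimensional stochastic system with a leakage delay; its coefficient matrices, the delay value δ = 0.1, and the constant initial history are assumptions for the sketch, not the paper's networks (31) or (36). The step size 0.02 and the 160 steps match the simulation settings reported above.

```python
import numpy as np

rng = np.random.default_rng(0)

dt = 0.02                     # step size used in the paper's simulations
N = 160                       # number of steps
delta = 0.1                   # illustrative leakage delay
d = int(round(delta / dt))    # leakage delay measured in steps

A = np.array([[1.5, 0.0], [0.0, 1.2]])   # illustrative decay matrix
B = np.array([[0.2, -0.1], [0.1, 0.2]])  # illustrative noise intensity

x = np.zeros((N + 1, 2))
x[0] = [-0.6, 0.8]            # initial state; history taken constant

def hist(k):
    """Delayed state, using the constant initial history for t < 0."""
    return x[0] if k < 0 else x[k]

for k in range(N):
    dW = rng.normal(0.0, np.sqrt(dt), size=2)  # Brownian increment
    drift = -A @ hist(k - d)                   # leakage term x(t - delta)
    diffusion = B @ x[k]                       # multiplicative noise
    x[k + 1] = x[k] + drift * dt + diffusion * dW

print(np.linalg.norm(x[-1]) < np.linalg.norm(x[0]))
```

With a stable decay matrix and small multiplicative noise, the trajectory contracts toward the origin, mirroring the qualitative behavior in Figures 1 and 2.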

Concluding Remarks
In this paper, we have investigated the almost sure stability problem for a class of stochastic neural networks with time delays in the leakage terms. Some novel delay-dependent conditions are obtained to ensure that the considered system is almost surely stable, a notion quite different from moment stability. Our method is mainly based on the LaSalle invariant principle of stochastic delay differential equations, Itô's formula, and stochastic analysis theory. Moreover, the stability criteria given in this paper are expressed in terms of LMIs, which can be solved easily by recently developed algorithms. In addition, two examples show that time delays in the leakage terms do contribute to the stability of stochastic neural networks. Finally, we point out that it is possible to generalize our results to more complex stochastic neural networks with time delays in the leakage terms (e.g., by considering fractional-order effects [20]). Research on this topic is in progress.