Exponential stability in mean square of stochastic delay recurrent neural networks is investigated in detail. By using Itô’s formula and inequality techniques, sufficient conditions guaranteeing the exponential stability in mean square of an equilibrium are given. Under the conditions that guarantee the stability of the analytical solution, the Euler–Maruyama (EM) scheme and the split-step backward Euler (SSBE) scheme are proved to be mean-square stable. Finally, an example is given to demonstrate our results.

It is well known that neural networks have a wide range of applications in many fields, such as signal processing, pattern recognition, associative memory, and optimization problems. Stability is one of the main properties of neural networks and a precondition for their design and application. Time delays are unavoidable in neural network systems and are frequently an important source of poor performance or instability. Thus, the stability analysis of neural networks with various delays has been extensively investigated; see [

In real nervous systems, synaptic transmission is a noisy process brought on by random fluctuations in the release of neurotransmitters and other probabilistic causes [

The remainder of the paper comprises four sections. Some notation and the stability conditions for the analytical solution are given in Section

Throughout the paper, unless otherwise specified, we will employ the following notations. Let

Consider the stochastic delay recurrent neural networks of the form

Model (

To obtain our results, we impose the following standing hypotheses.

Both

It follows from [

The trivial solution of system (

Using Itô’s formula and the nonnegative semimartingale convergence theorem, [

If (

For

Then (

By (H4), there exists a sufficiently small positive constant

Set

Notice that

Therefore, we have

Notice that

If (

For

Then (

The

Let

Suppose that the following condition is satisfied:

A numerical method is said to be mean-square stable (MS stable) if there exists an
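To make the definition concrete, the quantity it involves, the mean square E|X_n|^2, can be estimated by Monte Carlo simulation over many sample paths. The sketch below is illustrative only: the scalar linear test equation dX = λX dt + μX dW, the parameter values, and the function name are our own assumptions, not the system studied in this paper.

```python
import math
import random

def em_ms_estimate(lam, mu, h, n_steps, n_paths, seed=0):
    """Monte Carlo estimate of E|X_n|^2 after n_steps Euler-Maruyama
    steps for the linear test equation dX = lam*X dt + mu*X dW, X_0 = 1.
    One EM step: X_{n+1} = X_n * (1 + lam*h + mu*dW_n), dW_n ~ N(0, h)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        x = 1.0
        for _ in range(n_steps):
            x *= 1.0 + lam * h + mu * rng.gauss(0.0, math.sqrt(h))
        total += x * x
    return total / n_paths

# For lam = -3, mu = 1, h = 0.1 the exact one-step growth factor of
# E|X_n|^2 is (1 + lam*h)**2 + mu**2 * h = 0.59 < 1, so E|X_n|^2 decays
# geometrically and the estimate falls far below |X_0|^2 = 1.
```

For MS-stable parameters the estimate decays toward zero as n grows; for unstable ones it blows up, which matches the definition above.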

Now we analyze the stability of the EM numerical solution.
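Before the analysis, the EM discretization itself can be sketched for a scalar stochastic delay equation. This is a minimal illustration under our own assumptions (one constant delay τ = m·h, constant initial history, scalar state); the equation form, coefficient names, and activation functions are hypothetical and not taken verbatim from this paper.

```python
import math
import random

def euler_maruyama_sdde(x0, h, m, n_steps, c, a, b, s1, s2, f, g, seed=0):
    """Euler-Maruyama path for the scalar stochastic delay equation
        dx = [-c*x + a*f(x) + b*g(x(t - tau))] dt
             + [s1*x + s2*x(t - tau)] dW,
    with tau = m*h and constant history x(t) = x0 on [-tau, 0]."""
    rng = random.Random(seed)
    path = [x0] * (m + 1)                  # history values plus x(0)
    for n in range(m, m + n_steps):
        x, xd = path[n], path[n - m]       # current and delayed states
        dW = rng.gauss(0.0, math.sqrt(h))  # Brownian increment ~ N(0, h)
        drift = -c * x + a * f(x) + b * g(xd)
        diffusion = s1 * x + s2 * xd
        path.append(x + h * drift + diffusion * dW)
    return path
```

The delay enters simply as a lookback of m grid points, which is why the delay is taken as an integer multiple of the stepsize.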

Under conditions (H1)–(H3) and (H5)-(H6), the Euler method applied to (

From (

Squaring both sides of the previous equality, we obtain

Noting that

Let

Thus

Then

By the recursion we conclude that

If

By conditions (H5) and (H6), we know that

For

The proof is similar to that of Theorem 7 in [

In this section, we will construct the SSBE scheme for (
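The split-step idea can be sketched on a scalar linear test equation with one constant delay: the drift is first advanced implicitly (backward Euler), then the noise is added explicitly. The equation, coefficients, and the closed-form implicit solve below are illustrative assumptions of ours, not this paper’s exact scheme; the linear drift is chosen precisely so that the implicit step is solvable without fixed-point iteration.

```python
import math
import random

def ssbe_sdde(x0, h, m, n_steps, c, b, s1, s2, seed=0):
    """Split-step backward Euler for the scalar linear test equation
        dx = [-c*x + b*x(t - tau)] dt + [s1*x + s2*x(t - tau)] dW,
    with tau = m*h and constant history x = x0 on [-tau, 0].
    Step 1 (implicit drift):  y = x_n + h*(-c*y + b*x_{n-m})
                              => y = (x_n + h*b*x_{n-m}) / (1 + h*c)
    Step 2 (explicit noise):  x_{n+1} = y + (s1*y + s2*x_{n-m})*dW_n
    """
    rng = random.Random(seed)
    path = [x0] * (m + 1)
    for n in range(m, m + n_steps):
        x, xd = path[n], path[n - m]
        y = (x + h * b * xd) / (1.0 + h * c)   # backward Euler drift step
        dW = rng.gauss(0.0, math.sqrt(h))
        path.append(y + (s1 * y + s2 * xd) * dW)
    return path
```

The implicit drift step is what relaxes the stepsize restriction relative to the explicit EM scheme, at the cost of solving an (here trivial) implicit equation each step.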

The notations are the same as the definitions in (

Assume that (H1)–(H3) and (H5)-(H6) hold. Define

Then the SSBE method applied to (

From (

Squaring both sides of (

It follows from inequality

Letting

On the other hand, from (

Noting that

Substituting (

Then

By the recursion we conclude that

Since

Thus, (

For

The proof is similar to that of Theorem 3.2 in [

In this section, we discuss an example to illustrate our theory and compare the stepsize restrictions under which the SSBE method is stable with those of the EM method.
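The kind of stepsize comparison made in this section can be reproduced on the classical scalar linear test equation dX = λX dt + μX dW (an assumption of ours, not this paper’s example system). For each method the one-step amplification factor of E|X_n|^2 is available in closed form, so the stepsize ranges giving MS stability can be compared directly.

```python
def em_ms_factor(lam, mu, h):
    """Ratio E|X_{n+1}|^2 / E|X_n|^2 for Euler-Maruyama applied to
    dX = lam*X dt + mu*X dW, i.e. X_{n+1} = X_n*(1 + lam*h + mu*dW):
    factor = (1 + lam*h)^2 + mu^2*h."""
    return (1.0 + lam * h) ** 2 + mu ** 2 * h

def ssbe_ms_factor(lam, mu, h):
    """Same ratio for split-step backward Euler on the test equation:
    y = X_n/(1 - lam*h),  X_{n+1} = y*(1 + mu*dW)
    => factor = (1 + mu^2*h) / (1 - lam*h)^2  (for lam < 0)."""
    return (1.0 + mu ** 2 * h) / (1.0 - lam * h) ** 2

# With lam = -3, mu = 1 the EM method is MS stable only for h < 5/9,
# while the SSBE factor (1 + h)/(1 + 3h)^2 stays below 1 for every h > 0,
# illustrating the weaker stepsize restriction of the implicit scheme.
```

A method is MS stable for a given h exactly when its factor is below 1, so comparing the two functions over h exhibits the gap between the two schemes.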

Let

Let

It is obvious that

Now, we can conclude that the EM method and the SSBE method to (

MS stability of the numerical solutions to (

Instability of EM numerical solutions and MS stability of SSBE numerical solutions to (

Instability of SSBE numerical solutions of system (

A stochastic neural network model can be viewed as a special kind of stochastic differential equation whose solution is hard to express explicitly. It not only has the characteristics of general stochastic differential equations but also has its own features: its stability is connected with the activation functions and the connection weight matrices. So it is necessary to discuss the stability of stochastic neural networks. In contrast to previous works on the exponential stability of stochastic neural networks, both the Lyapunov function method and two numerical methods are used here to study the stability of stochastic delay recurrent neural networks. Under the conditions that guarantee the stability of the analytical solution, the EM method and the SSBE method are proved to be MS stable provided the stepsize meets a certain limit. Other numerical methods for different types of stochastic delay neural networks can be analyzed in future work.

This work was supported by the National Natural Science Foundation of China (nos. 60904032, 61273126), the Natural Science Foundation of Guangdong Province (no. 10251064101000008), and the Fundamental Research Funds for the Central Universities (no. 2012ZM0059).