Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays

We analyze the robustness of global exponential stability of hybrid stochastic neural networks subject simultaneously to neutral terms and time-varying delays. Given globally exponentially stable hybrid stochastic neural networks, we characterize upper bounds on the contraction coefficients of the neutral terms and on the time-varying delays by means of transcendental equations. Moreover, we prove that, for any globally exponentially stable hybrid stochastic neural network, if the additive neutral terms and time-varying delays remain below the derived upper bounds, then the perturbed neural network is guaranteed to remain globally exponentially stable. Finally, a numerical simulation example is given to illustrate the presented criteria.


Introduction
In the past decades, neural networks have been extensively used in various areas. Stability is usually a prerequisite for most successful applications of neural networks. The stability of a neural network depends mainly on its parametric configuration. Moreover, it is well known that noise and time delays are common sources of instability and may destroy the stability of a neural network when they exceed certain limits.
Robustness is the classical notion of disturbance rejection: it refers to the ability of a control system to keep its performance indicators unchanged under perturbations of its characteristics or parameters. A robust control system is insensitive to such perturbations. In practical problems, perturbations of system characteristics or parameters are often unavoidable. They arise mainly from two sources: imprecise measurement, which makes characteristics or parameters deviate from their design (nominal) values, and environmental influences during system operation, which cause characteristics or parameters to drift slowly. Robustness has therefore become an important research topic in control theory. The study of robustness has been largely confined to linear time-invariant control systems, covering areas such as stability, astatism, and adaptive control. Robustness problems are closely related to the relative stability of control systems and to the invariance principle, and the establishment of the internal model principle has played an important role in the study of robustness.
The robustness of global exponential stability of various neural networks has been widely investigated in recent years. For example, in [1], the robustness of global exponential stability for hybrid neural networks with noise and delay perturbations was investigated. In [2], the robustness of global exponential stability of recurrent neural networks with time delays and random disturbances was discussed. In [3], the delay-dependent robust stability of cellular neural networks with discrete and distributed time-varying delays was considered. Shen and Wang [4] studied the robustness of global exponential stability of recurrent neural networks in the presence of time delays and random disturbances. The stability of such systems often also depends on neutral terms. In [5], Shen and Wang went on to analyze the robustness of global exponential stability of nonlinear systems with time delays and neutral terms.
In the implementation or application of neural networks, abrupt changes of network parameters are not uncommon. In [6], Liberzon notes that the parameters of a neural network (e.g., connection weights and biases) may change abruptly due to unexpected failures or designed switching. In such cases, a neural network can be represented by a switching model, that is, a set of parametric configurations that switch from one to another according to a given Markov chain. For instance, the stability of stochastic neural networks with Markovian jumping parameters was analyzed in [7]. In [8], stability analysis for discrete-time Markovian jump neural networks with mixed time delays was studied. The stability of several hybrid stochastic neural networks was analyzed in [9-13]. In [14], the global asymptotic stability of cellular neural networks with delays was investigated. In [15, 16], the authors investigated the robustness of global exponential stability of stochastic systems (with Markovian switching) in the presence of time-varying delays or noises. In [17], the robust stability of switched Hopfield neural networks with time-varying delay under uncertainty was investigated. In [18, 19], the delay-dependent robust stability of uncertain neutral-type stochastic systems with Markovian jumping was studied by means of linear matrix inequalities.
In this paper, we characterize the robustness of hybrid stochastic neural networks in the presence of neutral terms and time-varying delays. Upper bounds on the contraction coefficients of the neutral terms and on the time-varying delays are estimated by solving transcendental equations. Moreover, we prove that, for any globally exponentially stable hybrid stochastic neural network, if the additive contraction coefficients of the neutral terms and the time-varying delays are smaller than the upper bounds derived herein, then the perturbed neural network is guaranteed to remain globally exponentially stable.
The remainder of this paper is organized as follows. In Section 2, some preliminaries and necessary notations are given. Section 3 discusses stability conditions for neural networks with Markovian switching in the simultaneous presence of neutral terms and time-varying delays. In Section 4, a numerical example is given to substantiate the theoretical results. Finally, concluding remarks are made in Section 5.
Let r(t), t ≥ 0, be a right-continuous Markov chain on the probability space taking values in a finite state space S = {1, 2, . . ., N} with generator Γ = (γ_ij)_{N×N} given by

P{r(t + Δ) = j | r(t) = i} = γ_ij Δ + o(Δ) if i ≠ j, and 1 + γ_ii Δ + o(Δ) if i = j,

where Δ > 0. Here, γ_ij ≥ 0 is the transition rate from i to j if i ≠ j, while γ_ii = −∑_{j≠i} γ_ij. It is known that almost every sample path of r(t) is a right-continuous step function with a finite number of simple jumps in any finite subinterval of ℝ_+.
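The switching mechanism described above can be illustrated by a short simulation. The sketch below draws exponentially distributed sojourn times and jumps according to the embedded chain of a generator Γ; the two-state generator values used here are illustrative assumptions, not parameters from the paper.

```python
import random

def simulate_markov_chain(gamma, r0, t_end, seed=0):
    """Simulate a right-continuous Markov chain with generator gamma.

    gamma[i][j] (i != j) is the transition rate from state i to j;
    gamma[i][i] equals minus the sum of the off-diagonal rates in row i.
    Returns the sample path as a list of (jump_time, state) pairs.
    """
    rng = random.Random(seed)
    t, state = 0.0, r0
    path = [(t, state)]
    while t < t_end:
        rate = -gamma[state][state]        # total exit rate of the current state
        if rate <= 0:                      # absorbing state: no further jumps
            break
        t += rng.expovariate(rate)         # exponential sojourn time
        if t >= t_end:
            break
        # choose the next state with probability gamma[state][j] / rate
        u, acc = rng.random(), 0.0
        for j, g in enumerate(gamma[state]):
            if j == state:
                continue
            acc += g / rate
            if u <= acc:
                state = j
                break
        path.append((t, state))
    return path

# Hypothetical two-state generator (illustrative values only)
gamma = [[-1.0, 1.0],
         [2.0, -2.0]]
path = simulate_markov_chain(gamma, r0=0, t_end=10.0)
```

Each entry of `path` records a jump instant and the state entered there, so the piecewise-constant sample path is exactly the right-continuous step function described above.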
From the above definitions, it is clear that the almost sure global exponential stability of system (2) implies the pth moment global exponential stability of system (2) (see [20]), but not vice versa. However, if Assumptions 1-3 hold, we can obtain the following lemma (see [20]).

Lemma 6. Let Assumptions 1-3 hold. Then the pth moment global exponential stability of system (2) implies almost sure global exponential stability.
Meanwhile, in order to obtain our result, we also need another lemma (see [20]).
In particular, for p = 2, equality holds.
and τ is the unique positive solution of the transcendental equation: where
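The delay bound τ is obtained as the positive root of a transcendental equation, which in general has no closed-form solution and must be found numerically. Purely as an illustration of this numerical step, the sketch below applies bisection to a hypothetical equation of the exponential type that arises in such bounds, f(τ) = 2kτ e^{2βτ} − β = 0; the form of f and the constants k, β are assumptions for illustration only, not the paper's equation.

```python
import math

def bisect_root(f, lo, hi, tol=1e-12, max_iter=200):
    """Find a root of f in [lo, hi] by bisection; f(lo) and f(hi) must differ in sign."""
    flo, fhi = f(lo), f(hi)
    assert flo * fhi < 0, "root not bracketed"
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        fmid = f(mid)
        if abs(fmid) < tol or hi - lo < tol:
            return mid
        if flo * fmid < 0:      # root lies in [lo, mid]
            hi = mid
        else:                   # root lies in [mid, hi]
            lo, flo = mid, fmid
    return 0.5 * (lo + hi)

# Hypothetical transcendental equation (illustrative constants k, beta):
# f(tau) = 2*k*tau*exp(2*beta*tau) - beta = 0
k, beta = 0.3, 1.0
tau_bar = bisect_root(lambda t: 2 * k * t * math.exp(2 * beta * t) - beta, 0.0, 5.0)
```

Since f is strictly increasing on [0, ∞) with f(0) < 0, the positive root is unique, mirroring the uniqueness claim for τ above.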

Numerical Example
In this section, we give an example with a numerical simulation to illustrate the result of the preceding section.
In the presence of neutral terms and time-varying delays, the system with Markovian switching becomes the following perturbed system, where τ(t) is the time-varying delay and the contraction coefficient of the neutral term lies in (0, 1). Figure 1 depicts the Markov chain with the generator Γ.
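A simulation of this kind of perturbed system can be sketched with the Euler-Maruyama scheme. The example below simulates a two-neuron stochastic delayed network whose parameters switch between two modes via a Markov chain; all matrices, rates, and the constant delay are hypothetical placeholders (the paper's actual system matrices are not reproduced here), and the neutral term is omitted for brevity.

```python
import math
import random

def simulate_hybrid_sde(t_end=10.0, dt=1e-3, tau=0.5, seed=1):
    """Euler-Maruyama simulation of a 2-neuron stochastic delayed network
    with Markovian switching between two modes.
    All parameter values below are illustrative placeholders."""
    rng = random.Random(seed)
    # Mode-dependent self-feedback, connection weights, and noise intensities
    A = {0: [4.0, 4.0], 1: [5.0, 5.0]}
    W = {0: [[0.2, -0.1], [0.1, 0.2]], 1: [[-0.1, 0.2], [0.2, -0.1]]}
    sigma = {0: 0.1, 1: 0.2}
    q01, q10 = 1.0, 2.0            # transition rates of the switching chain

    n_delay = int(tau / dt)
    hist = [[1.0, -1.0]] * (n_delay + 1)   # constant initial history on [-tau, 0]
    mode = 0
    for _ in range(int(t_end / dt)):
        x = hist[-1]
        xd = hist[-1 - n_delay]            # delayed state x(t - tau)
        new = []
        for i in range(2):
            drift = -A[mode][i] * x[i] + sum(
                W[mode][i][j] * math.tanh(xd[j]) for j in range(2))
            dw = rng.gauss(0.0, math.sqrt(dt))     # Brownian increment
            new.append(x[i] + drift * dt + sigma[mode] * x[i] * dw)
        hist.append(new)
        # Markov switching: jump with probability approximately rate * dt
        rate = q01 if mode == 0 else q10
        if rng.random() < rate * dt:
            mode = 1 - mode
    return hist[-1]

x_final = simulate_hybrid_sde()
```

With the strongly dominant self-feedback chosen here, the trajectory decays toward the origin in both modes, which is the qualitative behavior the stability criteria guarantee when the perturbations stay below the derived bounds.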

Conclusion
In this paper, the robust stability of hybrid stochastic neural networks subject to neutral terms and time-varying delays has been analyzed. In order to maintain the global exponential stability of such networks, upper bounds on the contraction coefficients of the neutral terms and on the time-varying delays have been derived. The results are practicable and provide a theoretical basis for the design and application of neural networks. Future work may aim at improving the upper bounds on the contraction coefficients of the neutral terms and on the time-varying delays. We will also continue to study robust stability using Lyapunov theory or linear matrix inequalities to relax the stability conditions.