Stability in Switched Cohen-Grossberg Neural Networks with Mixed Time Delays and Non-Lipschitz Activation Functions

The stability of switched Cohen-Grossberg neural networks with mixed time delays and α-inverse Hölder activation functions is investigated under the switching rule with the average dwell time property. By applying the multiple Lyapunov-Krasovskii functional approach and the linear matrix inequality (LMI) technique, a delay-dependent sufficient criterion is derived in terms of LMIs which ensures that such switched neural networks are globally exponentially stable, and an exponential decay estimation for the states is developed explicitly. Two illustrative examples are given to demonstrate the validity of the theoretical results.


Introduction
In the past few decades, there has been increasing interest in different classes of neural networks, such as Hopfield, cellular, Cohen-Grossberg, and bidirectional associative neural networks, due to their potential applications in many areas such as classification, signal and image processing, parallel computing, associative memories, optimization, and cryptography [1-5]. In the design of practical neural networks, the qualitative analysis of neural network dynamics plays an important role. To solve problems of optimization, neural control, and signal processing, neural networks have to be designed in such a way that, for a given external input, they exhibit only one globally asymptotically/exponentially stable equilibrium point. Hence, much effort has been devoted to the stability of neural networks, and a number of sufficient conditions have been proposed to guarantee the global asymptotic/exponential stability of neural networks with or without delays in recent years; see, for example, [6-26] and the references therein.
Recently, by combining the theories of switched systems and neural networks, several classes of mathematical models of switched neural networks have been established. As a special class of switched systems, switched neural networks, whose individual subsystems are a set of neural networks, have found applications in the fields of high-speed signal processing, artificial intelligence, and gene selection in DNA microarray analysis [27-30]. Stability issues of switched neural networks have received great attention from researchers so far [31-39]. In [31], based on the Lyapunov-Krasovskii method and the LMI approach, some sufficient conditions were derived for the global robust exponential stability of a class of switched Hopfield neural networks with time-varying delay under uncertainty. In [32], by combining Cohen-Grossberg neural networks with an arbitrary switching rule, the mathematical model of a class of switched Cohen-Grossberg neural networks with mixed time-varying delays was established, and the robust stability of such switched Cohen-Grossberg neural networks was analyzed. In [33], by employing nonlinear measure and LMI techniques, some new sufficient conditions were obtained to ensure the global robust asymptotic stability and global robust exponential stability of the unique equilibrium for a class of switched recurrent neural networks with time-varying delay. In [34], the authors investigated a large class of switched recurrent neural networks with time-varying structured uncertainties and time-varying delay, and some delay-dependent robust periodicity criteria guaranteeing the existence, uniqueness, and global asymptotic stability of the periodic solution for all admissible parametric uncertainties were devised by employing free-weighting matrices and LMIs. In [35], a new class of switched interval neural networks with discrete and distributed time-varying delays of neutral type was developed, and a delay-dependent sufficient criterion was also obtained in terms of LMIs which guarantees the global robust exponential stability of the proposed switched interval neural networks. It should be pointed out that the results in [31-35] focused on the stability of switched neural networks under arbitrary switching rules by using the common Lyapunov function method. However, the common Lyapunov function method requires all the subsystems of the switched system to share a positive definite radially unbounded common Lyapunov function. Generally, this requirement is difficult to achieve.
In the past few years, much attention has been paid to making use of the dwell time approach to deal with the analysis and synthesis of switched neural networks [36-39]. It should be pointed out that the average dwell time method is regarded as an important and attractive way to find a suitable switching signal that guarantees switched system stability or improves other performance, and it has been widely applied to the analysis and synthesis of switched systems with or without time delay; see, for example, [40-44]. Very recently, in [36], based on the multiple Lyapunov functions method and LMI techniques, the authors presented some sufficient conditions in terms of LMIs which guarantee the robust exponential stability of uncertain switched Cohen-Grossberg neural networks with interval time-varying delay and distributed time-varying delay under the switching rule with the average dwell time property. In [37], by using the average dwell time method, delay-dependent sufficient conditions were derived for the robust exponential stability of a class of discrete-time switched Hopfield neural networks with time delay. In [38], by applying a new Lyapunov-Krasovskii functional and the average dwell time method, delay-range-dependent exponential stability criteria and a decay estimation were presented in terms of LMIs for switched Hopfield neural networks.
It should be noted that all the results reported in [31-39] are concerned with switched neural networks with Lipschitz activation functions. To the best of our knowledge, very little attention has been paid to the problem of delay-dependent stability for switched neural networks without Lipschitz activation functions, which motivates the work of this paper.
In this paper, our aim is to study the delay-dependent exponential stability problem for a class of switched Cohen-Grossberg neural networks with mixed time delays and α-inverse Hölder activation functions. Here, it should be pointed out that α-inverse Hölder activation functions are a class of non-Lipschitz functions. By applying Brouwer degree properties and the LMI technique and constructing a novel Lyapunov-Krasovskii functional, the existence, uniqueness, and global exponential stability of the equilibrium point are proved for Cohen-Grossberg neural networks with mixed time delays and α-inverse Hölder activation functions. By means of the multiple Lyapunov-Krasovskii functional and the average dwell time approach, a delay-dependent sufficient condition in terms of LMIs is presented to ensure that the considered switched neural networks are globally exponentially stable, and an explicit expression for the exponential decay estimation is also obtained for the states. Two illustrative examples are given to demonstrate the validity of the theoretical results.
The rest of this paper is organized as follows. In Section 2, the model formulation and some preliminaries are given. In Section 3, the existence, uniqueness, and global exponential stability of the equilibrium point are proved for Cohen-Grossberg neural networks with mixed time delays and α-inverse Hölder activation functions. In Section 4, the global exponential stability criterion and state decay estimation are presented for the switched Cohen-Grossberg neural networks with mixed time delays and α-inverse Hölder activation functions. In Section 5, two numerical examples are presented to demonstrate the validity of the proposed results. Some conclusions are drawn in Section 6.
Notations. Throughout this paper, R denotes the set of real numbers, R^n denotes the n-dimensional Euclidean space, and R^{m×n} denotes the set of all m × n real matrices. For any matrix A, A^T denotes the transpose of A and A^{-1} denotes the inverse of A. If A is a real symmetric matrix, A > 0 (A < 0) means that A is positive definite (negative definite).

Neural Network Model and Preliminaries
The Cohen-Grossberg neural networks with mixed time delays can be described by the following differential equation system:

ẋ(t) = α(x(t))[−β(x(t)) + W_0 g(x(t)) + W_1 g(x(t − τ)) + W_2 ∫_{t−τ}^{t} g(x(s)) ds + J],    (2.1)

where x(t) = (x_1(t), . . ., x_n(t))^T is the vector of neuron states at time t; α(x(t)) = diag(α_1(x_1(t)), . . ., α_n(x_n(t))) represents the amplification function; β(x(t)) = (β_1(x_1(t)), . . ., β_n(x_n(t)))^T is the behaved function; g(x(t)) = (g_1(x_1(t)), . . ., g_n(x_n(t)))^T is called the neuron activation function; W_i, i = 0, 1, 2, are the connection weight matrices; τ > 0 denotes the discrete and distributed time delay; J = (J_1, . . ., J_n)^T denotes the external input. The initial value associated with (2.1) is assumed to be x(s) = φ(s), where φ(s) is a continuous function on [−τ, 0].
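To make the model concrete, the following is a minimal numerical sketch of how a system of the form (2.1) can be integrated by a forward Euler scheme with a stored delay history. All of the amplification, behaved, and activation functions, the weight matrices, and the initial function below are illustrative placeholders, not parameters taken from the paper.

```python
import numpy as np

# Forward-Euler sketch of the mixed-delay Cohen-Grossberg model (2.1):
# one discrete delay tau and a distributed delay over [t - tau, t].
# All concrete values here are illustrative placeholders.

n, tau, dt, t_end = 2, 1.0, 0.001, 20.0
d = int(round(tau / dt))                          # delay measured in time steps

alpha_fun = lambda x: 1.0 + 0.5 / (1.0 + x**2)    # amplification, positive and bounded
beta_fun  = lambda x: 1.2 * x                     # behaved function
g         = lambda x: x**3                        # non-Lipschitz activation (3-inverse Hoelder at 0)

W0 = np.array([[0.10, -0.20], [0.20, 0.10]])
W1 = np.array([[-0.10, 0.10], [0.10, -0.10]])
W2 = np.array([[0.05, 0.00], [0.00, 0.05]])
J  = np.zeros(n)

steps = d + int(round(t_end / dt))                # history samples + simulation steps
x = np.zeros((steps + 1, n))
hist = np.linspace(-tau, 0.0, d + 1)
x[:d + 1, 0], x[:d + 1, 1] = np.cos(hist), np.sin(hist)   # initial function phi on [-tau, 0]

for k in range(d, steps):
    distributed = dt * g(x[k - d:k]).sum(axis=0)          # Riemann sum of g(x(s)) over [t - tau, t)
    rhs = -beta_fun(x[k]) + W0 @ g(x[k]) + W1 @ g(x[k - d]) + W2 @ distributed + J
    x[k + 1] = x[k] + dt * alpha_fun(x[k]) * rhs

print("x(t_end) =", x[-1])                        # for these placeholder data the state settles near 0
```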
In the following, some definitions and lemmas, which play important roles in the proofs of our theorems below, are introduced.

Definition 2.1. The equilibrium point x^* of the neural network (2.1) is said to be globally exponentially stable if there exist scalars η > 0, T > 0, and δ > 0 such that ‖x(t) − x^*‖ ≤ η e^{−δt} for all t ≥ T, where x(t) is the solution of the system (2.1) with the initial value x(s) = φ(s), s ∈ [−τ, 0]; δ is called the exponential convergence rate.

Definition 2.2 (see [23, 24]). A continuous and monotonically increasing function G : R → R is said to be an α-inverse Hölder function if, for any ρ ∈ R, there exist constants q_ρ > 0 and r_ρ > 0, which depend on ρ, satisfying |G(θ) − G(ρ)| ≥ q_ρ |θ − ρ|^α whenever |θ − ρ| ≤ r_ρ, where α > 0 is a constant.
The class of α-inverse Hölder functions is denoted by IH_α. There are a great number of functions which belong to IH_α, for example, G(θ) = arctan θ. When α = 1 and q_ρ is independent of ρ, 1-inverse Hölder functions are called inverse Lipschitz functions.

Definition 2.3. If there exists a constant ℓ > 0 such that |G(θ_1) − G(θ_2)| ≤ ℓ|θ_1 − θ_2| for any θ_1, θ_2 ∈ R, then G is said to be a Lipschitz-continuous function.

It is easy to see that α-inverse Hölder functions are a class of non-Lipschitz functions.
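As a small illustration of Definition 2.2 (with an illustrative function that is not necessarily one of the paper's examples), the sketch below numerically checks the inverse Hölder bound at ρ = 0 for the cubic map and shows that its ordinary difference quotient is unbounded, so the map is not globally Lipschitz.

```python
import numpy as np

# Numerical illustration: for g(theta) = theta**3, the ratio
# |g(theta) - g(0)| / |theta|**3 stays bounded away from zero near rho = 0
# (the 3-inverse Hoelder bound of Definition 2.2 holds there), while the
# ordinary difference quotient |g(theta) - g(0)| / |theta| is unbounded.

g, alpha, rho, r_rho = (lambda t: t**3), 3, 0.0, 1.0

theta = np.linspace(-r_rho, r_rho, 100001)
theta = theta[np.abs(theta - rho) > 1e-9]                      # exclude theta = rho
q_est = np.min(np.abs(g(theta) - g(rho)) / np.abs(theta - rho)**alpha)
print("estimated q_rho on |theta - rho| <= r_rho:", q_est)     # about 1.0 > 0

wide = np.linspace(-100.0, 100.0, 100001)
wide = wide[np.abs(wide - rho) > 1e-9]
print("largest difference quotient on [-100, 100]:",
      np.max(np.abs(g(wide) - g(rho)) / np.abs(wide - rho)))   # grows like theta**2
```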
Let the function F : R^n → R^n be locally Lipschitz continuous. According to Rademacher's theorem [45], F is differentiable almost everywhere. Let D_F denote the set of those points where F is differentiable; then Ḟ(x) is the Jacobian of F at x ∈ D_F, and the set D_F is dense in R^n. The generalized Jacobian of a locally Lipschitz function is defined as follows.
Definition 2.4. For any x ∈ R^n, the generalized Jacobian ∂F(x) of a locally Lipschitz continuous function F : R^n → R^n is the set of matrices defined by ∂F(x) = co{W | there exists a sequence {x_k} ⊂ D_F with lim_{k→∞} x_k = x and lim_{k→∞} Ḟ(x_k) = W}, where co(·) denotes the convex hull of a set.
The generalized Jacobian is a natural generalization of the Jacobian for continuously differentiable functions: at those points x where F is continuously differentiable, ∂F(x) reduces to a single matrix, which is the Jacobian Ḟ(x) of F.

Definition 2.5. For any switching signal σ(t) and any finite constants T_2 > T_1 ≥ 0, let N_σ(T_1, T_2) denote the number of switchings of σ(t) over the interval (T_1, T_2). If N_σ(T_1, T_2) ≤ N_0 + (T_2 − T_1)/T_a holds for T_a > 0 and N_0 ≥ 0, then T_a is said to be the average dwell time.
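The following minimal sketch illustrates Definition 2.5 for a hypothetical periodic switching signal; the switching instants, N_0, and T_a below are illustrative values, not quantities from the paper.

```python
import numpy as np

# Sketch of Definition 2.5: count the switchings N_sigma(T1, T2) of a given
# switching signal on (T1, T2) and check N_sigma(T1, T2) <= N_0 + (T2 - T1)/T_a.
# Switching every 4 s and the values N_0 = 1, T_a = 3.5 are illustrative.

switch_times = np.array([4.0, 8.0, 12.0, 16.0, 20.0])

def N_sigma(T1, T2):
    return int(np.sum((switch_times > T1) & (switch_times < T2)))

N0, Ta = 1.0, 3.5

ok = all(N_sigma(T1, T2) <= N0 + (T2 - T1) / Ta
         for T1 in np.arange(0.0, 20.0, 0.5)
         for T2 in np.arange(0.5, 20.5, 0.5) if T2 > T1)
print("average dwell time condition satisfied on the sampled windows:", ok)
```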
Lemma 2.7 (see [23, 24]). If G(θ) ∈ IH_α and G(0) = 0, then there exist constants q_0 > 0 and r_0 > 0 such that |G(θ)| ≥ q_0|θ|^α for all |θ| ≤ r_0. Moreover, |G(θ)| ≥ q_0 r_0^α for all |θ| > r_0.

Lemma 2.8 (see [25]). Let F : R^n → R^n be locally Lipschitz continuous. For any given x, y ∈ R^n, there exists an element W in the union ∪_{z∈[x,y]} ∂F(z) such that F(y) − F(x) = W(y − x), where [x, y] denotes the segment connecting x and y.
Let Ω be a nonempty, bounded, and open subset of R^n. The closure and boundary of Ω are denoted by Ω̄ and ∂Ω, respectively.
Lemma 2.12 (Jensen's inequality). For any constant matrix Ω ∈ R^{n×n} with Ω = Ω^T > 0, scalar γ > 0, and vector function ω : [0, γ] → R^n such that the integrations concerned are well defined,

γ ∫_0^γ ω(s)^T Ω ω(s) ds ≥ (∫_0^γ ω(s) ds)^T Ω (∫_0^γ ω(s) ds).
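The inequality of Lemma 2.12 can be checked numerically. In the sketch below, the positive definite matrix Ω and the vector function ω(s) are arbitrary illustrative choices, and the integrals are approximated by simple Riemann sums.

```python
import numpy as np

# Numerical sanity check of Lemma 2.12 (Jensen's inequality) with an
# illustrative Omega and omega(s); integrals are left Riemann sums.

rng = np.random.default_rng(0)
n, gamma, m = 3, 2.0, 4000                     # m sub-intervals of [0, gamma]
h = gamma / m
s = np.arange(m) * h                           # left endpoints of the sub-intervals

A = rng.standard_normal((n, n))
Omega = A @ A.T + n * np.eye(n)                # Omega = Omega^T > 0

omega = np.vstack([np.sin(s), np.cos(2 * s), s])                 # omega(s) in R^3, one column per s

lhs = gamma * h * np.einsum('is,ij,js->', omega, Omega, omega)   # gamma * int omega^T Omega omega ds
v = h * omega.sum(axis=1)                                        # int omega ds
rhs = v @ Omega @ v                                              # (int omega)^T Omega (int omega)
print("Jensen's inequality holds numerically:", lhs >= rhs, (lhs, rhs))
```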
To give our main results in the next sections, we need to present the following assumptions.

(H2) β_i(s) is locally Lipschitz continuous, and there exists a constant β̲_i > 0 such that β̇_i(s) ≥ β̲_i for all s ∈ R at which β_i(s) is differentiable.

Exponential Stability of the Cohen-Grossberg Neural Network
In this section, the Cohen-Grossberg neural network (2.1) is considered, and the main results on the existence and stability of the equilibrium point of the neural network (2.1) are presented in the following theorem.
Theorem 3.1. Under the assumptions (H1)-(H3), if there exist two positive definite matrices S, T, two positive definite diagonal matrices M, P, and a scalar γ > 0 such that the LMI condition (3.1) holds, then the neural network (2.1) has a unique equilibrium point, which is globally exponentially stable.

Proof. We prove this theorem in three steps.
Step 1. In this step, the existence of an equilibrium point of the system (2.1) is proved on the set Ω_R = {x ∈ R^n : ‖x‖ ≤ R}. From (3.1) and the definition of a negative definite matrix, we can obtain (3.4). By Lemma 2.11, (3.4) is equivalent to (3.5). By means of Lemma 2.10, we have (3.6).
By using (3.5) and (3.6), we obtain an estimate in which (MH_0)_i (i = 1, 2, . . ., n) denotes the ith element of the vector MH_0. By virtue of Lemma 2.7, there exist constants q_i(0) > 0 and r_i(0) > 0, i = 1, 2, . . ., n, such that (3.8) holds. For x ∈ ∂Ω_R with R sufficiently large, there exist two complementary index sets N and N̄ with N ∪ N̄ = {1, 2, . . ., n}. Furthermore, there exists an index i_0 ∈ N̄ such that (3.10) holds. By (3.8) and (3.10), for any x ∈ ∂Ω_R and λ ∈ [0, 1], (3.11) holds. Hence, H(λ, x) ≠ 0 for any x ∈ ∂Ω_R and λ ∈ [0, 1]. By applying Lemma 2.9(1), it follows that the Brouwer degree of H(1, ·) on Ω_R equals that of H(0, ·), where |B| is the determinant of B. By Lemma 2.9(2), H(x) = 0 has at least one solution in Ω_R; that is, the system (2.1) has at least one equilibrium point.
Step 2. In this step, the uniqueness of the equilibrium point is proved by the method of contradiction.
Assume that x_1^* and x_2^* are two different equilibrium points of the system (2.1); then (3.15) follows. This is a contradiction. Hence, x_1^* = x_2^*. This shows that the equilibrium point of the system (2.1) is unique.
Step 3. In this step, we will prove that the system (2.1) is globally exponentially stable.
Let F(x, t) = α(x)[−β(x) + W_0 g(x) + W_1 g(x_τ) + W_2 ∫_{t−τ}^{t} g(x(s)) ds + J], where x_τ(t) = x(t − τ). Since g_i ∈ IH_α and α(x), β(x) are continuous functions, F(x, t) is continuous and locally bounded. Hence, we can obtain the existence of the local solution of the system (2.1) with the initial value x(t) = φ(t), t ∈ [−τ, 0], on [0, t^*(φ)), where t^*(φ) ∈ (0, ∞) or t^*(φ) = ∞, and [0, t^*(φ)) is the maximal right-side existence interval of the local solution.
Let x^* be the unique equilibrium point of the system (2.1). Making the transformation u(t) = x(t) − x^*, the system (2.1) is transformed into (3.16).

Consider the Lyapunov-Krasovskii functional candidate in (3.19), whose distributed-delay term is ∫_{−τ}^{0} ∫_{t+θ}^{t} e^{γs} g(u(s))^T T g(u(s)) ds dθ.
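For readers who wish to evaluate such a functional along a simulated trajectory, the following sketch approximates the distributed-delay term above by a nested Riemann sum; the trajectory u(s), the matrix T, and all parameter values are hypothetical placeholders, not data from the paper.

```python
import numpy as np

# Sketch of evaluating the distributed-delay term of the functional,
#   V4(t) = int_{-tau}^{0} int_{t+theta}^{t} e^{gamma*s} g(u(s))^T T g(u(s)) ds dtheta,
# by a crude nested Riemann sum, using illustrative placeholder data.

gamma, tau, t = 0.1, 1.0, 5.0
T = np.array([[2.0, 0.5], [0.5, 1.5]])                  # T = T^T > 0
g = lambda v: v**3                                       # activation applied componentwise
u = lambda s: np.array([np.exp(-0.3 * s), np.exp(-0.2 * s) * np.cos(s)])

def integrand(s):
    gu = g(u(s))
    return np.exp(gamma * s) * gu @ T @ gu

def inner(theta, m=200):                                 # int_{t+theta}^{t} ... ds
    h = -theta / m
    return sum(integrand(t + theta + k * h) for k in range(m)) * h

M = 200                                                  # sub-intervals in theta
dtheta = tau / M
V4 = sum(inner(-tau + k * dtheta) for k in range(M)) * dtheta
print("approximate V4(t) at t = 5:", V4)
```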
Calculating the time derivative of V(t) along the trajectories of the system (3.16) on [0, t^*(φ)) together with (3.17), by assumption (H1) and Lemma 2.12, we obtain (3.20). Let ξ(t) = [u(t)^T, g(u(t))^T, g(u(t − τ))^T, (∫_{t−τ}^{t} g(u(s)) ds)^T]^T. By (3.17) and (3.20), we can obtain V̇(t) < 0 for any ξ(t) ≠ 0.

Thus, there exists a scalar T > 0 such that u_i(t) ∈ (−r_0, r_0) when t ≥ T, where r_0 = min_{1≤i≤n} r_i(0). Let m = min_{1≤i≤n} m_i, ᾱ = max_{1≤i≤n} ᾱ_i, and q_0 = min_{1≤i≤n} q_i(0). From (3.22) and (3.23), we have (3.25) for t ≥ T. That is, when t ≥ T, the state x(t) satisfies an exponential decay estimate whose initial-value part contains the term ∫_{−τ}^{0} ∫_{θ}^{0} e^{γs} g(φ(s) − x^*)^T T g(φ(s) − x^*) ds dθ. This completes the proof.

Stability of the Switched Cohen-Grossberg Neural Network
The switched Cohen-Grossberg neural networks with mixed time delays consist of a set of Cohen-Grossberg neural networks with mixed time delays and a switching rule. Each of the Cohen-Grossberg neural networks with mixed time delays is regarded as an individual subsystem, and the operation mode of the switched neural networks is determined by the switching rule. In the following, we develop the switched Cohen-Grossberg neural network model with mixed time delays. Suppose that x^* is the unique equilibrium point of the system (2.1). Similar to (3.16), making the transformation u(t) = x(t) − x^*, the system (2.1) is transformed into (4.1). The switched Cohen-Grossberg neural networks with mixed time delays can then be described as (4.2), where σ(t) : [0, ∞) → Γ = {1, 2, . . ., N} is the switching signal, which is a piecewise constant function of time. This means that the matrices W_{0σ(t)}, W_{1σ(t)}, W_{2σ(t)}, J_{σ(t)} are allowed to take values, at an arbitrary time, in a finite set. The initial value associated with (4.3) is assumed to be u(s) = φ(s), where φ(s) is a continuous function on [−τ, 0]. In this paper, it is assumed that the switching rule σ is not known a priori and that its instantaneous value is available in real time. Corresponding to the switching signal σ(t), we have the switching sequence {x_{t_0}; (i_0, t_0), . . ., (i_k, t_k), . . . | i_k ∈ Γ, k = 0, 1, . . .}, which means that the i_k th subsystem is activated when t ∈ [t_k, t_{k+1}).
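The switching mechanism can be illustrated by a short simulation sketch: a piecewise-constant signal σ(t) selects which subsystem's weight matrices drive the dynamics at each instant. The two modes, their weights, the behaved term, and the four-second dwell period below are illustrative placeholders (the simulations in Section 5 also switch every four seconds), and the distributed-delay term and the amplification are simplified for brevity.

```python
import numpy as np

# Sketch of the switching mechanism in (4.2): sigma(t) is piecewise constant
# and selects the active subsystem's weights. All numerical values, the
# behaved term -1.5*u, and the unit amplification are illustrative only.

def sigma(t, period=4.0, modes=(1, 2)):
    """Periodic switching rule: mode 1 on [0, 4), mode 2 on [4, 8), and so on."""
    return modes[int(t // period) % len(modes)]

W0 = {1: np.array([[0.1, -0.2], [0.2, 0.1]]), 2: np.array([[0.0, 0.3], [-0.3, 0.0]])}
W1 = {1: np.array([[-0.1, 0.0], [0.0, -0.1]]), 2: np.array([[0.1, 0.1], [-0.1, 0.1]])}

g = lambda v: v**3 + v            # activation of the same kind as in Example 5.2
tau, dt = 1.0, 0.001
d = int(tau / dt)

N = d + 12000                     # simulate 12 s after the history interval
u = np.zeros((N + 1, 2))
u[:d + 1] = 0.5                   # constant initial function on [-tau, 0]

for k in range(d, N):
    t = (k - d) * dt
    i = sigma(t)                  # index of the active subsystem
    du = -1.5 * u[k] + W0[i] @ g(u[k]) + W1[i] @ g(u[k - d])
    u[k + 1] = u[k] + dt * du

print("active mode at t = 5 s:", sigma(5.0), "; final state:", u[-1])
```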
In the following, we consider the switched Cohen-Grossberg neural networks with mixed time delays in (4.2). The average dwell time approach will be used to derive the exponential stability of the network.

Theorem 4.1. Under the assumptions (H1)-(H3), if there exist positive definite matrices S_i, T_i, positive definite diagonal matrices M_i, P_i, and scalars γ > 0, μ ≥ 1 such that the conditions (4.3) and (4.4) hold, where M_i = diag(m_{i1}, m_{i2}, . . ., m_{in}) and P_i = diag(p_{i1}, p_{i2}, . . ., p_{in}), then the switched Cohen-Grossberg neural network (4.2) is globally exponentially stable for any switching signal with average dwell time satisfying (4.5), that is, T_a > T_a^* = (1 + α) ln μ/γ. Moreover, an explicit estimate of the state decay for the system (4.2) is also obtained.

Proof. Consider the multiple Lyapunov-Krasovskii functional candidate (4.9). By (4.9), there exists a constant T > t_0 such that, when t ≥ T, V_{σ(t)}(t) can be bounded from below in terms of max_{1≤i≤n}|u_i(t)|, where m = min_{j∈Γ, 1≤i≤n} m_{ji}, ᾱ = max_{1≤i≤n} ᾱ_i, and q_0 = min_{1≤i≤n} q_i(0). This implies that (4.12) holds.
Due to k ≤ (t − t_0)/T_a and T_a > T_a^* = (1 + α) ln μ/γ, we can then get (4.13). This implies that the switched Cohen-Grossberg neural network (4.2) is globally exponentially stable. The proof is completed.
Remark 4.2. It is clear that, according to Theorem 4.1, the delay-dependent stability of the considered neural networks depends on μ for a given γ. If μ = 1, we have from (4.5) that the switching signal can be arbitrary, and (4.4) reduces to P_i ≤ P_j, M_i ≤ M_j, S_i ≤ S_j, T_i ≤ T_j for all i, j ∈ Γ, which implies P_i = P_j, M_i = M_j, S_i = S_j, T_i = T_j for all i, j ∈ Γ; that is, a common Lyapunov functional is required for all subsystems. If μ → ∞, we get from (4.5) that there is, in effect, no switching; that is, the switching signal must have a very large dwell time on the average.

Illustrative Examples
In this section, two illustrative examples will be given to check the validity of the results obtained in Theorems 3.1 and 4.1.

All assumptions of Theorem 3.1 hold. Hence, this neural network has a unique equilibrium point, which is globally exponentially stable. Figures 1 and 2 display the state trajectories of this neural network with initial values φ(t) = (cos t, sin t)^T, (0.5 cos t, −0.5 sin t)^T, (−1.5 cos t, 1.5 sin t)^T, (2 cos t, 2 sin t)^T, and (−2.5 cos t, 2.5 sin t)^T, t ∈ [−1, 0]. It can be seen that these trajectories converge to the unique equilibrium x^* = (0, 0)^T of the network. This is in accordance with the conclusion of Theorem 3.1.
For the numerical simulation of the switched network, assume that the two subsystems are switched every four seconds. Figures 3, 4, and 5 display the state trajectories of this neural network with initial values φ(t) = (−0.5 cos t, 0.5 sin t, 0.5 cos t)^T, (cos t, −sin t, −cos t)^T, (−1.5 cos t, 1.5 sin t, 1.5 cos t)^T, (−2 cos t, 2 sin t, 2 sin t)^T, and (2.5 cos t, −2.5 sin t, −2.5 cos t)^T, t ∈ [−1, 0]. It can be seen that these trajectories converge to the unique equilibrium u^* = (0, 0, 0)^T of the network. This is in accordance with the conclusion of Theorem 4.1.

Conclusion
In this paper, the existence, uniqueness, and global stability of the equilibrium point for Cohen-Grossberg neural networks with mixed time delays, α-inverse Hölder neuron activation functions, and nonsmooth behaved functions have been discussed. By applying the multiple Lyapunov-Krasovskii functional approach, a delay-dependent global exponential stability criterion has been obtained in terms of LMIs for the switched Cohen-Grossberg neural networks with mixed time delays and α-inverse Hölder neuron activation functions under the switching rule with the average dwell time property. The obtained results are easily checked and applied in practical engineering.
When the neuron activation functions are non-Lipschitz, the neural network system may possess neither a global solution nor an equilibrium point. This leads to difficulty in solving the stability problem, particularly the exponential stability problem, for switched neural networks with non-Lipschitz activation functions. In the future, the stability problem for switched neural networks with other classes of non-Lipschitz activation functions is expected to be solved.

Figure 1: The convergence of the state x_1(t) of the network in Example 5.1.

Figure 2: The convergence of the state x_2(t) of the network in Example 5.1.

Figure 3: The convergence of the state x_1(t) of the network in Example 5.2.

Figure 4: The convergence of the state x_2(t) of the network in Example 5.2.

Figure 5: The convergence of the state x_3(t) of the network in Example 5.2.

By using (4.5), it follows that the average dwell time T_a^* = 3.6461. All the assumptions of Theorem 4.1 hold. Hence, this switched neural network is globally exponentially stable.
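Assuming the dwell time bound has the form T_a^* = (1 + α) ln μ/γ used above, the quoted value can be reproduced with γ = 0.1, μ = 1.2, and α = 1 (g_i(θ) = θ^3 + θ is 1-inverse Hölder since g_i'(θ) ≥ 1):

```python
import math

# Reproducing the average dwell time bound quoted for Example 5.2 under the
# assumed form T_a* = (1 + alpha) * ln(mu) / gamma, with gamma = 0.1,
# mu = 1.2, and alpha = 1.

gamma, mu, alpha = 0.1, 1.2, 1
Ta_star = (1 + alpha) * math.log(mu) / gamma
print(round(Ta_star, 4))   # about 3.6464, matching the quoted 3.6461 up to rounding
```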
In particular, V̇_3(t) = e^{γt} g(u(t))^T S g(u(t)) − e^{γ(t−τ)} g(u(t − τ))^T S g(u(t − τ)), and V̇_4(t) ≤ τ e^{γt} g(u(t))^T T g(u(t)) − ⋯. When t ∈ [t_k, t_{k+1}), the i_k th subsystem is activated. Arguing as in the proof of Theorem 3.1, we can get V̇_{σ(t)}(t) ≤ 0. Thus, V_{σ(t)}(t) ≤ V_{σ(t_k)}(t_k). In the light of (4.4) and (4.7), it follows that V_{σ(t_k)}(t_k) ≤ μ V_{σ(t_k^−)}(t_k^−).
Remark 4.3. In the existing literature [2, 3, 17, 22, 27-29, 34-38], the activation functions of switched neural networks are required to be Lipschitz continuous. However, in this paper, the activation functions are α-inverse Hölder functions. It is obvious that the results of this paper are different from those in the above literature. Hence, the work in this paper extends the scope of the current investigation in this field.
Example 5.2. Consider a third-order switched Cohen-Grossberg neural network with mixed time delays in (4.3) with the switching signal σ(t) : [0, ∞) → Γ = {1, 2} and the following parameters: α_i(θ) and β_i(θ) are the functions in Example 5.1, the activation functions are taken as g_i(θ) = θ^3 + θ, i = 1, 2, 3, and τ = 1. It is easy to check that the assumptions (H1)-(H3) hold; A and B are the third-order identity matrix. Take γ = 0.1 and μ = 1.2. Solving the LMIs in (4.3) and (4.4) by using an appropriate LMI solver in Matlab, feasible positive definite matrices P_1, P_2, M_1, M_2, S_1, S_2, T_1, T_2 are obtained.
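The explicit LMIs (4.3) and (4.4) are not reproduced here, so the following is only a generic sketch of how an LMI feasibility problem of Lyapunov type can be posed in Python with CVXPY as an alternative to the Matlab LMI solver mentioned above; the mode matrices A1 and A2 are hypothetical stand-ins, not the matrices appearing in the paper's conditions.

```python
import cvxpy as cp
import numpy as np

# Hedged sketch: a generic LMI feasibility problem of Lyapunov type,
#   P > 0,  A_i^T P + P A_i < 0  for each mode i,
# posed with CVXPY. A1 and A2 are illustrative stand-ins only.

A1 = np.array([[-2.0, 0.5], [0.3, -1.5]])
A2 = np.array([[-1.8, -0.4], [0.6, -2.2]])

n = 2
P = cp.Variable((n, n), symmetric=True)
eps = 1e-3
constraints = [P >> eps * np.eye(n)]
constraints += [A.T @ P + P @ A << -eps * np.eye(n) for A in (A1, A2)]

prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
prob.solve()
print(prob.status)
print("P =\n", P.value)
```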