A class of interval Cohen-Grossberg neural networks with time-varying delays and infinite distributed delays is investigated. By employing H-matrix and M-matrix theory, homeomorphism techniques, the Lyapunov functional method, and the linear matrix inequality (LMI) approach, sufficient conditions are established for the existence, uniqueness, and global robust exponential stability of the equilibrium point and of the periodic solution of the neural networks. Our results improve some previously published ones. Finally, numerical examples are given to illustrate the feasibility of the theoretical results and, further, to exhibit a characteristic sequence of bifurcations leading to chaotic dynamics, which implies that the system admits rich and complex dynamics.
1. Introduction
In the past two decades, neural networks have received a great deal of attention due to their extensive applications in many areas such as signal processing, associative memory, pattern recognition, and parallel computation and optimization. It should be pointed out that these successful applications rely heavily on the dynamic behaviors of neural networks. Stability, as one of the most important properties of neural networks, is crucially required when designing neural networks.
In the electronic implementation of neural networks, there inevitably exist uncertainties caused by modeling errors, external disturbances, and parameter fluctuations, which can lead to complex dynamic behaviors. Thus, it is important to investigate the robustness of neural networks against such uncertainties and deviations (see [1–8] and the references therein). In [4–6], employing homeomorphism techniques, the Lyapunov method, H-matrix and M-matrix theory, and the linear matrix inequality (LMI) approach, Shao et al. established sufficient conditions for the existence, uniqueness, and global robust exponential stability of the equilibrium point of the following interval Hopfield neural networks:
(1)u˙i(t)=-diui(t)+∑j=1naijfj(uj(t))+∑j=1nbijfj(uj(t-τj(t)))-Ji,i=1,2,…,n,
where τj(t) is a time-varying delay, which varies with time due to the finite switching speed of amplifiers. Recently, the stability of neural networks with time-varying delays has been extensively investigated, and various sufficient conditions for global asymptotic and exponential stability have been established in [9–13]. Neural networks usually have a spatial extent due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths. It is therefore desirable to model them by introducing continuously distributed delays over a certain duration of time, such that the distant past has less influence than the recent behavior of the state (see [14–16]). However, distributed delays were not taken into account in system (1).
As an important class of neural networks, Cohen-Grossberg neural networks (CGNNs) include Hopfield neural networks, cellular neural networks, and other neural networks as special cases, and they have attracted a tremendous surge of investigation in recent years. However, for interval CGNNs, fewer robust stability results have been reported in contrast to the results on Hopfield neural networks [17–19]. On the other hand, the study of neural networks involves not only the dynamic analysis of the equilibrium point but also that of the periodic oscillatory solution, which is very important in learning theory due to the fact that learning usually requires repetition [20, 21]. Some important results for periodic solutions of neural networks have been obtained in [7, 22–27] and the references therein. Motivated by the works [4–6] and the discussions above, the objective of this paper is to investigate the global robust exponential stability and periodic solutions of the following CGNNs with time-varying and distributed delays:
(2)u˙i(t)=-α~i(ui(t))×[β~i(ui(t))-∑j=1naijfj(uj(t))-∑j=1nbijfj(uj(t-τj(t)))-∑j=1ncij∫-∞tkj(t-s)fj(uj(s))ds+Ji(t)],i=1,2,…,n,
or equivalently
(3) u˙(t)=-α~(u(t))[β~(u(t))-Af(u(t))-Bf(u(t-τ(t)))-C∫-∞tK(t-s)f(u(s))ds+J(t)],
where
(4)u(t)=(u1(t),…,un(t))T,α~(u(t))=diag(α~1(u1(t)),…,α~n(un(t))),β~(u(t))=(β~1(u1(t)),…,β~n(un(t)))T,f(u(t))=(f1(u1(t)),…,fn(un(t)))T,f(u(t-τ(t)))=(f1(u1(t-τ1(t))),…,fn(un(t-τn(t))))T,K(t)=diag(k1(t),…,kn(t)),A=(aij)n×n,B=(bij)n×n,C=(cij)n×n,J(t)=(J1(t),…,Jn(t))T,
where ui(t) denotes the state of the ith neuron at time t; α~i(ui(t)) denotes a positive, continuous, and bounded amplification function, that is, 0<α_i≤α~i(ui(t))≤α¯i<+∞; β~i(ui(t)) denotes an appropriately behaved function; fj(uj(t)) denotes the activation function; τj(t) denotes the time-varying delay associated with the jth neuron, satisfying 0≤τj(t)≤τ and 0≤τ˙j(t)≤δ<1; kj(t)>0 represents the delay kernel, which is a real-valued continuous function; A is the connection weight matrix; B is the time-varying delayed connection weight matrix; C is the infinitely distributed delayed connection weight matrix; and Ji(t) is the external input bias. The coefficients aij, bij, and cij are intervalized as follows:
(5)AI=[A_,A¯]={A=(aij)n×n:a_ij≤aij≤a¯ij,i,j=1,2,…,n},BI=[B_,B¯]={B=(bij)n×n:b_ij≤bij≤b¯ij,i,j=1,2,…,n},CI=[C_,C¯]={C=(cij)n×n:c_ij≤cij≤c¯ij,i,j=1,2,…,n},
where, for X=A,B,C, X_=(x_ij)n×n and X¯=(x¯ij)n×n. Denote B*=(B¯+B_)/2 and B_*=(B¯-B_)/2. Clearly, B_* is a nonnegative matrix, and the interval matrix [B_,B¯]=[B*-B_*,B*+B_*]. Consequently, B=B*+ΔB with ΔB∈[-B_*,B_*]. The matrices C* and C_* are defined analogously.
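In computational terms this decomposition is immediate. The following Python sketch (our own illustration; it borrows the interval bounds for B from Example 1 in Section 5) computes the center B*, the radius B_*, and checks that an admissible B stays within the radius:

```python
import numpy as np

# Interval endpoints for B, taken from Example 1 in Section 5.
B_lower = np.array([[-0.8, -0.9], [-0.4, -1.0]])
B_upper = np.array([[ 0.5,  0.6], [ 0.7,  1.0]])

B_center = (B_upper + B_lower) / 2   # B*  : midpoint of the interval
B_radius = (B_upper - B_lower) / 2   # B_* : nonnegative radius matrix
assert (B_radius >= 0).all()         # B_* is entrywise nonnegative

# Any admissible B decomposes as B = B* + DeltaB with |DeltaB| <= B_*.
B = np.array([[0.1, 0.5], [0.6, 0.5]])          # the B used in (38)
DeltaB = B - B_center
assert (np.abs(DeltaB) <= B_radius + 1e-12).all()
```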
Throughout this paper, we make the following assumptions.
(H1) For the behaved functions β~i(·) (i=1,2,…,n), there exist constants γi>0 such that
(6) (β~i(x)-β~i(y))/(x-y)≥γi>0, ∀x,y∈ℝ, x≠y.
(H2) For the activation functions fi(·) (i=1,2,…,n), there exist constants li>0 such that
(7) 0≤(fi(x)-fi(y))/(x-y)≤li, ∀x,y∈ℝ, x≠y.
(H3) The delay kernels kj(·) (j=1,2,…,n) satisfy
(8)∫0∞kj(s)ds=1,∫0∞kj(s)eμsds<∞,
for some positive constant μ.
A typical example of such delay kernels is given by kj(s)=(sr/r!)γjr+1e-γjs for s∈[0,∞), where γj>0 and r∈{0,1,2,…}; these kernels are called the Gamma memory filter in [28].
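To see that (H3) is indeed satisfiable for this family of kernels, the following Python sketch (with illustrative values r=1 and γj=1, our own choices) evaluates both integrals numerically; the exponentially weighted integral is finite precisely when μ<γj:

```python
import numpy as np
from math import factorial
from scipy.integrate import quad

# Gamma memory kernel k(s) = (s^r / r!) * gamma^(r+1) * exp(-gamma*s);
# r = 1, gamma = 1 gives k(s) = s*exp(-s), the kernel used in Example 1.
r, gamma = 1, 1.0
k = lambda s: (s**r / factorial(r)) * gamma**(r + 1) * np.exp(-gamma * s)

total, _ = quad(k, 0, np.inf)
print(total)                 # ~1.0: the kernel is normalized, as (8) requires

mu = 0.5 * gamma             # any mu < gamma keeps the weighted integral finite
weighted, _ = quad(lambda s: k(s) * np.exp(mu * s), 0, np.inf)
print(weighted)              # ~4.0 here; it diverges as mu -> gamma
```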
The organization of this paper is as follows. In Section 2, some preliminaries are given. In Section 3, sufficient conditions are presented for the existence, uniqueness, and global robust exponential stability of the equilibrium point for system (2) with the external constant input bias (i.e., Ji(t)≡Ji,Ji is a constant). In Section 4, sufficient conditions are given which guarantee the uniqueness and global exponential stability of periodic solutions for system (2) when the time-varying delay τi(t) and the external input bias Ji(t) are continuously periodic functions. Numerical examples are provided to illustrate the effectiveness of the obtained results in Section 5. A concluding remark is given in Section 6 to end this work.
2. Preliminaries
We give some preliminaries in this section. Denote Γ=diag(γ1,γ2,…,γn), L=diag(l1,l2,…,ln), α_=diag(α_1,α_2,…,α_n), and x^ij=max{|x_ij|,|x¯ij|}. For a vector x=(x1,x2,…,xn), ∥x∥r=(∑i=1n|xi|r)1/r, and for any φ(s)=(φ1(s),…,φn(s)), s∈(-∞,0], ∥φ(s)∥r=sups∈(-∞,0](∑i=1n|φi(s)|r)1/r. For a matrix A=(aij)n×n, AT denotes the transpose; A-1 denotes the inverse; A>(≥)0 means that A is a symmetric positive definite (semidefinite) matrix; λmax(A) and λmin(A) denote the largest and smallest eigenvalues of A, respectively; and ∥A∥2=(λmax(ATA))1/2 denotes the spectral norm of A. I denotes the identity matrix, and * denotes the symmetric block in a symmetric matrix.
Definition 1 (see [29]).
The neural network (2) with the parameter ranges defined by (5) is globally robustly exponentially stable, if for each A∈AI, B∈BI, C∈CI, and J, system (2) has a unique equilibrium point u*=(u1*,u2*,…,un*)T, and there exist constants a≥1 and ε>0 such that
(9)∥u(t)-u*∥≤a∥ϕ(θ)-u*∥e-εt,∀t>0,
where u(t)=(u1(t),u2(t),…,un(t))T is a solution of system (2) with the initial value ui(θ)=ϕi(θ), i=1,2,…,n, θ∈(-∞,0], and ϕ(θ)=(ϕ1(θ),ϕ2(θ),…,ϕn(θ)).
Definition 2 (see [30]).
Let Zn={A=(aij)n×n∈Mn(ℝ):aij≤0 if i≠j,i,j=1,2,…,n}, where Mn(ℝ) denotes the set of all n×n matrices with entries from ℝ. Then a matrix A is called an M-matrix if A∈Zn and all successive principal minors of A are positive.
Definition 3 (see [30]).
An n×n matrix A=(aij)n×n is said to be an H-matrix if its comparison matrix M(A)=(mij)n×n is an M-matrix, where mii=|aii| and mij=-|aij| for i≠j.
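Definitions 2 and 3 translate directly into a numerical test. A minimal Python sketch (helper names are ours) checks the M-matrix property via the successive leading principal minors and the H-matrix property via the comparison matrix:

```python
import numpy as np

def comparison_matrix(A):
    """M(A): |a_ii| on the diagonal, -|a_ij| off the diagonal."""
    M = -np.abs(A)
    np.fill_diagonal(M, np.abs(np.diag(A)))
    return M

def is_m_matrix(A, tol=1e-10):
    """Definition 2: A in Z_n with all successive principal minors positive."""
    off_diag = A - np.diag(np.diag(A))
    if (off_diag > tol).any():           # off-diagonal entries must be <= 0
        return False
    n = A.shape[0]
    return all(np.linalg.det(A[:k, :k]) > tol for k in range(1, n + 1))

def is_h_matrix(A):
    """Definition 3: A is an H-matrix iff M(A) is an M-matrix."""
    return is_m_matrix(comparison_matrix(A))

print(is_m_matrix(np.array([[2.0, -1.0], [-1.0, 2.0]])))  # True
print(is_h_matrix(np.array([[2.0, 1.0], [1.0, 2.0]])))    # True
```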
Lemma 4 (see [31]).
For any vectors x,y∈ℝn and positive definite matrix G∈ℝn×n, the following inequality holds: 2xTy≤xTGx+yTG-1y.
Lemma 5 (see [30]).
Let A,B∈Zn. If A is an M-matrix and the elements of matrices A and B satisfy the inequalities aij≤bij, i,j=1,2,…,n, then B is an M-matrix.
Lemma 6 (see [30]).
The following LMI: (Q(x), S(x); ST(x), R(x))>0, where Q(x)=QT(x) and R(x)=RT(x), is equivalent to either R(x)>0 and Q(x)-S(x)R-1(x)ST(x)>0, or Q(x)>0 and R(x)-ST(x)Q-1(x)S(x)>0.
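Lemma 6 (the Schur complement lemma) is easy to sanity-check numerically on a random symmetric example; a minimal sketch:

```python
import numpy as np

# Build a random symmetric positive definite 4x4 matrix and split it into
# the blocks (Q, S; S^T, R) of Lemma 6.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))
M = X @ X.T + 4 * np.eye(4)
Q, S, R = M[:2, :2], M[:2, 2:], M[2:, 2:]

whole = np.linalg.eigvalsh(M).min() > 0
schur = (np.linalg.eigvalsh(R).min() > 0 and
         np.linalg.eigvalsh(Q - S @ np.linalg.inv(R) @ S.T).min() > 0)
print(whole, schur)   # both True: the two conditions agree, as the lemma states
```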
Lemma 7 (see [32]).
Suppose that the neural network parameters are defined by (5), and
(10) Ξ=(Φ-S, -PB_*, -PC_*; *, (1-δ)Q, 0; *, *, R)>0,
where P=diag(p1,p2,…,pn), Q=diag(q1,q2,…,qn), and R=diag(r1,r2,…,rn) are positive diagonal matrices, Φ=(Φij)n×n=2PΓL-1-((2-δ)/(1-δ))∥PB*∥2I-2∥PC*∥2I-Q-R, and S=(sij)n×n with
(11) sij=2pia¯ii if i=j, and sij=max{|pia_ij+pja_ji|,|pia¯ij+pja¯ji|} if i≠j.
Then, for all A∈AI, B∈BI, and C∈CI, we have
(12) Θ=(Φ-S′, -PΔB, -PΔC; *, (1-δ)Q, 0; *, *, R)>0,
where S′=(sij′)n×n=PA+ATP.
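Checking the hypothesis of Lemma 7 for given data amounts to assembling Ξ from (10)-(11) and testing positive definiteness. A Python sketch of this test follows (function and variable names are ours; B_c, C_c denote the centers B*, C*, and B_r, C_r the radii B_*, C_*); with the data of Example 1 in Section 5 it returns a positive smallest eigenvalue:

```python
import numpy as np

def xi_min_eig(P, Q, R, Gamma, L, A_lo, A_hi, B_c, B_r, C_c, C_r, delta):
    """Assemble Xi from (10)-(11); Xi > 0 iff the returned value is > 0.
    P, Q, R, Gamma, L are positive diagonal matrices; A_lo, A_hi bound the
    interval for A; B_c, C_c are the centers B*, C*; B_r, C_r the radii."""
    n = P.shape[0]
    p = np.diag(P)
    nB = np.linalg.norm(P @ B_c, 2)               # ||P B*||_2
    nC = np.linalg.norm(P @ C_c, 2)               # ||P C*||_2
    Phi = (2 * P @ Gamma @ np.linalg.inv(L)
           - ((2 - delta) / (1 - delta)) * nB * np.eye(n)
           - 2 * nC * np.eye(n) - Q - R)
    S = np.empty((n, n))                          # S from (11)
    for i in range(n):
        for j in range(n):
            if i == j:
                S[i, i] = 2 * p[i] * A_hi[i, i]
            else:
                lo = p[i] * A_lo[i, j] + p[j] * A_lo[j, i]
                hi = p[i] * A_hi[i, j] + p[j] * A_hi[j, i]
                S[i, j] = max(abs(lo), abs(hi))
    Z = np.zeros((n, n))
    Xi = np.block([[Phi - S,       -P @ B_r,        -P @ C_r],
                   [(-P @ B_r).T,  (1 - delta) * Q,  Z],
                   [(-P @ C_r).T,  Z,                R]])
    return np.linalg.eigvalsh(Xi).min()
```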
3. Global Robust Exponential Stability of the Equilibrium Point
In this section we assume that, in system (2), the external input bias Ji(t)≡Ji is a constant (i=1,2,…,n). We give a new sufficient condition for the existence and uniqueness of the equilibrium point of system (2) and analyze its global robust exponential stability.
Theorem 8.
Under assumptions (H1) and (H2), if there exist positive diagonal matrices P=diag(p1,p2,…,pn), Q=diag(q1,q2,…,qn), and R=diag(r1,r2,…,rn) such that Ξ>0, where Ξ is defined by (10), then system (2) has a unique equilibrium point.
The proof of Theorem 8 is similar to that in [32]; therefore, we omit it here.
Let u*=(u1*,u2*,…,un*)T be the equilibrium point of system (2). By the coordinate transformation v(t)=u(t)-u*, system (2) is transformed into the following system:
(13) v˙i(t)=-αi(vi(t))[βi(vi(t))-∑j=1naijgj(vj(t))-∑j=1nbijgj(vj(t-τj(t)))-∑j=1ncij∫-∞tkj(t-s)gj(vj(s))ds], i=1,2,…,n,
or equivalently
(14) v˙(t)=-α(v(t))[β(v(t))-Ag(v(t))-Bg(v(t-τ(t)))-C∫-∞tK(t-s)g(v(s))ds],
where v(t)=(v1(t),…,vn(t))T, α(v(t))=diag(α1(v1(t)),…,αn(vn(t))) with αi(vi(t))=α~i(vi(t)+ui*), β(v(t))=(β1(v1(t)),…,βn(vn(t)))T with βi(vi(t))=β~i(vi(t)+ui*)-β~i(ui*), g(v(t))=(g1(v1(t)),…,gn(vn(t)))T, g(v(t-τ(t)))=(g1(v1(t-τ1(t))),…,gn(vn(t-τn(t))))T with gj(vj(t))=fj(vj(t)+uj*)-fj(uj*).
Theorem 9.
Under assumptions (H1)–(H3), if there exist positive diagonal matrices P=diag(p1,p2,…,pn), Q=diag(q1,q2,…,qn), and R=diag(r1,r2,…,rn) such that Ξ>0, where Ξ is defined by (10), then the equilibrium point of system (2) is globally robustly exponentially stable.
Proof.
Define a Lyapunov functional: V(t)=∑i=14Vi(t), where
(15) V1(t)=2heεt∑i=1n∫0vi(t)(s/αi(s))ds, V2(t)=2eεt∑i=1npi∫0vi(t)(gi(s)/αi(s))ds, V3(t)=∑i=1n(qi+∥PB*∥2)∫t-τi(t)teε(τ+s)gi2(vi(s))ds, V4(t)=∑i=1n(ri+∥PC*∥2)∫0+∞ki(ξ)eεξ∫t-ξteεsgi2(vi(s))dsdξ.
Calculating the derivative of V(t) along the trajectories of system (13), we obtain that
(16) V˙1(t)=2hεeεt∑i=1n∫0vi(t)(s/αi(s))ds-2heεt∑i=1nvi(t)[βi(vi(t))-∑j=1naijgj(vj(t))-∑j=1nbijgj(vj(t-τj(t)))-∑j=1ncij∫-∞tkj(t-s)gj(vj(s))ds]≤hεeεt∑i=1n(1/α_i)vi2(t)-2heεt∑i=1nγivi2(t)+2heεt∑i=1nvi(t)[∑j=1naijgj(vj(t))+∑j=1nbijgj(vj(t-τj(t)))+∑j=1ncij∫-∞tkj(t-s)gj(vj(s))ds]=hεeεtvT(t)α_-1v(t)+2heεtvT(t)[-Γv(t)+Ag(v(t))+Bg(v(t-τ(t)))+C∫-∞tK(t-s)g(v(s))ds],
V˙2(t)=2εeεt∑i=1npi∫0vi(t)(gi(s)/αi(s))ds-2eεt∑i=1npigi(vi(t))[βi(vi(t))-∑j=1naijgj(vj(t))-∑j=1nbijgj(vj(t-τj(t)))-∑j=1ncij∫-∞tkj(t-s)gj(vj(s))ds]≤εeεt∑i=1n(pili/α_i)vi2(t)-2eεt∑i=1npiγigi(vi(t))vi(t)+2eεt∑i=1npigi(vi(t))[∑j=1naijgj(vj(t))+∑j=1nbijgj(vj(t-τj(t)))+∑j=1ncij∫-∞tkj(t-s)gj(vj(s))ds]=εeεtvT(t)PLα_-1v(t)-2eεtgT(v(t))PΓv(t)+2eεtgT(v(t))[PAg(v(t))+PBg(v(t-τ(t)))+PC∫-∞tK(t-s)g(v(s))ds]≤εeεtvT(t)PLα_-1v(t)-2eεtgT(v(t))PΓL-1g(v(t))+2eεtgT(v(t))PAg(v(t))+eεt∥PB*∥2((1/(1-δ))∥g(v(t))∥22+(1-δ)∥g(v(t-τ(t)))∥22)+eεt∥PC*∥2(∥g(v(t))∥22+∥∫-∞tK(t-s)g(v(s))ds∥22)+2eεtgT(v(t))PΔBg(v(t-τ(t)))+2eεtgT(v(t))PΔC∫-∞tK(t-s)g(v(s))ds,
V˙3(t)=∑i=1n(qi+∥PB*∥2)[eε(t+τ)gi2(vi(t))-(1-τ˙i(t))eε(t+τ-τi(t))gi2(vi(t-τi(t)))]≤eε(t+τ)gT(v(t))Qg(v(t))+eε(t+τ)∥PB*∥2∥g(v(t))∥22-(1-δ)eεtgT(v(t-τ(t)))Qg(v(t-τ(t)))-(1-δ)eεt∥PB*∥2∥g(v(t-τ(t)))∥22,
V˙4(t)=∑i=1n(ri+∥PC*∥2)∫0+∞ki(ξ)eε(t+ξ)[gi2(vi(t))-e-εξgi2(vi(t-ξ))]dξ=eεt∫0+∞ki(ξ)eεξdξ·gT(v(t))(R+∥PC*∥2I)g(v(t))-eεt∑i=1n(ri+∥PC*∥2)∫-∞tki(t-s)gi2(vi(s))ds≤eεt∫0+∞ki(ξ)eεξdξ·gT(v(t))(R+∥PC*∥2I)g(v(t))-eεt(∫-∞tK(t-s)g(v(s))ds)T(R+∥PC*∥2I)(∫-∞tK(t-s)g(v(s))ds).
Therefore, one can deduce that
(17) V˙(t)≤εeεtvT(t)(hα_-1+PLα_-1)v(t)+eε(t+τ)gT(v(t))(Q+∥PB*∥2I)g(v(t))+eεt∫0+∞ki(s)eεsds·gT(v(t))(R+∥PC*∥2I)g(v(t))-2heεtvT(t)Γv(t)-eεtgT(v(t))(2PΓL-1-2PA-(1/(1-δ))∥PB*∥2-∥PC*∥2)g(v(t))-eεt(1-δ)gT(v(t-τ(t)))Qg(v(t-τ(t)))-eεt(∫-∞tK(t-s)g(v(s))ds)TR(∫-∞tK(t-s)g(v(s))ds)+2heεtvT(t)(Ag(v(t))+Bg(v(t-τ(t)))+C∫-∞tK(t-s)g(v(s))ds)+2eεtgT(v(t))(PΔBg(v(t-τ(t)))+PΔC∫-∞tK(t-s)g(v(s))ds)=εeεtvT(t)(hα_-1+PLα_-1)v(t)+eεt(eετ-1)gT(v(t))(Q+∥PB*∥2I)g(v(t))+eεt(∫0+∞ki(s)eεsds-1)gT(v(t))(R+∥PC*∥2I)g(v(t))-eεtwT(t)Ψw(t),
where
(18) Ψ=(2hΓ, -hA, -hB, -hC; *, Φ-S′, -PΔB, -PΔC; *, *, (1-δ)Q, 0; *, *, *, R),
(19) w(t)=(vT(t), gT(v(t)), gT(v(t-τ(t))), (∫-∞tK(t-s)g(v(s))ds)T)T.
Denote Υ=(A B C). By Lemma 6, Ψ>0 is equivalent to Θ-(h/2)ΥTΓ-1Υ>0, where Θ is defined in (12). By Lemma 7, we have Θ>0. Letting 0<h<2min1≤i≤n{γi}λmin(Θ)/λmax(ΥTΥ), we can derive that
(20) Θ-(h/2)ΥTΓ-1Υ≥Θ-(h/(2min1≤i≤n{γi}))ΥTΥ>0,
which yields Ψ>0.
From assumption (H3), we can choose a constant ε3 sufficiently small satisfying 0<ε3<μ and
(21) ∫0∞ki(s)eε3sds≤λmin(Ψ)/(2max1≤i≤n{ri+∥PC*∥2})+1, 1≤i≤n.
Choosing 0<ε<min1≤i≤3{εi} with
(22) ε1=λmin(Ψ)/max1≤i≤n{α_i-1(h+pili)}, ε2=(1/τ)ln(λmin(Ψ)/(2max1≤i≤n{qi+∥PB*∥2})+1),
we get
(23)V˙(t)≤ε1eεtvT(t)(hα_-1+PLα_-1)v(t)+eεt(eε2τ-1)gT(v(t))(Q+∥PB*∥2I)g(v(t))+eεt(∫0+∞ki(s)eε3sds-1)gT(v(t))×(R+∥PC*∥2I)g(v(t))-eεtwT(t)Ψw(t)<eεtλmin(Ψ)[vT(t)v(t)+gT(v(t))g(v(t))]-eεtwT(t)Ψw(t)≤0.
Consequently, V(t)≤V(0) for all t≥0.
On the other hand,
(24) V(0)=2h∑i=1n∫0vi(0)(s/αi(s))ds+2∑i=1npi∫0vi(0)(gi(s)/αi(s))ds+∑i=1n(qi+∥PB*∥2)∫-τi(0)0eε(τ+s)gi2(vi(s))ds+∑i=1n(ri+∥PC*∥2)∫0+∞ki(ξ)eεξ∫-ξ0eεsgi2(vi(s))dsdξ≤max1≤i≤n{h/α_i}∥v(0)∥22+max1≤i≤n{pili/α_i}∥v(0)∥22+τeετmax1≤i≤n{li2(qi+∥PB*∥2)}sup-τ≤θ≤0∥v(θ)∥22+(1/(2ε))max1≤i≤n{li2}λmin(Ψ)sup-∞<θ≤0∥v(θ)∥22≤a·sup-∞<θ≤0∥v(θ)∥22,
where a=max1≤i≤n{h/α_i}+max1≤i≤n{pili/α_i}+τeετmax1≤i≤n{li2(qi+∥PB*∥2)}+(1/(2ε))max1≤i≤n{li2}λmin(Ψ). Hence, (h/max1≤i≤n{α¯i})eεt∥v(t)∥22≤V(t)≤V(0)≤a·sup-∞<θ≤0∥v(θ)∥22; that is,
(25) ∥u(t)-u*∥2≤(a·max1≤i≤n{α¯i}/h)1/2sup-∞<θ≤0∥ϕ(θ)-u*∥2e-εt/2, t>0.
Combining this with Theorem 8, we conclude that system (2) is globally robustly exponentially stable. The proof is complete.
Remark 10.
Letting P=pI be a positive scalar matrix in Theorem 9, we obtain a robust exponential stability criterion in pure LMI form.
Remark 11.
If α~i(ui(t))≡1, β~i(ui(t))=diui(t) with 0<d_i≤di≤d¯i, and cij=0, then system (2) reduces to the interval Hopfield neural networks (1), which were studied in [4–6]. It can be seen that the main result in [4] is a special case of Theorem 9; therefore, the results obtained in this paper improve those in [4–6]. Our results also generalize some previous ones in [33–35], as mentioned in [4]. In addition, the authors of [19] dealt with the robust exponential stability of CGNNs with time-varying delays, but distributed delays were not taken into account there. Hence, our results are more general than those reported in [19].
Remark 12.
In previous works such as [6, 33–35], ∥B_*∥2 is often used as an ingredient in estimating the bound on ∥B∥2. Considering that B_* is a nonnegative matrix, we develop a new approach based on H-matrix theory. The obtained robust stability criterion involves the matrices B_* and B_*T themselves rather than only their norms, which reduces the conservativeness of the robust results to some extent.
4. Periodic Solutions of Interval CGNNs
In this section, we consider periodic solutions of system (2), in which τi(t) and Ji(t) are continuous periodic functions with period ω; that is, τi(t+ω)=τi(t) and Ji(t+ω)=Ji(t) (i=1,2,…,n).
Theorem 13.
Under assumptions (H1)–(H3), system (2) has an ω-periodic solution which is globally exponentially stable, if the following condition holds:
(H4) ℳ=Γ-D is a nonsingular M-matrix, where
(26) D=(dij)n×n, dij=lj(a^ij+b^ij/(1-δ)+c^ij).
Proof.
Let ui(t,ϕ) and ui(t,ψ) be two solutions of system (2) with initial values ϕ,ψ∈C((-∞,0],Rn), respectively. Since ℳ is a nonsingular M-matrix, ℳT is also a nonsingular M-matrix. It is well known that there exists a positive vector p=(μ1,…,μn)T such that ℳTp>0; that is,
(27) μiγi-∑j=1nμjli(a^ji+b^ji/(1-δ)+c^ji)>0, i=1,2,…,n.
We can choose a constant ε>0 sufficiently small such that
(28) Fi(ε)=μi(γi-ε/α_i)-∑j=1nμjli(a^ji+(b^ji/(1-δ))eετ+c^ji∫0+∞ki(ξ)eεξdξ)>0, i=1,2,…,n.
Denote Xi(t)=|ui(t,ϕ)-ui(t,ψ)|. Define a Lyapunov functional
(29)W(t)=∑i=1nμi(W1i(t)+W2i(t)+W3i(t)),
where
(30) W1i(t)=eεt sgn(ui(t,ϕ)-ui(t,ψ))∫ui(t,ψ)ui(t,ϕ)(1/α~i(s))ds, W2i(t)=∑j=1n(ljb^ij/(1-δ))∫t-τj(t)teε(τ+s)Xj(s)ds, W3i(t)=∑j=1nljc^ij∫0+∞kj(ξ)eεξ∫t-ξteεsXj(s)dsdξ.
Calculating the upper right derivative of W(t) along the solution of (2), together with assumptions (H1)–(H3) and (28), we can derive that
(31) D+W(t)≤eεt∑i=1nμi[(ε/α_i-γi)Xi(t)+∑j=1nlj(a^ij+(b^ij/(1-δ))eετ+c^ij∫0+∞kj(ξ)eεξdξ)Xj(t)]=-eεt∑i=1n[μi(γi-ε/α_i)-∑j=1nμjli(a^ji+(b^ji/(1-δ))eετ+c^ji∫0+∞ki(ξ)eεξdξ)]Xi(t)=-eεt∑i=1nFi(ε)Xi(t)≤0.
Then, we have W(t)≤W(0) for t≥0. On the other hand, it can be readily seen that
(32)W(t)≥m0eεt∑i=1n|ui(t,ϕ)-ui(t,ψ)|,W(0)≤M0sups∈(-∞,0]∑i=1n|ϕi(s)-ψi(s)|,
in which
(33) m0=min1≤i≤n{μi/α¯i}, M0=max1≤i≤n{μi/α_i+∑j=1n[(ljb^ij(eετ-1))/(ε(1-δ))+(ljc^ij/ε)(∫0+∞kj(ξ)eεξdξ-1)]}.
Hence, m0eεt∑i=1n|ui(t,ϕ)-ui(t,ψ)|≤W(t)≤W(0)≤M0sups∈(-∞,0]∑i=1n|ϕi(s)-ψi(s)|. Let M=M0/m0; then
(34)∥u(t,ϕ)-u(t,ψ)∥1≤M∥ϕ-ψ∥1e-εt,t>0.
We can always choose a positive integer N such that e-εNωM≤1/2 and define a Poincaré mapping P:C→C by P(ϕ)=uω(ϕ). It follows from (34) that
(35)∥PNϕ-PNψ∥1≤12∥ϕ-ψ∥1,
which implies that PN is a contraction mapping. Thus, there exists a unique fixed point φ* such that PNφ*=φ*. Note that PN(Pφ*)=P(PNφ*)=Pφ*, so Pφ* is also a fixed point of PN, and hence Pφ*=φ*; that is, uω(φ*)=φ*. Obviously, if u(t,φ*) is the solution of (2) through (0,φ*), then u(t+ω,φ*) is also a solution of (2), and ut+ω(φ*)=ut(uω(φ*))=ut(φ*) for t>0. This shows that u(t,φ*) is exactly an ω-periodic solution of system (2), and all other solutions of (2) converge to it exponentially as t→+∞. This completes the proof.
Remark 14.
The periodic oscillatory behavior of neural networks is of great interest in many applications. For instance, the phenomenon of periodic solutions in neural networks coincides with the fact that learning usually requires repetition; moreover, periodic sequences of neural impulses are of fundamental significance for the control of dynamic functions of the body, such as heartbeat and respiration, which occur with great regularity.
Remark 15.
In [23], the authors studied the existence and attractivity of periodic solutions for two classes of CGNNs with discrete time delays or finite distributed time delays, respectively. In this paper, we incorporate both time-varying delays and infinite distributed delays into CGNNs and derive the uniqueness and global exponential stability of periodic solutions. In [24, 27], two classes of CGNNs with distributed delays were investigated, and sufficient conditions were established to guarantee the uniqueness and global exponential stability of periodic solutions of such networks by using Lyapunov functionals and the properties of M-matrices; however, time-varying delays were ignored in those models. Thus, our results effectually improve or complement the results in [23, 24, 27].
5. Numerical Simulation
In what follows, we give two examples to illustrate the results obtained in Sections 3 and 4.
Example 1.
In system (2), we choose
(36) A_=(-0.3, -0.2; -0.5, -0.6), A¯=(0.3, 0.2; 0.2, 0.1), B_=(-0.8, -0.9; -0.4, -1), B¯=(0.5, 0.6; 0.7, 1), C_=(-0.5, -0.5; -0.3, -0.8), C¯=(0.5, 0.6; 0.3, 1), α~i(ui(t))=2+sin(ui(t)), β~i(ui(t))=5ui(t), J1=-1, J2=-1.5, fj(x)=tanh(x), τj(t)=1-(1/2)e-t, kj(t)=te-t, i,j=1,2.
It is clear that γ1=γ2=5, l1=l2=1, τ=1, and δ=0.5, and (H3) holds for any μ∈(0,1). Solving the LMI (10) with the Matlab optimization toolbox, we obtain
(37)p1=1.6495,p2=1.4828,q1=2,q2=2,r1=1.2143,r2=2.
By Theorem 9, system (2) is globally robustly exponentially stable. To illustrate the theoretical result, we present a simulation with
(38) A=(0.2, 0.1; -0.1, -0.4), B=(0.1, 0.5; 0.6, 0.5), C=(0.2, 0.36; 0.2, 0.8).
One can see that the state vector u(t)=(u1(t),u2(t))T converges to the unique equilibrium point u*=(0.3031,0.4095)T (see Figure 1).
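For reproducibility, a minimal forward-Euler simulation of this example is sketched below (step size, horizon, and the constant initial history are our choices). The distributed delay is realized exactly by the linear chain trick: kj(t)=te-t is the impulse response of two cascaded first-order low-pass filters, so the convolution term can be carried along as two auxiliary ODE states.

```python
import numpy as np

# Data of Example 1, with A, B, C from (38).
A = np.array([[0.2, 0.1], [-0.1, -0.4]])
B = np.array([[0.1, 0.5], [0.6, 0.5]])
C = np.array([[0.2, 0.36], [0.2, 0.8]])
J = np.array([-1.0, -1.5])
alpha = lambda u: 2.0 + np.sin(u)           # amplification, in [1, 3]
beta = lambda u: 5.0 * u                    # behaved function, gamma_i = 5
f = np.tanh                                 # activation, l_i = 1
tau = lambda t: 1.0 - 0.5 * np.exp(-t)      # time-varying delay, tau(t) <= 1

dt, T = 0.001, 20.0
steps = int(T / dt)
hist = int(1.0 / dt)                        # history buffer for max delay 1
u = np.zeros((steps + hist, 2))
u[: hist + 1] = np.array([0.5, -0.5])       # constant initial history
w1, w2 = np.zeros(2), np.zeros(2)           # linear chain: w2 = int k*f(u)

for k in range(hist, steps + hist - 1):
    t = (k - hist) * dt
    ud = u[k - int(tau(t) / dt)]            # delayed state u(t - tau(t))
    rhs = beta(u[k]) - A @ f(u[k]) - B @ f(ud) - C @ w2 + J
    u[k + 1] = u[k] - dt * alpha(u[k]) * rhs
    w1_new = w1 + dt * (f(u[k]) - w1)       # w1' = f(u) - w1
    w2 = w2 + dt * (w1 - w2)                # w2' = w1 - w2
    w1 = w1_new

print(u[-1])   # approaches the equilibrium near (0.3031, 0.4095)
```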
For system (2), we now choose β~i(ui(t))=βui(t). Figure 2 exhibits typical bifurcation and chaos diagrams obtained by fixing the other parameters as in (36) and (38) and taking β as the bifurcation parameter (0.01≤β≤0.3). The diagrams clearly show that system (2) admits rich dynamics, including period-doubling bifurcations and chaos.
Time responses of the state variables u(t) with different initial values in Example 1.
Bifurcation diagrams of system (2), showing the effect of the parameter β on the dynamic behavior.
Example 2.
In system (2), we take τ1(t)=τ2(t)=1, J1(t)=2+sin t, and J2(t)=cos t, and the other parameters are the same as those in (36). One can obtain ℳ=(3.6, -2.6; -2.2, 2.4), which is a nonsingular M-matrix. According to Theorem 13, system (2) has a 2π-periodic solution which is globally exponentially stable. We present a simulation with the parameters in (38) (see Figure 3).
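Condition (H4) and the positive vector p used in (27) can be verified in a few lines; a sketch with the ℳ above (for a nonsingular M-matrix, (ℳT)-1 is entrywise nonnegative, so p=(ℳT)-1·1 is positive and satisfies ℳTp>0):

```python
import numpy as np

M = np.array([[3.6, -2.6], [-2.2, 2.4]])

minors = [np.linalg.det(M[:k, :k]) for k in (1, 2)]
print(minors)            # [3.6, 2.92]: all positive, so M is an M-matrix

p = np.linalg.solve(M.T, np.ones(2))
print(p, M.T @ p)        # p > 0 componentwise and M^T p = (1, 1) > 0
```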
Time responses of the state variables u(t) and phase plot in space (t,u1,u2).
6. Conclusion
In this paper, we discussed a class of interval CGNNs with time-varying delays and infinite distributed delays. By employing H-matrix and M-matrix theory, the Lyapunov functional method, and the LMI approach, sufficient conditions were established for the existence, uniqueness, and global robust exponential stability of the equilibrium point and of the periodic solution of the neural networks. It was shown that the obtained results improve or complement previously published results. Numerical simulations demonstrated the main results and further showed that chaotic phenomena may occur in the system, which accords with the complex dynamics involved in human recognition. On the other hand, it is well known that chaotic synchronization has been successfully applied to secure communication; the chaotic behaviors of neural networks thus suggest that they may be used to construct secure communication systems.
Acknowledgment
This work was supported by the National Natural Science Foundation of China (11071254).
References
[1] W. Han, Y. Liu, and L. Wang, "Robust exponential stability of Markovian jumping neural networks with mode-dependent delay," Communications in Nonlinear Science and Numerical Simulation, vol. 15, no. 9, pp. 2529–2535, 2010.
[2] O. M. Kwon, S. M. Lee, and J. H. Park, "Improved delay-dependent exponential stability for uncertain stochastic neural networks with time-varying delays," Physics Letters A, vol. 374, no. 10, pp. 1232–1241, 2010.
[3] X. Li, "Global robust stability for stochastic interval neural networks with continuously distributed delays of neutral type," Applied Mathematics and Computation, vol. 215, no. 12, pp. 4370–4384, 2010.
[4] J.-L. Shao, T.-Z. Huang, and X.-P. Wang, "Improved global robust exponential stability criteria for interval neural networks with time-varying delays," Expert Systems with Applications, vol. 38, no. 12, pp. 15587–15593, 2011.
[5] J.-L. Shao, T.-Z. Huang, and S. Zhou, "An analysis on global robust exponential stability of neural networks with time-varying delays," Neurocomputing, vol. 72, no. 7–9, pp. 1993–1998, 2009.
[6] J.-L. Shao, T.-Z. Huang, and S. Zhou, "Some improved criteria for global robust exponential stability of neural networks with time-varying delays," Communications in Nonlinear Science and Numerical Simulation, vol. 15, no. 12, pp. 3782–3794, 2010.
[7] F. Wang and H. Wu, "Mean square exponential stability and periodic solutions of stochastic interval neural networks with mixed time delays," Neurocomputing, vol. 73, no. 16–18, pp. 3256–3263, 2010.
[8] W. Zhao and Q. Zhu, "New results of global robust exponential stability of neural networks with delays," Nonlinear Analysis: Real World Applications, vol. 11, no. 2, pp. 1190–1197, 2010.
[9] H. Liu, Y. Ou, and J. Hu, "Delay-dependent stability analysis for continuous-time BAM neural networks with Markovian jumping parameters," vol. 23, no. 3, pp. 315–321, 2010.
[10] J. Pan, X. Liu, and S. Zhong, "Stability criteria for impulsive reaction-diffusion Cohen-Grossberg neural networks with time-varying delays," Mathematical and Computer Modelling, vol. 51, no. 9-10, pp. 1037–1050, 2010.
[11] J. Tian and S. Zhong, "Improved delay-dependent stability criterion for neural networks with time-varying delay," Applied Mathematics and Computation, vol. 217, no. 24, pp. 10278–10288, 2011.
[12] H. Wang, Q. Song, and C. Duan, "LMI criteria on exponential stability of BAM neural networks with both time-varying delays and general activation functions," Mathematics and Computers in Simulation, vol. 81, no. 4, pp. 837–850, 2010.
[13] X. Zhang, S. Wu, and K. Li, "Delay-dependent exponential stability for impulsive Cohen-Grossberg neural networks with time-varying delays and reaction-diffusion terms," Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 3, pp. 1524–1532, 2011.
[14] X. Fu and X. Li, "LMI conditions for stability of impulsive stochastic Cohen-Grossberg neural networks with mixed delays," Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 1, pp. 435–454, 2011.
[15] K. Li, "Stability analysis for impulsive Cohen-Grossberg neural networks with time-varying delays and distributed delays," Nonlinear Analysis: Real World Applications, vol. 10, no. 5, pp. 2784–2798, 2009.
[16] B. Zhou, Q. Song, and H. Wang, "Global exponential stability of neural networks with discrete and distributed delays and general activation functions on time scales," Neurocomputing, vol. 74, no. 17, pp. 3142–3150, 2011.
[17] P. Balasubramaniam and M. S. Ali, "Robust exponential stability of uncertain fuzzy Cohen-Grossberg neural networks with time-varying delays," Fuzzy Sets and Systems, vol. 161, no. 4, pp. 608–618, 2010.
[18] W. Su and Y. Chen, "Global robust stability criteria of stochastic Cohen-Grossberg neural networks with discrete and distributed time-varying delays," Communications in Nonlinear Science and Numerical Simulation, vol. 14, no. 2, pp. 520–528, 2009.
[19] Z. Wang, H. Zhang, and W. Yu, "Robust stability criteria for interval Cohen-Grossberg neural networks with time varying delay," Neurocomputing, vol. 72, no. 4–6, pp. 1105–1110, 2009.
[20] Z. Huang and Y. Xia, "Exponential periodic attractor of impulsive BAM networks with finite distributed delays," Chaos, Solitons & Fractals, vol. 39, no. 1, pp. 373–384, 2009.
[21] S. Townley, A. Ilchmann, M. G. Weiss, W. McClements, A. C. Ruiz, D. H. Owens, and D. Prätzel-Wolters, "Existence and learning of oscillations in recurrent neural networks," IEEE Transactions on Neural Networks, vol. 11, no. 1, pp. 205–214, 2000.
[22] X. Chen and Q. Song, "Global exponential stability of the periodic solution of delayed Cohen-Grossberg neural networks with discontinuous activations," Neurocomputing, vol. 73, no. 16–18, pp. 3097–3104, 2010.
[23] C.-H. Li and S.-Y. Yang, "Existence and attractivity of periodic solutions to non-autonomous Cohen-Grossberg neural networks with time delays," Chaos, Solitons & Fractals, vol. 41, no. 3, pp. 1235–1244, 2009.
[24] Q. Liu and R. Xu, "Periodic solutions of high-order Cohen-Grossberg neural networks with distributed delays," Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 7, pp. 2887–2893, 2011.
[25] J. Pan and Y. Zhan, "On periodic solutions to a class of non-autonomously delayed reaction-diffusion neural networks," Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 1, pp. 414–422, 2011.
[26] H. Xiang and J. Cao, "Exponential stability of periodic solution to Cohen-Grossberg-type BAM networks with time-varying delays," Neurocomputing, vol. 72, no. 7–9, pp. 1702–1711, 2009.
[27] Q. Liu and R. Xu, "Periodic solutions of a Cohen-Grossberg-type BAM neural networks with distributed delays and impulses," vol. 2012, Article ID 643418, 17 pages, 2012.
[28] J. C. Principe, J.-M. Kuo, and S. Celebi, "An analysis of the gamma memory in dynamic neural networks," IEEE Transactions on Neural Networks, vol. 5, no. 2, pp. 331–337, 1994.
[29] J. Zhang, "Global exponential stability of interval neural networks with variable delays," Applied Mathematics Letters, vol. 19, no. 11, pp. 1222–1227, 2006.
[30] R. A. Horn and C. R. Johnson, Topics in Matrix Analysis, Cambridge University Press, Cambridge, UK, 1991.
[31] H. Zhang, Z. Wang, and D. Liu, "Robust exponential stability of recurrent neural networks with multiple time-varying delays," IEEE Transactions on Circuits and Systems II, vol. 54, no. 8, pp. 730–734, 2007.
[32] Y. Du and R. Xu, "Global robust exponential stability analysis for interval neural networks with mixed delays," vol. 2012, Article ID 647231, 18 pages, 2012.
[33] T. Ensari and S. Arik, "New results for robust stability of dynamical neural networks with discrete time delays," Expert Systems with Applications, vol. 37, no. 8, pp. 5925–5930, 2010.
[34] N. Ozcan and S. Arik, "Global robust stability analysis of neural networks with multiple time delays," IEEE Transactions on Circuits and Systems I, vol. 53, no. 1, pp. 166–176, 2006.
[35] V. Singh, "Improved global robust stability criterion for delayed neural networks," Chaos, Solitons & Fractals, vol. 31, no. 1, pp. 224–229, 2007.