We consider a general system of first-order nonlinear ordinary differential equations whose nonlinearities involve distributed delays in addition to the states. In turn, the distributed delays involve nonlinear functions of the different variables and states. An explicit bound for solutions is obtained under rather reasonable conditions. Several special cases of this system may be found in neural network theory. As a direct application of our result, we show how to obtain global existence and, more importantly, convergence to zero at an exponential rate in a certain norm. All these nonlinearities (including the activation functions) may be non-Lipschitz and unbounded.
1. Introduction
Of concern is the following system:
(1) x_i'(t) = -a_i(t)\,x_i(t) + \sum_{j=1}^{m} f_{ij}\Big(t,\, x_j(t),\, \int_{-\infty}^{t} K_{ij}(t,s,x_j(s))\,ds\Big) + c_i(t),
with continuous initial data x_j(t) = x_{0j}(t), t ∈ (−∞, 0], coefficients a_i(t) ≥ 0, and inputs c_i(t), i = 1, …, m. The functions f_{ij} and K_{ij} are nonlinear continuous functions. This is a general nonlinear version of several systems that arise in many applications (see [1–9] and Section 4 below).
The literature is rich in works on the asymptotic behavior of solutions for special cases of system (1) (see, for instance, [10–19]). Here the integral terms represent distributed delays, but discrete delays may be recovered as well by taking Dirac delta distributions as kernels. Various sufficient conditions on the coefficients, the functions, and the kernels have been established ensuring convergence to equilibrium or (uniform, global, asymptotic) stability. In applications it is important to have global asymptotic stability at a rapid rate, ideally an exponential one. Roughly speaking, it has been assumed that the coefficients a_i(t) must dominate the coefficients of certain "bad" similar terms that appear in the estimations. For the nonlinearities (activation functions), the early assumptions of boundedness, monotonicity, and differentiability have all been weakened to a Lipschitz condition. According to [8, 20] and other references, even this condition needs to be weakened further. Unfortunately, only a few papers treat continuous but not Lipschitz continuous activation functions. Assumptions such as partial Lipschitz continuity with linear growth, α-inverse Hölder continuity (inverse Lipschitz continuity), and boundedness without Lipschitz continuity have been used (see [16, 21, 22]).
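For instance, the formal choice K_{ij}(t, s, x) = δ(s − (t − τ_{ij})) g_{ij}(x), where δ is the Dirac distribution, τ_{ij} > 0, and g_{ij} is a given activation function, yields
\int_{-\infty}^{t} K_{ij}(t, s, x_j(s))\,ds = g_{ij}(x_j(t - \tau_{ij})),
so that systems with discrete delays τ_{ij} are contained in (1).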
For Hölder continuous activation functions we refer the reader to [23], where exponential stability was proved under certain boundedness and monotonicity conditions on the activation functions and the assumption that the coefficient matrix is Lyapunov diagonally stable (see also [24, 25] for other results without these conditions).
There is, however, a good number of papers dealing with discontinuous activation functions under stronger conditions such as the M-matrix condition, the LMI (linear matrix inequality) condition, and extra assumptions on the matrices together with growth conditions on the activation functions (see [20, 26–37]). Global asymptotic stability of periodic solutions has been investigated, for instance, in [38, 39].
Here we assume that the functions f_{ij} and K_{ij} are (or are bounded by) continuous monotone nondecreasing functions that are not necessarily Lipschitz continuous and may be unbounded (like power-type functions with exponents greater than one). We prove that, for sufficiently small initial data, solutions decay to zero exponentially.
Local existence and global existence are standard; see the Gronwall-type Lemma 1 below and the estimation in our theorem. The uniqueness of the equilibrium is not an issue here (even in the case of constant coefficients), as we are concerned with convergence to zero rather than stability of an equilibrium.
After the Preliminaries section, where we present our main hypotheses and the main lemma used in our proof, we state and prove the convergence result in Section 3. That section ends with some corollaries and important remarks. In the last section we give an application, describing where systems of this type (or special cases of them) appear in real-world problems.
2. Preliminaries
Our first hypothesis (H1) is
(2) \Big| f_{ij}\Big(t,\, x_j(t),\, \int_{-\infty}^{t} K_{ij}(t,s,x_j(s))\,ds\Big) \Big| \le b_{ij}(t)\,|x_j(t)|^{\alpha_{ij}} \Big( \int_{-\infty}^{t} l_{ij}(t-s)\,\psi_{ij}(|x_j(s)|)\,ds \Big)^{\beta_{ij}}, \quad i,j = 1,\dots,m,
where b_{ij} are nonnegative continuous functions, l_{ij} are nonnegative continuously differentiable functions, ψ_{ij} are nonnegative nondecreasing continuous functions, and α_{ij}, β_{ij} ≥ 0, i, j = 1, …, m. The interesting cases are those where the α_{ij} and β_{ij} are all nonzero.
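As a simple illustration (with hypothetical exponents), the choices f_{ij}(t, x, y) = b_{ij}(t)\, x|x|^{α_{ij}-1} |y|^{β_{ij}} and K_{ij}(t, s, x) = l_{ij}(t-s)\, ψ_{ij}(|x|), with ψ_{ij}(u) = u^{p_{ij}}, p_{ij} ≥ 1, satisfy (2) with equality; for α_{ij} > 1 or p_{ij} > 1 these nonlinearities are unbounded and not globally Lipschitz continuous.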
Let I ⊂ R, and let g_1, g_2 : I → R∖{0}. We write g_1 ∝ g_2 if g_2/g_1 is nondecreasing on I. This ordering, as well as the monotonicity condition, may be dropped, as is mentioned in Remark 8 below.
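For example, on (0, ∞) we have u ∝ u^2 ∝ u^3, since the quotients u^2/u = u and u^3/u^2 = u are nondecreasing there.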
Lemma 1 (see [40]).
Let a(t) be a positive continuous function on J := [α, β); let k_j(t), j = 1, …, n, be nonnegative continuous functions on J; let g_j(u), j = 1, …, n, be nondecreasing continuous functions on R_+ with g_j(u) > 0 for u > 0; and let u(t) be a nonnegative continuous function on J. If g_1 ∝ g_2 ∝ ⋯ ∝ g_n on (0, ∞), then the inequality
(3) u(t) \le a(t) + \sum_{j=1}^{n} \int_{\alpha}^{t} k_j(s)\, g_j(u(s))\, ds, \quad t \in J,
implies that
(4) u(t) \le \omega_n(t), \quad \alpha \le t < \beta_0,
where ω_0(t) := sup_{α≤s≤t} a(s),
(5) \omega_j(t) := G_j^{-1}\Big[ G_j(\omega_{j-1}(t)) + \int_{\alpha}^{t} k_j(s)\, ds \Big], \quad j = 1,\dots,n, \qquad G_j(u) := \int_{u_j}^{u} \frac{dx}{g_j(x)}, \quad u > 0 \ (u_j > 0,\ j = 1,\dots,n),
and β_0 ≤ β is chosen so that the functions ω_j(t), j = 1, …, n, are defined for α ≤ t < β_0.
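For orientation, when n = 1 and g_1(u) = u, Lemma 1 reduces to the classical Gronwall–Bellman inequality: here G_1(u) = ln(u/u_1), so
\omega_1(t) = G_1^{-1}\Big[ \ln\frac{\omega_0(t)}{u_1} + \int_{\alpha}^{t} k_1(s)\, ds \Big] = \omega_0(t) \exp\Big( \int_{\alpha}^{t} k_1(s)\, ds \Big),
which is defined for all t, so that one may take β_0 = β.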
In our case we will need the following notation and hypotheses.
(H2) Assume that ψ_{ij}(u) > 0 for u > 0 and that the set of functions u^{α_{ij}+β_{ij}}, ψ_{ij}(u), i, j = 1, …, m, may be ordered (after relabelling) as h_1 ∝ h_2 ∝ ⋯ ∝ h_n. Their corresponding coefficients, namely b̃_{ij}(t) := exp[∫_0^t a(σ) dσ] b_{ij}(t) (with a(t) := min_{1≤i≤m} a_i(t)) and l_{ij}(0), will be renamed λ_k, k = 1, …, n.
We define x(t) := ∑_{i=1}^{m} |x_i(t)|, t > 0, and x_0(t) := ∑_{i=1}^{m} |x_{0i}(t)|, t ≤ 0,
(6) c(t) := \int_0^t \exp\Big[ \int_0^s a(\sigma)\, d\sigma \Big] \sum_{i=1}^{m} |c_i(s)|\, ds, \quad t > 0,
\omega_0(t) := x_0(0) + \sum_{i,j=1}^{m} \int_{-\infty}^{0} l_{ij}(-\sigma)\, \psi_{ij}(x_0(\sigma))\, d\sigma + c(t),
\omega_j(t) := H_j^{-1}\Big[ H_j(\omega_{j-1}(t)) + \int_0^t \lambda_j(s)\, ds \Big], \quad j = 1,\dots,n,
H_j(u) := \int_{u_j}^{u} \frac{dx}{h_j(x)}, \quad u > 0 \ (u_j > 0,\ j = 1,\dots,n),
\tilde\omega_0(t) := \omega_0(0) + \sum_{i,j=1}^{m} \int_0^{\infty} |l_{ij}'(s)| \int_{-s}^{0} \psi_{ij}(u_0(\sigma))\, d\sigma\, ds,
u_0(\sigma) := \sum_{i,j=1}^{m} \int_{-\infty}^{\sigma} l_{ij}(\sigma-\tau)\, \psi_{ij}(x_0(\tau))\, d\tau, \quad \sigma < 0,
\tilde\omega_j(t) := H_j^{-1}\Big[ H_j(\tilde\omega_{j-1}(t)) + \int_0^t \tilde\lambda_j(s)\, ds \Big], \quad j = 1,\dots,n,
where the λ̃_j are the relabelled coefficients corresponding to b̃_{ij}(t) and l_{ij}(0) + ∫_0^∞ |l'_{ij}(σ)| dσ.
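To illustrate the relabelling in (H2) (a hypothetical instance), take m = 1, α_{11} + β_{11} = 2, and ψ_{11}(u) = u^3. Then u^2 ∝ u^3 on (0, ∞), so we may take n = 2 with h_1(u) = u^2, λ_1(t) = b̃_{11}(t), and h_2(u) = u^3, λ_2 = l_{11}(0) (respectively λ̃_2 = l_{11}(0) + ∫_0^∞ |l'_{11}(σ)| dσ).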
3. Exponential Convergence
In this section we prove that solutions converge to zero exponentially provided that the initial data are small enough.
Theorem 2.
Assume that hypotheses (H1) and (H2) hold and that ∫_{-∞}^{0} l_{ij}(-σ) ψ_{ij}(x_0(σ)) dσ < ∞, i, j = 1, …, m. Then, (a) if l'_{ij}(t) ≤ 0, i, j = 1, …, m, there exists β_0 > 0 such that
(7) x(t) \le \omega_n(t) \exp\Big[ -\int_0^t a(s)\, ds \Big], \quad 0 \le t < \beta_0.
(b) If the l'_{ij}(t), i, j = 1, …, m, are of arbitrary signs but summable and the integral term in ω̃_0(t) is convergent, then there exists β_1 > 0 such that the conclusion of (a) holds on 0 ≤ t < β_1 with ω̃_n instead of ω_n.
Proof.
It is easy to see from (1) and the assumption (H1) that for t>0 and i=1,…,m we have
(8) D^+|x_i(t)| \le -a_i(t)\,|x_i(t)| + \sum_{j=1}^{m} \Big| f_{ij}\Big(t,\, x_j(t),\, \int_{-\infty}^{t} K_{ij}(t,s,x_j(s))\,ds\Big) \Big| + |c_i(t)|,
or, for t>0,
(9) D^+x(t) \le -\min_{1\le i\le m}\{a_i(t)\}\, x(t) + \sum_{i,j=1}^{m} b_{ij}(t)\,|x_j(t)|^{\alpha_{ij}} \Big( \int_{-\infty}^{t} l_{ij}(t-s)\,\psi_{ij}(|x_j(s)|)\,ds \Big)^{\beta_{ij}} + \sum_{i=1}^{m} |c_i(t)|,
where D+ denotes the right Dini derivative. Hence
(10) D^+x(t) \le -a(t)\, x(t) + \sum_{i,j=1}^{m} b_{ij}(t)\,|x(t)|^{\alpha_{ij}} \Big( \int_{-\infty}^{t} l_{ij}(t-s)\,\psi_{ij}(x(s))\,ds \Big)^{\beta_{ij}} + \sum_{i=1}^{m} |c_i(t)|, \quad t > 0,
and consequently, multiplying by the integrating factor exp[∫_0^t a(s) ds],
(11) D^+\Big\{ x(t) \exp\Big[ \int_0^t a(s)\, ds \Big] \Big\} \le \exp\Big[ \int_0^t a(s)\, ds \Big] \sum_{i,j=1}^{m} b_{ij}(t)\,|x(t)|^{\alpha_{ij}} \Big( \int_{-\infty}^{t} l_{ij}(t-s)\,\psi_{ij}(x(s))\,ds \Big)^{\beta_{ij}} + \exp\Big[ \int_0^t a(s)\, ds \Big] \sum_{i=1}^{m} |c_i(t)|, \quad t > 0.
Thus (by a comparison theorem in [41])
(12) \tilde{x}(t) \le x(0) + c(t) + \sum_{i,j=1}^{m} \int_0^t \tilde{b}_{ij}(s)\,|x(s)|^{\alpha_{ij}} \Big( \int_{-\infty}^{s} l_{ij}(s-\sigma)\,\psi_{ij}(x(\sigma))\,d\sigma \Big)^{\beta_{ij}} ds, \quad t > 0,
where
(13) \tilde{x}(t) := x(t) \exp\Big[ \int_0^t a(s)\, ds \Big].
Let y(t) denote the right-hand side of (12). Clearly x̃(t) ≤ y(t) for t > 0, and for t > 0
(14) D^+y(t) = D^+c(t) + \sum_{i,j=1}^{m} \tilde{b}_{ij}(t)\,|x(t)|^{\alpha_{ij}} \Big( \int_{-\infty}^{t} l_{ij}(t-\sigma)\,\psi_{ij}(x(\sigma))\,d\sigma \Big)^{\beta_{ij}}.
We designate by z_{ij}(t) the integral term in (14); that is,
(15) z_{ij}(t) := \int_{-\infty}^{t} l_{ij}(t-\sigma)\,\psi_{ij}(x(\sigma))\,d\sigma,
and z(t) := ∑_{i,j=1}^{m} z_{ij}(t). A differentiation of z(t) gives
(16) z'(t) = \sum_{i,j=1}^{m} l_{ij}(0)\,\psi_{ij}(x(t)) + \sum_{i,j=1}^{m} \int_{-\infty}^{t} l_{ij}'(t-\sigma)\,\psi_{ij}(x(\sigma))\,d\sigma.
(a) Consider l'_{ij}(t) ≤ 0, i, j = 1, …, m.
In this situation (of fading memory) we see from (14) and (16) that, if u(t) := y(t) + z(t), then (noting that x(t) ≤ x̃(t) ≤ y(t) ≤ u(t), z_{ij}(t) ≤ u(t), and the ψ_{ij} are nondecreasing)
(17) D^+u(t) \le D^+c(t) + \sum_{i,j=1}^{m} \Big[ \tilde{b}_{ij}(t)\, u(t)^{\alpha_{ij}+\beta_{ij}} + l_{ij}(0)\,\psi_{ij}(u(t)) \Big], \quad t > 0.
Therefore
(18) u(t) \le u(0) + c(t) + \sum_{i,j=1}^{m} \int_0^t \Big[ \tilde{b}_{ij}(s)\, u(s)^{\alpha_{ij}+\beta_{ij}} + l_{ij}(0)\,\psi_{ij}(u(s)) \Big] ds, \quad t > 0,
where u(0) = x(0) + ∑_{i,j=1}^{m} ∫_{-∞}^{0} l_{ij}(-σ) ψ_{ij}(x_0(σ)) dσ. Now we can apply Lemma 1 to obtain
(19) \tilde{x}(t) \le u(t) \le \omega_n(t), \quad 0 \le t < \beta_0,
with ω_0(t) = u(0) + c(t) and ω_n(t) as in Section 2; estimate (7) now follows from (19) and the definition (13) of x̃(t).
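For a concrete illustration (with hypothetical exponents), take m = 1 and ψ_{11}(u) = u^{1+γ} with 1 + γ = α_{11} + β_{11}, γ > 0, so that both nonlinearities coincide and we may take n = 1, h(u) = u^{1+γ}, and λ(t) = b̃_{11}(t) + l_{11}(0). Then
H(u) = \int_{u_1}^{u} \frac{dx}{x^{1+\gamma}} = \frac{1}{\gamma}\big( u_1^{-\gamma} - u^{-\gamma} \big),
and (19) yields the explicit bound
\tilde{x}(t) \le \omega_1(t) = \omega_0(t) \Big[ 1 - \gamma\, \omega_0(t)^{\gamma} \int_0^t \lambda(s)\, ds \Big]^{-1/\gamma},
valid as long as the bracket stays positive; β_0 is the first time it vanishes, so small data (small ω_0) enlarge the interval [0, β_0).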
(b) Consider l'_{ij}(t), i, j = 1, …, m, of arbitrary signs.
From expressions (14) and (16) we derive that
(20) D^+u(t) \le D^+c(t) + \sum_{i,j=1}^{m} \Big[ \tilde{b}_{ij}(t)\, u(t)^{\alpha_{ij}+\beta_{ij}} + l_{ij}(0)\,\psi_{ij}(u(t)) \Big] + \sum_{i,j=1}^{m} \int_0^{\infty} |l_{ij}'(\sigma)|\,\psi_{ij}(u(t-\sigma))\,d\sigma, \quad t > 0.
The derivative of the auxiliary function
(21) \tilde{u}(t) = u(t) + \sum_{i,j=1}^{m} \int_0^{\infty} |l_{ij}'(s)| \int_{t-s}^{t} \psi_{ij}(u(\sigma))\,d\sigma\,ds, \quad t \ge 0,
satisfies (using (20))
(22) D^+\tilde{u}(t) = D^+u(t) + \sum_{i,j=1}^{m} \int_0^{\infty} |l_{ij}'(s)| \big[ \psi_{ij}(u(t)) - \psi_{ij}(u(t-s)) \big] ds
\le D^+c(t) + \sum_{i,j=1}^{m} \Big[ \tilde{b}_{ij}(t)\, u(t)^{\alpha_{ij}+\beta_{ij}} + l_{ij}(0)\,\psi_{ij}(u(t)) \Big] + \sum_{i,j=1}^{m} \int_0^{\infty} |l_{ij}'(\sigma)|\,\psi_{ij}(u(t-\sigma))\,d\sigma + \sum_{i,j=1}^{m} \int_0^{\infty} |l_{ij}'(s)| \big[ \psi_{ij}(u(t)) - \psi_{ij}(u(t-s)) \big] ds
\le D^+c(t) + \sum_{i,j=1}^{m} \Big\{ \tilde{b}_{ij}(t)\, \tilde{u}(t)^{\alpha_{ij}+\beta_{ij}} + \Big[ l_{ij}(0) + \int_0^{\infty} |l_{ij}'(s)|\,ds \Big] \psi_{ij}(\tilde{u}(t)) \Big\}, \quad t > 0.
Therefore
(23) \tilde{u}(t) \le \tilde{u}(0) + c(t) + \sum_{i,j=1}^{m} \int_0^t \Big\{ \tilde{b}_{ij}(s)\, \tilde{u}(s)^{\alpha_{ij}+\beta_{ij}} + \Big[ l_{ij}(0) + \int_0^{\infty} |l_{ij}'(\sigma)|\,d\sigma \Big] \psi_{ij}(\tilde{u}(s)) \Big\} ds,
with
(24) \tilde{u}(0) = x(0) + \sum_{i,j=1}^{m} \int_{-\infty}^{0} l_{ij}(-\sigma)\,\psi_{ij}(x_0(\sigma))\,d\sigma + \sum_{i,j=1}^{m} \int_0^{\infty} |l_{ij}'(s)| \int_{-s}^{0} \psi_{ij}(u_0(\sigma))\,d\sigma\,ds,
u_0(\sigma) = z(\sigma) = \sum_{i,j=1}^{m} z_{ij}(\sigma) = \sum_{i,j=1}^{m} \int_{-\infty}^{\sigma} l_{ij}(\sigma-\tau)\,\psi_{ij}(x_0(\tau))\,d\tau, \quad \sigma < 0.
Applying Lemma 1 to (23) we obtain
(25) \tilde{x}(t) \le \tilde{u}(t) \le \tilde\omega_n(t), \quad 0 \le t < \beta_1,
and hence
(26) \tilde{x}(t) \le \tilde\omega_n(t), \quad 0 \le t < \beta_1,
where ω̃_0(t) := ũ(0) and
(27) \tilde\omega_j(t) := H_j^{-1}\Big[ H_j(\tilde\omega_{j-1}(t)) + \int_0^t \tilde\lambda_j(s)\,ds \Big], \quad j = 1,\dots,n,
and β_1 is chosen so that the functions ω̃_j(t), j = 1, …, n, are defined for 0 ≤ t < β_1. This completes the proof.
Corollary 3.
If, in addition to the hypotheses of the theorem, we assume that
(28) \int_0^{\infty} \chi_k(s)\,ds \le \int_{\omega_{k-1}}^{\infty} \frac{dz}{h_k(z)}, \quad k = 1,\dots,n, \qquad \chi_k(s) = \lambda_k(s) \ \text{or} \ \tilde\lambda_k(s),
then we have global existence of solutions.
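In the power case described after (19) (h(z) = z^{1+γ}, an illustrative single-nonlinearity setting, with ω_0 understood as the relevant bound on ω_0(t)), condition (28) reads
\int_0^{\infty} \lambda(s)\,ds \le \int_{\omega_0}^{\infty} \frac{dz}{z^{1+\gamma}} = \frac{1}{\gamma\,\omega_0^{\gamma}},
a smallness condition coupling the size of the data (through ω_0) and of the coefficients (through the integral of λ).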
Corollary 4.
If, in addition to the hypotheses of the theorem, we assume that ω_n(t) (resp. ω̃_n(t)) grows at most polynomially (or merely more slowly than exp[∫_0^t a(s) ds]), then solutions decay at an exponential rate provided that ∫_0^t a(s) ds → ∞ as t → ∞.
Corollary 5.
In addition to the hypotheses of the theorem, assume that l'_{ij}(t) ≤ L_{ij} l_{ij}(t), i, j = 1, …, m, for some positive constants L_{ij}, and that the ψ_{ij} are in the class H (that is, ψ_{ij}(αu) ≤ ξ_{ij} ψ_{ij}(u), α > 0, u > 0, i, j = 1, …, m). Then solutions are bounded by a function of the form exp[−(∫_0^t a(s) ds − Lt)], where L = max{L_{ij}, i, j = 1, …, m}.
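A simple non-monotone kernel satisfying the assumption of Corollary 5 (an illustrative choice) is l_{ij}(t) = e^{-\delta t}(2 + \sin \nu t) with δ, ν > 0. Indeed,
l_{ij}'(t) = e^{-\delta t}\big( -\delta(2 + \sin \nu t) + \nu \cos \nu t \big) \le \nu\, e^{-\delta t}(2 + \sin \nu t) = \nu\, l_{ij}(t),
so one may take L_{ij} = ν. Note that case (a) of the theorem does not cover this kernel, since l'_{ij} changes sign.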
Remark 6.
We have assumed that the α_{ij} and β_{ij} are greater than one, but the case when they are smaller than one may be treated similarly. When their sum is smaller than one, we have global existence without adding any extra condition.
Remark 7.
The decay rate obtained in Corollary 5 is to be compared with the one in case (b) of the theorem. The estimate in Corollary 5 appears to hold for more general initial data (not as small as those in case (b)). However, its decay rate is smaller than the one in (b), and it requires in addition that ∫_0^t a(s) ds − Lt → ∞ as t → ∞.
Remark 8.
If we replace the functions g_k by the following new functions, then the monotonicity condition and the ordering imposed in the theorem may be dropped:
(29) \phi_1(t) := \max_{0\le s\le t} g_1(s), \qquad \phi_k(t) := \max_{0\le s\le t}\Big\{ \frac{g_k(s)}{\phi_{k-1}(s)} \Big\}\,\phi_{k-1}(t), \quad k = 2,\dots,n.
Indeed, each φ_k is then nondecreasing with g_k ≤ φ_k, and the quotient φ_k(t)/φ_{k-1}(t) = max_{0≤s≤t}{g_k(s)/φ_{k-1}(s)} is nondecreasing, that is, φ_{k-1} ∝ φ_k.
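As a small illustration (a hypothetical choice): for g_1(s) = e^{-s}, which is decreasing, the construction gives φ_1(t) = max_{0≤s≤t} e^{-s} = 1, the smallest nondecreasing majorant of g_1 on [0, t].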
4. Application
(Artificial) neural networks are built in an attempt to perform tasks the way the nervous system does. Typically, a neural network consists of several layers (an input layer, hidden layers, and an output layer). Each layer contains one or more cells (neurons) with many connections between them. The cells in one layer receive inputs from the previous layer, apply some transformations, and send the results to the cells of the subsequent layer.
One may encounter neural networks in many fields such as control, pattern matching, settlement of structures, classification of soil, supply chain management, engineering design, market segmentation, product analysis, market development forecasting, signature verification, bond rating, recognition of diseases, robust pattern detection, text mining, price forecast, botanical classification, and scheduling optimization.
Neural networks can not only perform many of the tasks a traditional computer can do, but also excel at, for instance, classifying incomplete or noisy data, predicting future events, and generalizing.
The system (1) is a general version of simpler systems that appear in neural network theory [1–9], such as
(30) x_i'(t) = -a_i x_i(t) + \sum_{j=1}^{m} f_{ij}(x_j(t)) + c_i(t),
or
(31) x_i'(t) = -a_i x_i(t) + \sum_{j=1}^{m} \int_{-\infty}^{t} l_{ij}(t-s)\, f_{ij}(x_j(s))\,ds + c_i(t).
It is well established by now that (for constant coefficients and constant c_i(t)) solutions converge exponentially to the equilibrium. Notice that zero is not an equilibrium in our case. The equilibrium exists and is unique when the activation functions are Lipschitz continuous. Our system is much more general, and the activation functions as well as the other nonlinearities are not necessarily Lipschitz continuous. However, in the case of Lipschitz continuity and existence of a unique equilibrium, we expect to obtain exponential stability by the standard techniques, at least when we start away from zero.
For the system
(32) x_i'(t) = -a_i x_i(t) + \sum_{j=1}^{m} b_{ij}\,|x_j(t)|^{\alpha_{ij}} \Big( \int_{-\infty}^{t} l_{ij}(t-s)\,\psi_{ij}(|x_j(s)|)\,ds \Big)^{\beta_{ij}} + c_i(t),
(where the ψ_{ij} may be taken to be power functions; see also Corollary 5), our theorem gives sufficient conditions guaranteeing the estimate
(33) x(t) \le \omega_n(t) \exp\Big[ -\int_0^t a(s)\,ds \Big], \quad 0 \le t < \beta_0.
Then, Corollaries 3 and 4 provide practical situations where we have global existence and decay to zero at an exponential rate.
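As a quick numerical illustration (a minimal sketch, not part of the analysis above; the kernel, exponents, and parameter values are hypothetical choices), consider a scalar instance of (32) with kernel l(t) = e^{-t} and ψ(u) = u^p. For this kernel the delay term z(t) = ∫_{-∞}^{t} e^{-(t-s)} |x(s)|^p ds obeys z'(t) = -z(t) + |x(t)|^p, so the delay system reduces to a two-dimensional ODE that can be integrated by a simple Euler scheme:

    import math

    # Scalar instance of (32): x' = -a*x + b*|x|**alpha * z**beta, with
    # z(t) = int_{-inf}^t e^{-(t-s)} |x(s)|**p ds, hence z' = -z + |x|**p.
    # All parameter values below are illustrative, not from the paper.
    a, b = 2.0, 0.5
    alpha, beta, p = 2.0, 1.0, 2.0
    x0 = 0.1                      # small constant history x(s) = x0, s <= 0
    dt, T = 1.0e-3, 10.0

    x, z, t = x0, x0 ** p, 0.0    # z(0) = int_{-inf}^0 e^s * x0**p ds = x0**p
    steps_per_unit = int(round(1.0 / dt))
    for n in range(int(T / dt)):
        dx = -a * x + b * abs(x) ** alpha * z ** beta   # input c(t) = 0 here
        dz = -z + abs(x) ** p
        x, z, t = x + dt * dx, z + dt * dz, t + dt
        if (n + 1) % steps_per_unit == 0:
            # x * exp(a*t) is the weighted quantity bounded in (33)
            print(f"t={t:5.2f}  x={x:.3e}  x*exp(a*t)={x * math.exp(a * t):.3e}")

For small x0 the printed quantity x(t)e^{at} remains bounded, in agreement with (33), while larger initial data may lead to blow-up in finite time (a finite β_0).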
Conflict of Interests
The author declares that there is no conflict of interests regarding the publication of this paper.
Acknowledgment
The author is grateful for the financial support and the facilities provided by King Fahd University of Petroleum and Minerals through Grant no. IN111052.
References
[1] J. Cao, K. Yuan, and H.-X. Li, "Global asymptotical stability of recurrent neural networks with multiple discrete delays and distributed delays," IEEE Transactions on Neural Networks, vol. 17, no. 6, pp. 1646–1651, 2006. doi:10.1109/TNN.2006.881488
[2] B. Crespi, "Storage capacity of non-monotonic neurons," Neural Networks, vol. 12, no. 10, pp. 1377–1389, 1999. doi:10.1016/S0893-6080(99)00074-X
[3] G. de Sandre, M. Forti, P. Nistri, and A. Premoli, "Dynamical analysis of full-range cellular neural networks by exploiting differential variational inequalities," IEEE Transactions on Circuits and Systems I, vol. 54, no. 8, pp. 1736–1749, 2007. doi:10.1109/TCSI.2007.902607
[4] C. Feng and R. Plamondon, "On the stability analysis of delayed neural networks systems," Neural Networks, vol. 14, no. 9, pp. 1181–1188, 2001. doi:10.1016/S0893-6080(01)00088-0
[5] J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proceedings of the National Academy of Sciences of the USA, vol. 79, no. 8, pp. 2554–2558, 1982. doi:10.1073/pnas.79.8.2554
[6] J. J. Hopfield and D. W. Tank, "Computing with neural circuits: a model," Science, vol. 233, no. 4764, pp. 625–633, 1986. doi:10.1126/science.3755256
[7] J. I. Inoue, "Retrieval phase diagrams of non-monotonic Hopfield networks," Journal of Physics A: Mathematical and General, vol. 29, no. 16, pp. 4815–4826, 1996. doi:10.1088/0305-4470/29/16/008
[8] B. Kosko, Neural Networks and Fuzzy Systems, Prentice-Hall of India, New Delhi, India, 1991.
[9] H.-F. Yanai and S.-I. Amari, "Auto-associative memory with two-stage dynamics of nonmonotonic neurons," IEEE Transactions on Neural Networks, vol. 7, no. 4, pp. 803–815, 1996. doi:10.1109/72.508925
[10] X. Liu and N. Jiang, "Robust stability analysis of generalized neural networks with multiple discrete delays and multiple distributed delays," Neurocomputing, vol. 72, no. 7–9, pp. 1789–1796, 2009. doi:10.1016/j.neucom.2008.06.005
[11] S. Mohamad, K. Gopalsamy, and H. Akça, "Exponential stability of artificial neural networks with distributed delays and large impulses," Nonlinear Analysis: Real World Applications, vol. 9, no. 3, pp. 872–888, 2008. doi:10.1016/j.nonrwa.2007.01.011
[12] J. Park, "On global stability criterion for neural networks with discrete and distributed delays," Chaos, Solitons & Fractals, vol. 30, no. 4, pp. 897–902, 2006. doi:10.1016/j.chaos.2005.08.147
[13] J. Park, "On global stability criterion of neural networks with continuously distributed delays," Chaos, Solitons & Fractals, vol. 37, no. 2, pp. 444–449, 2008. doi:10.1016/j.chaos.2006.09.021
[14] Z. Qiang, M. A. Run-Nian, and X. Jin, "Global exponential convergence analysis of Hopfield neural networks with continuously distributed delays," Communications in Theoretical Physics, vol. 39, no. 3, pp. 381–384, 2003. doi:10.1088/0253-6102/39/3/381
[15] Y. Wang, W. Xiong, Q. Zhou, B. Xiao, and Y. Yu, "Global exponential stability of cellular neural networks with continuously distributed delays and impulses," Physics Letters A, vol. 350, no. 1–2, pp. 89–95, 2006. doi:10.1016/j.physleta.2005.10.084
[16] H. Wu, "Global exponential stability of Hopfield neural networks with delays and inverse Lipschitz neuron activations," Nonlinear Analysis: Real World Applications, vol. 10, no. 4, pp. 2297–2306, 2009. doi:10.1016/j.nonrwa.2008.04.016
[17] Q. Zhang, X. P. Wei, and J. Xu, "Global exponential stability of Hopfield neural networks with continuously distributed delays," Physics Letters A, vol. 315, no. 6, pp. 431–436, 2003. doi:10.1016/S0375-9601(03)01106-X
[18] H. Zhao, "Global asymptotic stability of Hopfield neural network involving distributed delays," Neural Networks, vol. 17, no. 1, pp. 47–53, 2004. doi:10.1016/S0893-6080(03)00077-7
[19] J. Zhou, S. Li, and Z. Yang, "Global exponential stability of Hopfield neural networks with distributed delays," Applied Mathematical Modelling, vol. 33, no. 3, pp. 1513–1520, 2009. doi:10.1016/j.apm.2008.02.006
[20] R. Gavaldà and H. T. Siegelmann, "Discontinuities in recurrent neural networks," Neural Computation, vol. 11, no. 3, pp. 715–745, 1999. doi:10.1162/089976699300016638
[21] H. Wu, F. Tao, L. Qin, R. Shi, and L. He, "Robust exponential stability for interval neural networks with delays and non-Lipschitz activation functions," Nonlinear Dynamics, vol. 66, no. 4, pp. 479–487, 2011. doi:10.1007/s11071-010-9926-9
[22] H. Wu and X. Xue, "Stability analysis for neural networks with inverse Lipschitzian neuron activations and impulses," Applied Mathematical Modelling, vol. 32, no. 11, pp. 2347–2359, 2008. doi:10.1016/j.apm.2007.09.002
[23] M. Forti, M. Grazzini, P. Nistri, and L. Pancioni, "Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations," Physica D, vol. 214, no. 1, pp. 88–99, 2006. doi:10.1016/j.physd.2005.12.006
[24] N.-E. Tatar, "Hopfield neural networks with unbounded monotone activation functions," vol. 2012, Article ID 571358, 2012. doi:10.1155/2012/571358
[25] N.-E. Tatar, "Control of systems with Hölder continuous functions in the distributed delays," vol. 30, no. 1, pp. 123–128, 2014.
[26] G. Bao and Z. Zeng, "Analysis and design of associative memories based on recurrent neural network with discontinuous activation functions," Neurocomputing, vol. 77, no. 1, pp. 101–107, 2012. doi:10.1016/j.neucom.2011.08.026
[27] M. Forti and P. Nistri, "Global convergence of neural networks with discontinuous neuron activations," IEEE Transactions on Circuits and Systems I, vol. 50, no. 11, pp. 1421–1435, 2003. doi:10.1109/TCSI.2003.818614
[28] Y. Huang, H. Zhang, and Z. Wang, "Dynamical stability analysis of multiple equilibrium points in time-varying delayed recurrent neural networks with discontinuous activation functions," Neurocomputing, vol. 91, pp. 21–28, 2012. doi:10.1016/j.neucom.2012.02.016
[29] L. Li and L. Huang, "Dynamical behaviors of a class of recurrent neural networks with discontinuous neuron activations," Applied Mathematical Modelling, vol. 33, no. 12, pp. 4326–4336, 2009. doi:10.1016/j.apm.2009.03.014
[30] L. Li and L. Huang, "Global asymptotic stability of delayed neural networks with discontinuous neuron activations," Neurocomputing, vol. 72, no. 16–18, pp. 3726–3733, 2009. doi:10.1016/j.neucom.2009.05.016
[31] Y. Li and H. Wu, "Global stability analysis for periodic solution in discontinuous neural networks with nonlinear growth activations," vol. 2009, Article ID 798685, 2009. doi:10.1155/2009/798685
[32] X. Liu and J. Cao, "Robust state estimation for neural networks with discontinuous activations," IEEE Transactions on Systems, Man, and Cybernetics, Part B, vol. 40, no. 6, pp. 1425–1437, 2010. doi:10.1109/TSMCB.2009.2039478
[33] J. Liu, X. Liu, and W.-C. Xie, "Global convergence of neural networks with mixed time-varying delays and discontinuous neuron activations," Information Sciences, vol. 183, pp. 92–105, 2012. doi:10.1016/j.ins.2011.08.021
[34] S. Qin and X. Xue, "Global exponential stability and global convergence in finite time of neural networks with discontinuous activations," Neural Processing Letters, vol. 29, no. 3, pp. 189–204, 2009. doi:10.1007/s11063-009-9103-7
[35] J. Wang, L. Huang, and Z. Guo, "Global asymptotic stability of neural networks with discontinuous activations," Neural Networks, vol. 22, no. 7, pp. 931–937, 2009. doi:10.1016/j.neunet.2009.04.004
[36] Z. Wang, L. Huang, Y. Zuo, and L. Zhang, "Global robust stability of time-delay systems with discontinuous activation functions under polytopic parameter uncertainties," Bulletin of the Korean Mathematical Society, vol. 47, no. 1, pp. 89–102, 2010. doi:10.4134/BKMS.2010.47.1.089
[37] H. Wu, "Global stability analysis of a general class of discontinuous neural networks with linear growth activation functions," Information Sciences, vol. 179, no. 19, pp. 3432–3441, 2009. doi:10.1016/j.ins.2009.06.006
[38] Z. Cai and L. Huang, "Existence and global asymptotic stability of periodic solution for discrete and distributed time-varying delayed neural networks with discontinuous activations," Neurocomputing, vol. 74, no. 17, pp. 3170–3179, 2011. doi:10.1016/j.neucom.2011.04.027
[39] D. Papini and V. Taddei, "Global exponential stability of the periodic solution of a delayed neural network with discontinuous activations," Physics Letters A, vol. 343, no. 1–3, pp. 117–128, 2005. doi:10.1016/j.physleta.2005.06.015
[40] M. Pinto, "Integral inequalities of Bihari-type and applications," Funkcialaj Ekvacioj, vol. 33, no. 3, pp. 387–403, 1990.
[41] V. Lakshmikantham and S. Leela, Differential and Integral Inequalities: Theory and Applications, Vol. I, Mathematics in Science and Engineering, vol. 55-I (edited by R. Bellman), Academic Press, New York, NY, USA, 1969.