Robust Stability Analysis of Fractional-Order Hopfield Neural Networks with Parameter Uncertainties

The issue of robust stability for fractional-order Hopfield neural networks with parameter uncertainties is investigated in this paper. For such a neural system, the existence, uniqueness, and global Mittag-Leffler stability of the equilibrium point are analyzed by employing suitable Lyapunov functionals. Based on the fractional-order Lyapunov direct method, sufficient conditions are proposed for the robust stability of the studied networks. Moreover, robust synchronization and quasi-synchronization between such neural networks are discussed. Furthermore, numerical examples are given to show the effectiveness of the obtained theoretical results.


Introduction
During the last few decades, neural networks have received increasing attention in various fields, such as image and signal processing, associative memory, pattern recognition, optimization, control, and modelling. The Hopfield neural model is one of the most popular neural models in the literature. For integer-order Hopfield neural networks, it is therefore important to study their dynamical properties; in particular, stability analysis is a prime issue for the practical design and application of neural networks. In recent years, some sufficient conditions for global asymptotic and exponential stability of integer-order Hopfield neural networks have been proposed [1][2][3][4]. However, these studies did not consider the influence of parameter uncertainties, which are unavoidable due to measurement errors, parameter fluctuation, external disturbance, and so forth. For example, in the electronic implementation of neural networks, essential parameters such as the release rate of neurons, the connection weights between the neurons, and the transmission delays might be subject to deviations owing to the tolerances of electronic components. Therefore, it is necessary to ensure that the system remains stable with respect to these uncertainties in the design and application of neural networks. In other words, the designed neural network must be robust against such uncertainties. Many researchers have studied the existence, uniqueness, and globally robust asymptotic stability of the equilibrium point in integer-order neural networks with parameter uncertainties and given robust stability conditions [5][6][7][8][9][10][11][12]. To establish the robust stability criteria, Lyapunov functionals and linear matrix inequalities (LMIs) are two effective tools, utilized in [5][6][7][8] and [9][10][11][12], respectively.
It is well known that chaos synchronization has attracted considerable attention due to its great potential applications in secure communication, biological science, engineering, and so on. It is worth noting that integer-order neural networks can exhibit complicated dynamics and even chaotic behavior if the parameters are appropriately chosen [13,14]. Thus, synchronization of integer-order chaotic neural networks has been a much-discussed topic, and the robust synchronization of integer-order chaotic neural networks with parameter uncertainties has also attracted considerable interest [15,16]. As with the robust stability of integer-order neural networks, Lyapunov functionals and LMIs are the two commonly used tools for analyzing robust synchronization between two integer-order chaotic neural networks with parameter uncertainties. However, when the parameter uncertainties of the two networks are different, parameter mismatches appear unavoidably. In this case, the zero equilibrium point of the error system may not exist, and it is impossible to achieve complete synchronization. Many researchers found, however, that the synchronization error converges to a small region around zero, a phenomenon called quasi-synchronization or weak synchronization, which has gained much research attention [17][18][19].
All the above studies investigated only integer-order neural networks. In fact, as a generalization of integer-order differentiation and integration to arbitrary noninteger order, fractional-order calculus appears in both theoretical and applied aspects of numerous branches of science and engineering. In particular, fractional derivatives describe some systems more accurately than integer-order models, such as viscoelastic materials, electrochemical processes, long lines, dielectric polarisations, coloured noise, and cardiac behaviour [20][21][22][23]. Therefore, scholars have shown growing interest in building fractional-order Hopfield neural network models and studying their characteristics, which exhibit higher nonlinearity and more degrees of freedom than integer-order models [24,25]. For the stability analysis of fractional-order neural networks, the linear stability theory of fractional-order systems is the common method [26]. Moreover, in [26], Kaslik and Sivasundaram discussed the bifurcations and chaos of nonlinear fractional-order Hopfield neural networks, which is very meaningful for synchronization between fractional-order Hopfield neural networks. Sabatier et al. [27] proposed LMI stability conditions for fractional-order systems, which provide an approach to solving the robust stability and synchronization of fractional-order systems with parameter uncertainties [28][29][30][31]. Liao et al. [28] studied robust stability for fractional-order linear time-invariant (FO-LTI) interval systems with parameter uncertainties. Wong et al. [31] investigated robust synchronization of fractional-order complex dynamical networks with parameter uncertainties. To the best of our knowledge, however, no research has discussed robust stability and synchronization for fractional-order Hopfield neural networks with parameter uncertainties.
Besides, the above results are all in the form of LMIs, and no one has employed the Lyapunov functional method, which requires less computation than the LMI approach. Therefore, we try to address these problems and employ Lyapunov functionals to analyze the robust stability and synchronization of fractional-order Hopfield neural networks with parameter uncertainties.
It must be pointed out that the Lyapunov stability theorem for integer-order systems could not be applied to fractional-order systems until the fractional-order Lyapunov direct method was presented in [32]. The (generalized) Mittag-Leffler stability of a fractional-order system can be proved if the conditions of the fractional-order Lyapunov direct method are satisfied. Hence, based on the fractional-order Lyapunov direct method, robust stability and synchronization of fractional-order systems can be analyzed by employing suitable Lyapunov functionals. Although, for a multidimensional nonlinear fractional-order system, it is difficult to design a Lyapunov functional satisfying the conditions of the fractional-order Lyapunov direct method, one is chosen and its suitability is proved in this paper.
Motivated by the above discussion, robust stability and synchronization for fractional-order Hopfield neural networks with parameter uncertainties are studied. First, the globally robust stability of such a neural system is analyzed via a Lyapunov functional, which requires less computation than the LMI approach of earlier studies [28][29][30][31]. Besides, the Lyapunov functional is also employed to realize robust synchronization between two such neural systems with the same parameter uncertainties. Moreover, the quasi-synchronization of this class of neural networks with different parameter uncertainties is investigated via a special Lyapunov functional, which is much simpler than the Laplace transform method used in [33]. In addition, numerical simulations are presented to show their good agreement with the theoretical results.
The rest of the paper is organized as follows. In Section 2, some preliminaries are given, including the Caputo fractional-order derivative, the Mittag-Leffler function, and the Mittag-Leffler stability of fractional-order systems. Robust stability for fractional-order Hopfield neural networks with parameter uncertainties is established in Section 3. Robust synchronization and quasi-synchronization for fractional-order Hopfield neural networks with parameter uncertainties are presented in Section 4. Examples are given in Section 5 to verify the theoretical results. Finally, the paper is concluded in Section 6.

Caputo Fractional-Order Derivative.
As an important tool in nonlinear science, fractional-order calculus has three common definitions, namely, the Grünwald-Letnikov, Riemann-Liouville, and Caputo definitions [34]. In particular, the Caputo fractional-order derivative takes the same form of initial conditions as integer-order derivatives, which is well understood in physical situations and more applicable to real-world problems. Thus, we use the Caputo fractional-order derivative in this paper.
Definition 1 (Caputo fractional-order derivative). The Caputo fractional-order derivative of order $\alpha > 0$ for a function $f(t) \in C^{n+1}([t_0, +\infty), \mathbb{R})$ is defined as
$${}_{t_0}^{C}D_t^{\alpha} f(t) = \frac{1}{\Gamma(n-\alpha)} \int_{t_0}^{t} \frac{f^{(n)}(\tau)}{(t-\tau)^{\alpha-n+1}} \, d\tau,$$
where $\Gamma(\cdot)$ denotes the Gamma function and $n$ is a positive integer such that $n-1 < \alpha \le n$. The Laplace transform of the Caputo fractional-order derivative is
$$\mathcal{L}\left\{{}_{0}^{C}D_t^{\alpha} f(t)\right\} = s^{\alpha} F(s) - \sum_{k=0}^{n-1} s^{\alpha-k-1} f^{(k)}(0),$$
where $\mathcal{L}\{\cdot\}$ denotes the Laplace transform and $s$ is the variable in the Laplace domain.
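As a hedged illustration of this definition, the Caputo derivative of order $0 < \alpha < 1$ can be approximated numerically with the standard L1 finite-difference scheme; the function name and step size below are our illustrative choices, not part of the paper.

```python
import math

def caputo_l1(f_vals, alpha, h):
    """Approximate the Caputo derivative of order alpha (0 < alpha < 1)
    at the final grid point, using the L1 finite-difference scheme.
    f_vals: samples f(0), f(h), ..., f(n*h) on a uniform grid of step h."""
    n = len(f_vals) - 1
    coef = h ** (-alpha) / math.gamma(2 - alpha)
    total = 0.0
    for k in range(n):
        # L1 weights b_k = (k+1)^{1-alpha} - k^{1-alpha} applied to
        # backward differences of f, newest difference first
        b_k = (k + 1) ** (1 - alpha) - k ** (1 - alpha)
        total += b_k * (f_vals[n - k] - f_vals[n - k - 1])
    return coef * total
```

A convenient sanity check: for $f(t) = t$ the scheme telescopes and reproduces the exact Caputo derivative $t^{1-\alpha}/\Gamma(2-\alpha)$ up to rounding.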

Mittag-Leffler Function.
The Mittag-Leffler function frequently appears in the solutions of fractional-order differential equations, playing a role similar to that of the exponential function in the solutions of integer-order differential equations.
Definition 2 (see [34]). The Mittag-Leffler function with two parameters is defined as
$$E_{\alpha,\beta}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(k\alpha + \beta)},$$
where $\alpha > 0$, $\beta > 0$, and $z \in \mathbb{C}$. When $\beta = 1$, its one-parameter form is written $E_{\alpha}(z)$. In particular, $E_{1,1}(z) = e^{z}$. The Laplace transform of the Mittag-Leffler function with two parameters is
$$\mathcal{L}\left\{t^{\beta-1} E_{\alpha,\beta}(-\lambda t^{\alpha})\right\} = \frac{s^{\alpha-\beta}}{s^{\alpha} + \lambda}, \quad \operatorname{Re}(s) > |\lambda|^{1/\alpha},$$
where $t \ge 0$, $s$ is the variable in the Laplace domain, and $\operatorname{Re}(s)$ is the real part of $s$, $\lambda \in \mathbb{R}$.
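The series definition lends itself to direct numerical evaluation for moderate arguments; a minimal sketch follows (the truncation length is our illustrative choice, and the plain series is numerically unreliable for large negative arguments).

```python
import math

def mittag_leffler(alpha, beta, z, terms=50):
    """Two-parameter Mittag-Leffler function E_{alpha,beta}(z), evaluated
    by truncating its power series.  Adequate for moderate |z|; note that
    math.gamma overflows once alpha*terms + beta exceeds roughly 170."""
    return sum(z ** k / math.gamma(alpha * k + beta) for k in range(terms))
```

Quick sanity checks: $E_{1,1}(z)$ recovers $e^{z}$ and $E_{2,1}(-z^{2})$ recovers $\cos z$.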

Definition 3. The constant $x^{*}$ is an equilibrium point of the Caputo fractional-order dynamic system (7) if and only if $f(t, x^{*}) = 0$.
Remark 4. Based on Properties 1 and 2, any equilibrium point can be translated to the origin via a change of variables. When the equilibrium point $x^{*} \neq 0$ of system (7), system (7) can be rewritten in terms of the new variable $e = x - x^{*}$, so that the transformed right-hand side $g$ satisfies $g(t, 0) = 0$ and the new system has its equilibrium point at the origin. Thus, without loss of generality, all definitions and theorems are stated for the case when the equilibrium point is the origin; that is, $x^{*} = 0$.

Lemma 5 (see [35]). There exists a unique solution of system (7) if system (7) has an equilibrium point at the origin and $f(t, x)$ satisfies a locally Lipschitz condition on $x$.

Definition 6 (Mittag-Leffler stability [32]). If $x = 0$ is an equilibrium point of system (7), the solution of (7) is said to be Mittag-Leffler stable if
$$\|x(t)\| \le \left\{ m\left[x(t_0)\right] E_{\alpha}\left(-\lambda (t - t_0)^{\alpha}\right) \right\}^{b},$$
where $t_0$ is the initial time, $\alpha \in (0,1)$, $\lambda > 0$, $b > 0$, $m(0) = 0$, $m(x) \ge 0$, and $m(x)$ satisfies a locally Lipschitz condition.
In order to analyze Mittag-Leffler stability of system (7), the fractional-order Lyapunov direct method is introduced as follows.
Lemma 10 (fractional-order Lyapunov direct method [32]). For $x_0 = 0$, the fractional-order system (7) is Mittag-Leffler stable at the equilibrium point $x = 0$ if there exists a continuously differentiable function $V(t, x(t))$ that satisfies
$$\alpha_{1}\|x\|^{a} \le V(t, x(t)) \le \alpha_{2}\|x\|^{ab}, \quad (12)$$
$${}_{0}^{C}D_{t}^{\beta} V(t, x(t)) \le -\alpha_{3}\|x\|^{ab}, \quad (13)$$
where $V(t, x(t)) : [0, \infty) \times \mathcal{D} \to \mathbb{R}$ satisfies a locally Lipschitz condition on $x$; $\mathcal{D} \subset \mathbb{R}^{n}$ is a domain containing the origin; $t \ge 0$, $\beta \in (0, 1)$, and $\alpha_{1}$, $\alpha_{2}$, $\alpha_{3}$, $a$, $b$ are arbitrary positive constants. If the assumptions hold globally on $\mathbb{R}^{n}$, then $x = 0$ is globally Mittag-Leffler stable.

Remark 11. According to the proof of the fractional-order Lyapunov direct method in [32], the condition in Lemma 10 can be weakened: if the inequality in (13) holds almost everywhere, the result of Lemma 10 still holds.

Lemma 12. If $h(t) \in C^{1}([0, +\infty), \mathbb{R})$ denotes a continuously differentiable function, the following inequality holds almost everywhere:
$${}_{0}^{C}D_{t}^{\alpha} |h(t)| \le \operatorname{sgn}(h(t))\, {}_{0}^{C}D_{t}^{\alpha} h(t).$$

Proof. Without loss of generality, the trajectory of $|h(t)|$ can be described as the solid line in Figure 1. $|h(t)|$ is differentiable except at the points $t_{3}$ and $t_{5}$, which are the solutions of $h(t) = 0$. The trajectory is divided into three parts by the points $t_{3}$ and $t_{5}$, and $t_{2}$ and $t_{4}$ are the extreme points. Point $t_{1}$ corresponds to the initial datum $|h(0)|$. The dashed line is the trajectory of $-|h(t)|$, which may coincide with that of $h(t)$ on any part. $(t_{i}, h_{i})$ denotes the coordinates of a point in Figure 1.

Robust Stability for Fractional-Order Hopfield Neural Networks with Parameter Uncertainties
Consider the following $n$-dimensional Caputo fractional-order Hopfield neural network:
$${}_{0}^{C}D_{t}^{\alpha} x_{i}(t) = -\left(c_{i} + \Delta c_{i}(t)\right) x_{i}(t) + \sum_{j=1}^{n} \left(a_{ij} + \Delta a_{ij}(t)\right) f_{j}\left(x_{j}(t)\right) + I_{i}, \quad i = 1, 2, \ldots, n, \quad (18)$$
where $\alpha \in (0,1)$ is the fractional order, $x_{i}(t)$ is the state of the $i$th neuron, $c_{i} > 0$ is the self-regulating parameter, $a_{ij}$ is the connection weight, $f_{j}(\cdot)$ is the activation function, $I_{i}$ is the external input, and $\Delta c_{i}(t)$ and $\Delta a_{ij}(t)$ are the time-varying parameter uncertainties. In the rest of this paper, $\|z\|$ denotes the 1-norm of the corresponding vector or matrix $z$. For $u, v \in \mathbb{R}$, $u > (\ge)\, v$ means $u - v > (\ge)\, 0$. Based on the above formulations, the concept of robust stability for fractional-order Hopfield neural networks with parameter uncertainties is introduced by the following definition.
Definition 13. The fractional-order Hopfield neural networks (18) are globally asymptotically robust stable if the unique equilibrium point of the neural system is globally asymptotically stable under the time-varying parameter uncertainties.
In order to obtain the globally asymptotically robust stability of system (18), three assumptions are given as follows. (H2) The activation functions $f_{j}$ are continuous and satisfy a Lipschitz condition on $\mathbb{R}$ with Lipschitz constants $L_{j} > 0$; that is,
$$\left|f_{j}(u) - f_{j}(v)\right| \le L_{j} |u - v|$$
for all $u, v \in \mathbb{R}$ and $j = 1, 2, \ldots, n$, which ensures the existence and uniqueness of the solutions of system (18) based on Lemma 5.
Secondly, let us prove that system (18) is globally asymptotically robust stable. Let $x(t)$ and $y(t)$ be any two solutions of system (37), and construct the Lyapunov functional (30). It is obvious that the Lyapunov functional (30) satisfies the condition given by the inequality in (12). We then prove that the Lyapunov functional (30) also satisfies the condition given by the inequality in (13). Thus, based on Lemma 10, the error system (29) is globally Mittag-Leffler stable. Since $x^{*}$ is the unique equilibrium of system (37), every solution of system (37) converges to $x^{*}$. Therefore, the unique equilibrium point of system (18) is globally Mittag-Leffler stable, and the fractional-order Hopfield neural networks (18) with parameter uncertainties are globally asymptotically robust stable.
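Sufficient conditions of this kind typically take a diagonal-dominance form: each neuron's worst-case self-decay rate must dominate the Lipschitz-weighted, uncertainty-inflated incoming connection strengths. The following sketch checks such a condition numerically; the exact inequality, the bound names, and the parameter values are our assumptions modelled on standard 1-norm criteria, not the paper's stated theorem.

```python
def robust_stability_check(c_low, A_abs_high, L):
    """Check a diagonal-dominance-type sufficient condition for robust
    Mittag-Leffler stability (an assumed 1-norm-style criterion).
    c_low[i]        : lower bound of c_i + Delta c_i(t)
    A_abs_high[j][i]: upper bound of |a_{ji} + Delta a_{ji}(t)|
    L[i]            : Lipschitz constant of activation f_i
    Returns (all margins positive?, list of per-neuron margins)."""
    n = len(c_low)
    margins = []
    for i in range(n):
        # worst-case incoming gain through neuron i's activation
        gain = sum(A_abs_high[j][i] * L[i] for j in range(n))
        margins.append(c_low[i] - gain)
    return all(m > 0 for m in margins), margins
```

For example, with `c_low = [2.0, 2.0]`, `A_abs_high = [[0.5, 0.3], [0.2, 0.4]]`, and `L = [1.0, 1.0]`, every margin is positive, so this illustrative criterion is satisfied.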
Remark 15. In Theorem 14, the robust stability of system (18) is deduced from its Mittag-Leffler stability. Based on the fractional-order Lyapunov direct method, the global Mittag-Leffler stability of system (18) can be guaranteed by employing Lyapunov functional techniques. This method, which is very convenient to implement in practice, can be applied to almost all neural networks with uniformly Lipschitz activation functions. Moreover, it should be noted that the convergence rate is affected by the rate parameter in the Mittag-Leffler stability bound: the larger that parameter, the faster the trajectories converge to the unique equilibrium point.

Synchronization for Fractional-Order Hopfield Neural Networks with Parameter Uncertainties
In this section, synchronization for fractional-order Hopfield neural networks with parameter uncertainties is investigated. For the same or different parameter uncertainties, robust synchronization and quasi-synchronization for fractional-order Hopfield neural networks are studied by employing the chosen Lyapunov functionals.

Remark 21.
If the control parameter is chosen larger, the error bound becomes smaller. Hence the synchronization error bound can be made as small as desired by choosing suitable control parameters, which is very important for chaos synchronization and nonlinear system control.

Numerical Simulations
The effectiveness of the obtained theoretical results is demonstrated by the following examples. The predictor-corrector scheme is used to compute approximate numerical solutions of the fractional-order neural networks.
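The predictor-corrector (Adams-Bashforth-Moulton) scheme of Diethelm, Ford, and Freed can be sketched as follows for a Caputo system of order $0 < \alpha < 1$. The two-neuron right-hand side at the end uses illustrative parameters of our own choosing, not the paper's examples.

```python
import math

def fde_pc(f, y0, alpha, h, steps):
    """Adams-Bashforth-Moulton predictor-corrector (Diethelm-Ford-Freed)
    for the Caputo fractional system D^alpha y = f(t, y), 0 < alpha < 1.
    Returns the list of state vectors on the uniform grid t = 0, h, ..."""
    n_dim = len(y0)
    ys = [list(y0)]
    fs = [f(0.0, y0)]
    g1 = math.gamma(alpha + 1)
    g2 = math.gamma(alpha + 2)
    for n in range(steps):
        t_next = (n + 1) * h
        # predictor: fractional rectangle rule with weights b_{j,n+1}
        pred = [0.0] * n_dim
        for j in range(n + 1):
            b = (n + 1 - j) ** alpha - (n - j) ** alpha
            for d in range(n_dim):
                pred[d] += b * fs[j][d]
        yp = [y0[d] + h ** alpha / g1 * pred[d] for d in range(n_dim)]
        # corrector: product-trapezoidal rule with weights a_{j,n+1}
        corr = [0.0] * n_dim
        for j in range(n + 1):
            if j == 0:
                a = n ** (alpha + 1) - (n - alpha) * (n + 1) ** alpha
            else:
                a = ((n - j + 2) ** (alpha + 1) + (n - j) ** (alpha + 1)
                     - 2 * (n - j + 1) ** (alpha + 1))
            for d in range(n_dim):
                corr[d] += a * fs[j][d]
        fp = f(t_next, yp)
        y_next = [y0[d] + h ** alpha / g2 * (corr[d] + fp[d])
                  for d in range(n_dim)]
        ys.append(y_next)
        fs.append(f(t_next, y_next))
    return ys

# Illustrative two-neuron fractional Hopfield system (parameter values
# are assumptions, not the paper's simulation example):
def hopfield_rhs(t, x):
    c = [1.0, 1.0]
    a = [[2.0, -0.1], [-5.0, 3.0]]
    return [-c[i] * x[i]
            + sum(a[i][j] * math.tanh(x[j]) for j in range(2))
            for i in range(2)]

trajectory = fde_pc(hopfield_rhs, [0.1, -0.1], alpha=0.9, h=0.01, steps=500)
```

A useful sanity check: for the scalar problem $D^{\alpha} y = 1$, $y(0) = 0$, both quadrature rules are exact, so the scheme reproduces the analytic solution $y(t) = t^{\alpha}/\Gamma(\alpha+1)$ up to rounding.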

Remark 22.
In [37], the authors pointed out that the convergence rate increases with the fractional order α. In Figures 3 and 5, convergence with α = 0.9 is faster than with α = 0.6, which coincides with their results.

Conclusion
By employing appropriate Lyapunov functionals, robust stability and synchronization for fractional-order Hopfield neural networks with parameter uncertainties are studied. Based on the fractional-order Lyapunov direct method, a sufficient condition for the existence, uniqueness, and globally robust stability of the equilibrium point is presented. Moreover, a sufficient condition for robust synchronization between such neural systems with the same parameter uncertainties is proposed, based on the robust stability analysis of the synchronization error system. In addition, for different parameter uncertainties, the quasi-synchronization of this class of neural networks is investigated with linear control, and the quasi-synchronization error bound can be made as small as desired by choosing suitable control parameters. The effectiveness of the theoretical results is verified by the numerical simulations.
In future work, we will try to discuss the robust stability and synchronization of fractional-order delayed neural networks (FDNNs) with parameter uncertainties. Furthermore, the issue of robust stability for such systems with stochastic external inputs is novel and interesting, and will be an important topic of our next work.