Analysis on Passivity for Uncertain Neural Networks with Time-Varying Delays

1 School of Electrical Engineering, Chungbuk National University, 52 Naesudong-Ro, Cheongju 361-763, Republic of Korea
2 Nonlinear Dynamics Group, Department of Electrical Engineering, Yeungnam University, 280 Daehak-Ro, Kyongsan 712-749, Republic of Korea
3 School of Electronics Engineering, Daegu University, Gyungsan 712-714, Republic of Korea
4 Department of Biomedical Engineering, School of Medicine, Chungbuk National University, 52 Naesudong-Ro, Cheongju 361-763, Republic of Korea


Introduction
Neural networks are networks of interconnected elements that behave like biological neurons and can be described mathematically by difference or differential equations. For this reason, over the past few decades, neural networks have been extensively applied in many areas such as moving-image reconstruction, signal processing, pattern recognition, associative memories, and fixed-point computations [1][2][3][4][5][6][7][8][9][10]. Moreover, stability analysis of the concerned neural networks is an important prerequisite, because the applications of neural networks depend heavily on the dynamic behavior of their equilibrium points.
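Concretely, the delayed neural-network models studied in this literature are differential equations of the form ẋ(t) = -Ax(t) + W_0 f(x(t)) + W_1 f(x(t - h(t))) + u(t), with f a bounded activation function. The following minimal forward-Euler simulation is only an illustration (the matrices and input below are hypothetical, not those of this paper's examples):

```python
import numpy as np

# Illustrative two-neuron delayed network (matrices are hypothetical):
#   x'(t) = -A x(t) + W0 f(x(t)) + W1 f(x(t - h)) + u(t),  f = tanh
A = np.diag([2.0, 2.0])
W0 = np.array([[1.0, -0.5], [0.3, 0.8]])
W1 = np.array([[-0.4, 0.2], [0.1, -0.6]])
h, dt, T = 0.5, 0.001, 10.0                 # constant delay for simplicity
steps, delay_steps = int(T / dt), int(h / dt)

x = np.zeros(2)
hist = [np.zeros(2)] * (delay_steps + 1)    # zero initial history on [-h, 0]
for k in range(steps):
    x_del = hist[0]                         # approximation of x(t - h)
    u = np.array([0.1, 0.0]) if k * dt < 1.0 else np.zeros(2)  # brief input pulse
    dx = -A @ x + W0 @ np.tanh(x) + W1 @ np.tanh(x_del) + u
    x = x + dt * dx
    hist.pop(0)
    hist.append(x.copy())

print(x)  # state long after the input pulse has ended
```

With this (stable) choice of matrices, the state decays back toward the origin after the pulse, which is the kind of equilibrium behavior that the passivity and stability criteria below certify analytically.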
On the other hand, time delays and passivity deserve close attention. It is well known that time delay is a natural concomitant of the finite speed of information processing in the implementation of networks and often causes undesirable dynamic behaviors such as oscillation and instability. In various scientific and engineering problems, stability issues are often linked to the theory of dissipative systems, which postulates that the energy dissipated inside a dynamic system is less than the energy supplied from external sources [11]. Based on this notion of energy, passivity is a property of dynamical systems that describes the energy flow through a system. It is also an input/output characterization and is related to the Lyapunov method. In the field of nonlinear control, the concept of dissipativeness was first introduced by Willems [12] in the form of an inequality involving a supply rate and a storage function. The main idea of passivity theory is that the passive properties of a system can keep the system internally stable. Therefore, passivity analysis for uncertain neural networks with time delay has been widely investigated in [13][14][15][16][17][18][19], since parametric uncertainties, which sometimes affect the stability of systems, are also undesirable in the hardware implementation of neural networks: the connection weights of the neurons depend on the values of certain resistances and capacitances, which are subject to fluctuations [20]. In [16], two types of time-varying delays were considered in the passivity analysis of uncertain neural networks. Recently, in [17], by considering some useful terms that were ignored in previous literature and utilizing free-weighting matrix techniques, an enlarged feasible region of passivity criteria was obtained. In [18], by proposing a complete delay-decomposing approach and utilizing a segmentation technique,
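For reference, Willems' dissipation inequality takes the following standard form: a system with input u and output y is dissipative with respect to a supply rate s(u, y) if there exists a nonnegative storage function V such that

```latex
V\bigl(x(t_1)\bigr) - V\bigl(x(t_0)\bigr) \;\le\; \int_{t_0}^{t_1} s\bigl(u(\tau), y(\tau)\bigr)\, d\tau,
\qquad t_1 \ge t_0,
```

and passivity corresponds to the particular supply rate s(u, y) = u^T y.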
improved conditions for passivity of neural networks were presented. In the authors' previous work [19], some less conservative conditions for passivity of neural networks were derived by taking more information of the states into account. All the works [13][14][15][16][17][18][19] demonstrate the advantages of their proposed methods by comparing maximum delay bounds with previous works, since the delay bounds guaranteeing passivity of the concerned networks are recognized as one of the most important indices for checking the conservatism of criteria. Very recently, one of the most remarkable methods for reducing the conservatism of stability criteria has been the Wirtinger-based integral inequality [21], which effectively reduces Jensen's gap. Therefore, there is room for further improvement in the passivity analysis of neural networks with both time delay and parameter uncertainties.
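In its standard form, the Wirtinger-based integral inequality of [21] states that, for any matrix R > 0 and a differentiable function x on [a, b],

```latex
\int_a^b \dot{x}^{T}(s)\, R\, \dot{x}(s)\, ds
\;\ge\;
\frac{1}{b-a}\left( \omega_0^{T} R\, \omega_0 + 3\, \omega_1^{T} R\, \omega_1 \right),
\qquad
\begin{aligned}
\omega_0 &= x(b) - x(a),\\
\omega_1 &= x(b) + x(a) - \frac{2}{b-a}\int_a^b x(s)\, ds.
\end{aligned}
```

Jensen's inequality retains only the first term on the right-hand side, so the Wirtinger-based bound is never weaker; this is the sense in which it reduces Jensen's gap.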
With the above motivation, this paper addresses the problem of passivity for uncertain neural networks with time-varying delays. In Theorem 6, by utilizing the Wirtinger-based integral inequality [21], a passivity condition for neural networks with time-varying delays and parameter uncertainties is introduced within the framework of linear matrix inequalities (LMIs). Building on Theorem 6, a newly constructed Lyapunov-Krasovskii functional is introduced and further improved results are derived in Theorem 7. Inspired by the work of [22, 23], the reciprocally convex approach and some zero equalities are utilized in Theorems 6 and 7. Finally, two numerical examples show that Theorems 6 and 7 yield less conservative results.
Notation. Throughout this paper, the notation is standard. R^n is the n-dimensional Euclidean vector space, and R^{n×m} denotes the set of all n × m real matrices. For symmetric matrices X and Y, X > Y means that the matrix X - Y is positive definite, whereas X ≥ Y means that X - Y is nonnegative. I_n, 0_n, and 0_{n×m} denote the n × n identity matrix and the n × n and n × m zero matrices, respectively. diag{⋅⋅⋅} denotes the block diagonal matrix. For a square matrix X, sym{X} means the sum of X and its transpose X^T; that is, sym{X} = X + X^T.
It is assumed that the neuron activation functions satisfy the following condition.
Assumption 1. The neuron activation functions f_i(⋅) (i = 1, . . ., n) are continuous and bounded and satisfy the sector condition (5), where k_i^+ and k_i^- are constants. From (5), if v = 0, then we obtain (6). Also, conditions (5) and (6) each admit an equivalent form. The system (1) can then be rewritten as (9). The objective of this paper is to investigate delay-dependent passivity conditions for system (9). Before deriving the main results, the following definition and lemmas are introduced.
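The sector condition referred to in Assumption 1 is, in the standard form used throughout this literature (the constants below correspond to the diagonal entries of the matrices K^- and K^+ appearing in Theorem 7),

```latex
k_i^{-} \;\le\; \frac{f_i(u) - f_i(v)}{u - v} \;\le\; k_i^{+},
\qquad \forall\, u, v \in \mathbb{R},\; u \ne v,\; i = 1, \ldots, n.
```

Setting v = 0 (with f_i(0) = 0) yields the pointwise bound k_i^- ≤ f_i(u)/u ≤ k_i^+.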
Definition 2. The system (1) is called passive if there exists a scalar γ ≥ 0 such that the defining energy inequality holds for all t_p ≥ 0 and for all solutions of (1) with x(0) = 0.
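In this line of work, the defining inequality of passivity usually takes the following standard form, with y the output and u the input:

```latex
2 \int_{0}^{t_p} y^{T}(s)\, u(s)\, ds \;\ge\; -\gamma \int_{0}^{t_p} u^{T}(s)\, u(s)\, ds,
\qquad \forall\, t_p \ge 0.
```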

Main Results
In this section, new passivity criteria for the system (9) will be proposed in Theorems 6 and 7.
Proof. Let us consider the Lyapunov-Krasovskii functional candidate defined by the terms V_1, . . ., V_7 below. The time derivatives of V_1, V_2, and V_3 can be calculated directly. By the use of Lemma 3, V̇_4 is bounded as shown. Furthermore, if Ψ_2 > 0, then applying Lemma 4 to (22) leads to the stated bound, where 1/α(t) = h_M/h(t) and S_1 is any 2n × 2n matrix. By the use of Lemma 3 and Jensen's inequality [25], if Ψ_3 > 0, then V̇_5 can be bounded as shown, where S_2 is any n × n matrix.
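As an aside, the relative tightness of the two integral bounds used in this proof (Jensen's inequality and the Wirtinger-based inequality, taken here in its standard form) can be checked numerically on a sample scalar trajectory:

```python
import numpy as np

# Compare Jensen's bound and the Wirtinger-based bound against
# J = (b - a) * \int_a^b xdot(s) R xdot(s) ds  for a sample scalar trajectory.
def integ(f, s):
    # plain trapezoidal rule (avoids version-specific numpy helpers)
    return float(np.sum((f[1:] + f[:-1]) * np.diff(s)) / 2.0)

a, b, N = 0.0, 1.0, 100000
s = np.linspace(a, b, N)
x = np.sin(3 * s) + 0.5 * s**2            # arbitrary smooth trajectory
xdot = np.gradient(x, s)
R = 2.0                                    # any R > 0

J = (b - a) * integ(xdot * R * xdot, s)    # exact (scaled) integral
w0 = x[-1] - x[0]
w1 = x[-1] + x[0] - (2.0 / (b - a)) * integ(x, s)
jensen = w0 * R * w0                       # Jensen's lower bound
wirtinger = jensen + 3.0 * w1 * R * w1     # Wirtinger-based lower bound

print(jensen <= wirtinger <= J)            # both are valid; Wirtinger is tighter
```

Both quantities lower-bound the integral, and the Wirtinger-based bound dominates Jensen's by the extra nonnegative term 3ω_1 R ω_1, which is exactly the mechanism behind the reduced conservatism.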
An upper bound of V̇_6 can be obtained with Jensen's inequality [25]. Before estimating V̇_7, inspired by the work of [23], the following zero equalities with two arbitrary symmetric matrices are considered as a tool for reducing the conservatism of the criterion. Adding (27) to V̇_7 then yields (28). The bound of V̇_7 presented in (28) is valid when Ψ_i > 0 (i = 4, 5) hold.
Next, an improved passivity criterion for the system (9) is derived in Theorem 7 by utilizing a modified V_3. For simplicity, the notations of several matrices are defined as follows; these and the earlier notations are used in Theorem 7.
Theorem 7. For given positive scalars h_M and h_D and diagonal matrices K^- = diag{k_1^-, . . ., k_n^-} and K^+ = diag{k_1^+, . . ., k_n^+}, the system (9) is passive for 0 ≤ h(t) ≤ h_M and ḣ(t) ≤ h_D if there exist positive scalars, positive diagonal matrices, symmetric matrices in R^{n×n}, and matrices S_1 ∈ R^{2n×2n} and S_2 ∈ R^{n×n} satisfying the LMIs (16) and the additional LMIs stated below.

Proof. By choosing the modified V_3, a new Lyapunov-Krasovskii functional is obtained, and its new upper bound can be calculated as shown, where the remaining integral terms were estimated by using Jensen's inequality. In the authors' future work, further improved stability or passivity criteria for neural networks with time-varying delays will be proposed by utilizing Lemma 3 to estimate these other integral terms.

Remark 9. Unlike Theorem 6, by utilizing the modified V_3 as one of the terms of the Lyapunov-Krasovskii functional, some new cross terms such as
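Conditions of this type are verified in practice with a semidefinite-programming solver. As a minimal illustration of what "satisfying an LMI" means, the following sketch checks definiteness for fixed, hypothetical matrices; a real implementation would let the solver search over the decision variables rather than fixing them by hand:

```python
import numpy as np

# Hypothetical illustration: for FIXED candidate decision variables,
# checking an LMI reduces to eigenvalue tests.  In practice an SDP solver
# (e.g. through CVXPY or YALMIP) searches over P, Q, ... itself.
A = np.array([[-3.0, 0.5], [0.2, -2.5]])   # illustrative system matrix
P = np.eye(2)                               # candidate Lyapunov matrix
Q = 0.5 * np.eye(2)

def is_pos_def(M, tol=1e-9):
    M = 0.5 * (M + M.T)                     # symmetrize before the eigenvalue test
    return bool(np.all(np.linalg.eigvalsh(M) > tol))

# A classical Lyapunov-type LMI:  require  A^T P + P A + Q < 0.
lmi = -(A.T @ P + P @ A + Q)
print(is_pos_def(P), is_pos_def(lmi))
```

The theorem's LMIs play the same role: feasibility of the matrix inequalities, certified by the solver, guarantees passivity for all delays up to h_M.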

∫ x(s) ds
are included, which may reduce the conservatism of the passivity criterion of Theorem 6. In the next section, the effectiveness of the proposed Lyapunov-Krasovskii functional is shown by comparing the maximum delay bounds that guarantee the passivity of the numerical examples.
Remark 10. When information on ḣ(t) is unavailable, Theorems 6 and 7 can still provide passivity criteria for the system (9) by setting Q_2 = 0.

Numerical Examples
The maximum delay bounds guaranteeing the passivity of the above neural networks for different h_D, obtained by Theorems 6 and 7, are listed in Table 1.
One can see that, for this example, Theorem 6 gives larger maximum delay bounds than those of [13][14][15][19]. This indicates that the presented sufficient conditions relieve the conservatism caused by time delay and parameter uncertainties. Furthermore, Theorem 7 provides larger delay bounds than Theorem 6, which means that the newly constructed Lyapunov-Krasovskii functional plays an important role in reducing the conservatism of Theorem 6. In Table 2, the maximum allowable delay bounds for guaranteeing passivity are compared with existing works. From Table 2, it can be seen that the maximum delay bounds guaranteeing the passivity of the above neural networks are significantly larger than those of [16][17][18].
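Maximum-delay-bound tables such as Tables 1 and 2 are typically produced by bisection on h_M, invoking the LMI solver at each trial value. The following sketch uses a stand-in feasibility oracle; the toy threshold is illustrative, not a result of this paper, and a real implementation would replace `feasible` with a call to the solver on the LMIs of Theorem 6 or 7:

```python
# Sketch of the standard bisection used to compute maximum delay bounds.
def feasible(h, h_star=2.5):
    # Stand-in oracle: pretend the LMIs are feasible up to the toy value h_star.
    # In practice: solve the theorem's LMIs for trial bound h, return solver status.
    return h <= h_star

def max_delay_bound(lo=0.0, hi=100.0, tol=1e-4):
    # Bisect on the largest h for which the LMIs remain feasible.
    assert feasible(lo) and not feasible(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if feasible(mid):
            lo = mid
        else:
            hi = mid
    return lo

print(max_delay_bound())  # converges to the toy threshold h_star
```

Bisection is valid here because LMI feasibility for this class of criteria is monotone in h_M: if the conditions hold for some delay bound, they hold for every smaller one.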

Conclusions
In this paper, two passivity criteria for neural networks with time-varying delays and parameter uncertainties have been proposed by the use of the Lyapunov method and the LMI framework. In Theorem 6, by constructing a suitable Lyapunov-Krasovskii functional and utilizing the Wirtinger-based inequality, a sufficient condition for passivity of the concerned networks was derived. Based on the result of Theorem 6, an improved criterion was proposed in Theorem 7 by introducing a newly augmented Lyapunov-Krasovskii functional. Via two numerical examples drawn from previous works, the improvements of the proposed passivity criteria have been verified.
Based on the proposed methods, future works will focus on solving various problems such as state estimation [28, 29], passivity analysis for neural networks [30], stabilization of BAM neural networks [31], synchronization of complex networks [32], and stability analysis and filtering for dynamic systems with time delays [33][34][35][36][37]. Moreover, in [38], triple-integral forms of the Lyapunov-Krasovskii functional were proposed to reduce the conservatism of sufficient stability conditions, and their effectiveness was shown. Thus, by grafting that approach onto the idea proposed in this paper, further improved results will be investigated in the near future.
*  is a delay-partitioning number.