Generalized persistency of excitation

The paper presents generalized persistency of excitation conditions. They are not only valid for a much broader range of applications than their classical counterparts but also yield an elegant proof of the validity of the latter. The novelty and significance of the approach presented in this publication stem from the use of a time-averaging technique.


Introduction
The persistency of excitation conditions appear in numerous applications related to system identification, learning, adaptation, and parameter estimation. They guarantee the convergence of adaptation procedures based on the ideas of gradient and least-squares algorithms. An introduction to this topic can be found, e.g., in Chapter 2 of the book [13]. The classical version of the persistency of excitation conditions can be characterized in terms of the asymptotic stability of the linear system

ẋ = −P(t)x, P(t) = P(t)ᵀ ≥ 0, ∀ t ≥ 0. (1)

Namely, the linear system is uniformly asymptotically stable if P(t) is persistently exciting, i.e., there exist positive real numbers α, β, δ such that

αI ≤ ∫_t^{t+δ} P(s) ds ≤ βI, ∀ t ≥ 0, (2)

where I is the identity matrix. A detailed analysis of the conditions (2) can be found in many publications (see, e.g., [2], [4], [6], [7], [8], [9], [11], [13]). We refer to (2) as the classical persistency of excitation conditions. They impose restrictions that are uniform in time. Moreover, they tacitly demand exponential convergence of the corresponding adaptation procedures. On the other hand, given the wide range of applications, the classical conditions might be a burden for the solution of important problems. This publication presents new generalized persistency of excitation conditions that do not impose any such unnatural (uniform-in-time, exponential-convergence) restrictions. Moreover, the classical version easily follows from the new generalized conditions.
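As an illustration, the window integral in (2) can be checked numerically for a concrete matrix. The sketch below (our own example, not from the paper, assuming NumPy) uses the rank-one choice P(t) = u(t)u(t)ᵀ with u(t) = (sin t, cos t)ᵀ and estimates the constants in (2) over several window start times.

```python
import numpy as np

def u(t):
    return np.array([np.sin(t), np.cos(t)])

def pe_gram(t0, delta, n=2000):
    """Approximate the window integral of P(s) = u(s) u(s)^T over
    [t0, t0 + delta] by the midpoint rule."""
    s = t0 + (np.arange(n) + 0.5) * delta / n
    U = np.stack([u(si) for si in s])      # n x 2 samples of u
    return (U.T @ U) * delta / n           # Gram matrix of the window

delta = 2 * np.pi
# Smallest eigenvalue of the window integral over several start times t:
alphas = [np.linalg.eigvalsh(pe_gram(t, delta))[0]
          for t in np.linspace(0.0, 10.0, 11)]
print(min(alphas))   # close to pi: (2) holds here with alpha = beta = pi
```

For this P(t) and δ = 2π the window integral is exactly πI for every t, so both bounds in (2) are met uniformly in t.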
To create the generalized version of persistency of excitation, this paper uses an approach similar in spirit to the time averaging developed in [1] and a technique originating in the theory of linear boundary value problems for partial differential equations (see, e.g., [14]). In other words, we introduce a generalized definition of a solution for an ordinary differential equation. That definition follows the widely accepted ideology of distributions. Then we formulate our new necessary and sufficient conditions for the time-varying system (1) to be asymptotically stable. Finally, we formulate corollaries of the main result and present an example illustrating the generalized persistency of excitation conditions.
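A minimal scalar example (our own illustration, not taken from the paper) shows why the uniform-in-time requirement in (2) can be too restrictive: the origin may be attractive even when no uniform lower bound α > 0 exists.

```python
import numpy as np

# Scalar system x' = -P(t) x with P(t) = 1/(1+t): the exact solution is
# x(t) = x0/(1+t) -> 0, so the origin is attractive, yet the window
# integral ∫_t^{t+δ} P(s) ds = log((1+t+δ)/(1+t)) -> 0 as t -> ∞,
# so the uniform lower bound α > 0 required by (2) does not exist.
delta = 1.0

def window(t):
    return np.log((1 + t + delta) / (1 + t))   # closed form of the window integral

def x(t, x0=1.0):
    return x0 / (1 + t)                        # exact solution

print(window(0.0), window(1000.0))   # the lower bound collapses for large t
print(x(1000.0))                     # yet x(t, x0) still tends to 0
```

Convergence here is only algebraic, not exponential, which is exactly the kind of behavior the classical conditions rule out.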

Preliminaries
Consider the system

ẋ = −P(t)x, P(t) = P(t)ᵀ ≥ 0, ∀ t ≥ 0, (3)

where the inequality P(t) ≥ 0 is understood in the following sense. Given two n × n symmetric matrices A and B, we write A ≥ B if ⟨x, Ax⟩ ≥ ⟨x, Bx⟩ for all x ∈ Rⁿ. Throughout the paper we assume that Rⁿ is equipped with the scalar product ⟨·, ·⟩, and ‖x‖ denotes the magnitude of x, i.e., ‖x‖ = √⟨x, x⟩, where ⟨x, x⟩ is the scalar product of x with itself.
We assume that P(t) is a time-dependent L_{1,loc} matrix in the following sense: for any real numbers b > a and for any x, y ∈ Rⁿ we have

∫_a^b |⟨y, P(t)x⟩| dt < ∞.
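For instance (an illustration of ours, not from the paper), P(t) = t^{−1/2} I is unbounded near t = 0 yet locally integrable, so it qualifies as an L_{1,loc} matrix. A quick numerical check:

```python
import numpy as np

# P(t) = t^(-1/2) I blows up as t -> 0+, but ∫_0^1 t^(-1/2) dt = 2 < ∞,
# so |<y, P(t)x>| is integrable on bounded intervals: P is L_{1,loc}.
n = 1_000_000
t = (np.arange(n) + 0.5) / n          # midpoints avoid the singularity at 0
integral = np.sum(t ** -0.5) / n      # midpoint rule for ∫_0^1 t^(-1/2) dt
print(integral)                       # close to the exact value 2
```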
Consider the initial value problem

ẋ = −P(t)x, x(0) = x_0, (4)

where P(t) ∈ L_{1,loc}. Its solution is defined to be an L_{1,loc} vector function x(t) such that for any infinitely differentiable function φ(t) ∈ C^∞ (both x(t) and φ(t) take their values in Rⁿ) with compact support (that means φ(t) = 0 outside an interval of R) we have

∫ ⟨x(t), φ̇(t)⟩ dt = ∫ ⟨P(t)x(t), φ(t)⟩ dt. (5)

It is well known (see, e.g., [3], [12]) that the solution x(t, x_0) of (4) exists and is unique. Moreover, it is a continuous function of time. Indeed, consider Picard's sequence

y_n(t) = x_0 − ∫_0^t P(s) y_{n−1}(s) ds,

where y_0(t) = x_0 and n = 1, 2, . . . . Since P(t) ∈ L_{1,loc}, Picard's sequence {y_n(t)}_n converges (pointwise) to a continuous function y(t). Let us show that y(t) satisfies (5). Integrating by parts,

∫ ⟨y_n(t), φ̇(t)⟩ dt = ∫ ⟨P(t)y_{n−1}(t), φ(t)⟩ dt.

Since both y_n(t) and y_{n−1}(t) converge to y(t) as n → ∞, we arrive at (5). Hence, y(t) = x(t, x_0) is the solution of (4) in the sense of (5). The goal of this paper is to find necessary and sufficient conditions for the solution x(t, x_0) of the system (4) to satisfy x(t, x_0) → 0 as t → ∞ with the equilibrium x = 0 being stable.
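The Picard construction above can be sketched numerically. In this illustration (ours, assuming NumPy) we take the scalar case P(t) ≡ 1, for which the iterates y_n are the partial sums of the series for the exact solution x(t) = x_0 e^{−t}:

```python
import numpy as np

# Picard's sequence y_n(t) = x0 - ∫_0^t P(s) y_{n-1}(s) ds in the scalar
# case P(t) ≡ 1, where the exact solution of (4) is x(t) = x0 * exp(-t).
x0 = 1.0
t = np.linspace(0.0, 2.0, 201)
h = t[1] - t[0]
P = np.ones_like(t)

y = np.full_like(t, x0)                      # y_0(t) = x0
for _ in range(30):                          # Picard iterations
    integrand = P * y
    running = np.concatenate(
        ([0.0], np.cumsum((integrand[1:] + integrand[:-1]) / 2) * h)
    )                                        # trapezoidal ∫_0^t P(s) y_{n-1}(s) ds
    y = x0 - running                         # y_n(t)

err = np.max(np.abs(y - x0 * np.exp(-t)))
print(err)   # small: the iterates converge to the continuous solution
```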

Necessary and sufficient conditions
Our main goal is to study the asymptotic stability of the origin for the system (3). Consider a positive real number S and a continuous real function ω_S(t) such that ω_S(t) > 0 for 0 ≤ t < S and ω_S(t) = 0 for t ≥ S.
We also assume that ω_S(t) is differentiable almost everywhere on R. For the sake of brevity, we refer to ω_S(t) as a truncation function in the sequel. If the origin of the system (3) is asymptotically stable, then ‖x(t, x_0)‖ → 0 as t → ∞. Hence, for any fixed positive real number S we have

∫_0^∞ ω_S(t − τ) ‖x(t, x_0)‖² dt → 0 as τ → ∞.

Consider more closely the integral

∫_0^∞ ω_S(t − τ) ‖x(t, x_0)‖² dt.

If we find conditions that guarantee that this integral tends to zero as τ → ∞, then that will imply the asymptotic stability of the origin for the system (3).
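As a numerical illustration (our own; ω_S(t) = (S − t)² on [0, S), extended by zero elsewhere, is one choice of truncation function made here for the sketch), the truncated average of ‖x(t, x_0)‖² indeed decays along solutions of (3) for a persistently exciting P(t):

```python
import numpy as np

# One illustrative truncation function: ω_S(t) = (S - t)^2 on [0, S),
# zero elsewhere.
S = 1.0
def omega(t):
    return np.where((t >= 0.0) & (t < S), (S - t) ** 2, 0.0)

# Euler simulation of (3) with the persistently exciting rank-one matrix
# P(t) = u(t) u(t)^T, u(t) = (sin t, cos t)^T.
dt, T = 1e-3, 40.0
ts = np.arange(0.0, T, dt)
norms = np.empty(len(ts))
x = np.array([1.0, 1.0])
for i, t in enumerate(ts):
    norms[i] = x @ x                         # ||x(t, x0)||^2
    u = np.array([np.sin(t), np.cos(t)])
    x = x - dt * u * (u @ x)                 # x' = -P(t) x, with P(t) x = u (u^T x)

def J(tau):
    """Truncated time average  ∫_0^∞ ω_S(t - τ) ||x(t, x0)||^2 dt."""
    return np.sum(omega(ts - tau) * norms) * dt

print(J(0.0), J(30.0))   # the average decays as the window moves to the right
```

The hypothetical name `J(τ)` is ours; the paper works with the displayed integral directly.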
Integrating by parts then leads us to the following important formula for this integral, with a matrix A(t, τ) appearing in the result. Let λ_min(t, τ), λ_max(t, τ) denote the minimal and maximal eigenvalues of A(t, τ). Then the next theorem gives us necessary and sufficient conditions for the system (3) to be asymptotically stable at the origin. Notice that the assumption P(t) ∈ L_{1,loc} guarantees that the integral expressions in the next theorem are well defined.

Proof. It follows from (6) and the monotonicity of ‖x(t, x_0)‖² that the well-known Gronwall inequality (see, e.g., [10]) can be applied for t ≥ S. After solving the resulting inequality we obtain the claimed estimate; when the conditions of the theorem are violated, it follows that (3) is not asymptotically stable. Q.E.D.
Notice that we owe the success in proving Theorem 3.1 to the new idea of considering the truncated time average of ‖x(t, x_0)‖² rather than ‖x(t, x_0)‖² itself. This approach seems to have further important consequences not only for control theory but also for the study of general dynamical systems.
Then we can reformulate Theorem 3.1 as follows: if its condition fails, the system (3) is not asymptotically stable at the origin.
Though Corollary 3.3 gives us only a sufficient condition for asymptotic stability of the system (3), its simple form makes it valuable for practical applications.
To conclude this section, we present a simple and elegant proof of the classical persistency of excitation conditions.

Corollary 3.4 (Classical Persistency of Excitation). If there exist real numbers α > 0, δ > 0 such that

∫_0^δ P(t + s) ds ≥ αI, ∀ t ≥ 0, (10)

then the system (3) is asymptotically stable.
Proof. It follows from (10) that the minimal eigenvalue γ_min(ν, τ) from Corollary 3.3 admits a positive lower bound. Hence, if we take S > δ, the condition of Corollary 3.3 is satisfied. Q.E.D.
Hence, Corollary 3.3 shows that every classically persistently exciting system is asymptotically stable.
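As a quick numerical sanity check (our own example), condition (10) can be verified for the scalar choice P(t) = sin²(t): with δ = π the window integral equals π/2 for every t, so (10) holds with α = π/2.

```python
import numpy as np

# Scalar check of (10) for P(t) = sin^2(t): with δ = π the window integral
# ∫_0^δ P(t+s) ds equals π/2 for every t, so (10) holds with α = π/2 and
# Corollary 3.4 yields asymptotic stability of x' = -sin^2(t) x.
delta = np.pi

def window_integral(t, n=4000):
    s = (np.arange(n) + 0.5) * delta / n     # midpoint rule on [0, δ]
    return np.sum(np.sin(t + s) ** 2) * delta / n

vals = [window_integral(t) for t in np.linspace(0.0, 20.0, 21)]
print(min(vals), max(vals))   # both ≈ π/2: the bound is uniform in t
```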