On the Asymptotical and Practical Stability of Stochastic Control Systems

Sufficient conditions for the asymptotical and practical stability in probability of stochastic control systems by means of feedback laws are provided. The main results of this work enable us to derive sufficient conditions for the existence of control Lyapunov functions, which play a leading role in the existence of stabilizing feedback laws. In particular, sufficient conditions for practical stability in probability are established, and numerical examples are given to illustrate the usefulness of our results.


Introduction
The stabilization of various types of linear and nonlinear systems has been widely studied in the past years (see, for instance, Karafyllis and Tsinias [1], Phat et al. [2], Thuan et al. [3], Bay et al. [4], and Trinh and Fernando [5]). In these papers, the authors derived Lyapunov-Krasovskii functionals and established necessary and sufficient conditions for robust global asymptotic stability and robust H∞ stability at the equilibrium state of linear and nonlinear systems.
Stabilization of stochastic control systems (SCSs) by means of state feedback laws is important in control theory. The stochastic version of the Lyapunov theorem has been used to derive necessary and sufficient conditions for the stabilization of SCSs at their equilibrium state. In recent years, the stabilizability of various types of SCSs has been studied for different concepts of stochastic stability (see, for instance, [6-13]).
Florchinger [6] established necessary and sufficient conditions for the asymptotic stability in probability of SCSs at their equilibrium state. Under these conditions, Deng and Krstić [7] designed an inverse optimal control law for strict-feedback systems which guarantees global asymptotic stability in probability (GASP). Moreover, Deng et al. [8] developed the notion of uniform in time GASP for the problem of feedback stabilization of a class of SCSs. On the other hand, the design and analysis of controllers for SCSs have been addressed by Xie and Tian [9], Pan and Başar [10], Lin et al. [11], and Abedi et al. [13]. Necessary and sufficient conditions were derived by Abedi et al. [12] for nonuniform in time GASP for a class of SCSs.
Thus, our aim in this paper is to explore further the asymptotical stabilization in probability problem for a larger class of SCSs than those described in [6, 13]. This class of SCSs can be characterized in terms of computable control Lyapunov functions (CLFs) which depend on the system's coefficients. In addition, the paper is also intended to fill the gap in the previous works by establishing sufficient conditions for practical stability in probability for this broader class of SCSs. Our main result, Theorem 14, which is an extension of Theorem 2.1 established in Tsinias [14], asserts that this broader class of SCSs is asymptotically stable in probability (ASP) if it admits a CLF at the origin. In addition, we obtain a computable stabilizing feedback law in Propositions 15 and 16 and Theorems 18 and 19, which are extensions of Propositions 1.5 and 2.3 and Theorems 2.4 and 2.6, respectively, proved by Tsinias [14] for deterministic control systems, to the larger class of SCSs driven by a Wiener process. For some background related to this paper, readers may refer to the paper by Florchinger [6], which gives necessary and sufficient conditions for ASP of a special case of our SCSs. Both the results and the methods used in this paper, however, are different from those in the references. The main tools used in this paper are, indeed, the stochastic versions of converse Lyapunov theorems established in Kushner [15].
The paper is organized as follows. In Section 2, we introduce the class of stochastic systems, some basic definitions, and results that we deal with in this paper. Section 3 describes a broader class of SCSs and focuses on the properties of the stochastic CLF, which plays an important role in stabilization theory. In Section 4, we state and prove the main results of the paper on the asymptotical and practical stabilization of the larger class of SCSs. Finally, in Section 5, we provide some numerical examples to validate our results.

Stochastic Stability
In the following we introduce the class of stochastic systems and recall some definitions of ASP and GASP that we are dealing with in the rest of the paper.
A detailed exposition on the subject can be found in the books of Speyer and Chung [16], Has'minskiȋ [17], and also the paper by Abedi et al. [13].
Let (Ω, 𝓕, P) be a complete probability space, and denote by (w_t)_{t≥0} a standard ℝ^m-valued Wiener process defined on this space.
Consider the stochastic process x_t ∈ ℝ^n solution of the Itô stochastic differential system

dx_t = f(x_t) dt + Σ_{i=1}^{m} h_i(x_t) dw_t^i, (1)

where (i) x_0 ∈ ℝ^n is given, and (ii) f : ℝ^n → ℝ^n and h_i : ℝ^n → ℝ^n, 1 ≤ i ≤ m, are locally Lipschitz mappings with f(0) = 0, h_i(0) = 0, and there exists a constant K ≥ 0 such that for any x ∈ ℝ^n the following linear growth condition holds:

‖f(x)‖ + Σ_{i=1}^{m} ‖h_i(x)‖ ≤ K (1 + ‖x‖).

Notations. Throughout this paper we adopt the following notation.
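Systems of the form (1) are straightforward to explore numerically. The sketch below simulates a scalar Itô SDE with the Euler-Maruyama scheme; the drift f(x) = −x and diffusion h(x) = 0.5x are hypothetical choices (not taken from the paper) that satisfy f(0) = h(0) = 0 and the linear growth condition.

```python
import numpy as np

def euler_maruyama(f, h, x0, T=2.0, dt=1e-3, n_paths=200, seed=0):
    """Simulate paths of the scalar Ito SDE dx = f(x) dt + h(x) dw
    with the Euler-Maruyama scheme; returns the terminal states."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, float(x0))
    for _ in range(int(T / dt)):
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        x = x + f(x) * dt + h(x) * dw
    return x

# Hypothetical coefficients satisfying f(0) = 0, h(0) = 0 and linear growth;
# here 2a + sigma^2 = -1.75 < 0, so the second moment decays in time.
final = euler_maruyama(f=lambda x: -x, h=lambda x: 0.5 * x, x0=1.0)
print(np.mean(final**2))  # sample second moment at time T
```

The scheme only approximates the Itô solution to strong order 1/2, which is sufficient for the qualitative stability checks used later in the paper.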
(i) For any s ≥ 0 and x ∈ ℝ^n, x_t^{s,x}, where s ≤ t, denotes the solution of the Itô stochastic differential system (1) starting from the state x at time s.
(iii) For any x ∈ ℝ^n, x^T denotes its transpose.

Definition 1. The equilibrium x_t = 0 of the system (1) is

(i) stable in probability, if for every s ≥ 0 and ε > 0,

lim_{x→0} P( sup_{t≥s} ‖x_t^{s,x}‖ > ε ) = 0;

(ii) GASP, if it is globally stable in probability and, for every s ≥ 0 and x ∈ ℝ^n,

P( lim_{t→+∞} x_t^{s,x} = 0 ) = 1.

Preliminary Results
In this section, we introduce a broader class of SCSs and focus on the properties of the CLF and the control Lyapunov family, which play an important role in the asymptotical and practical stability in probability, respectively, studied in Section 4. A detailed exposition on the subject can be found in the paper of Abedi et al. [13].
Let (Ω, 𝓕, P) be a complete probability space, and denote by (w_t)_{t≥0} a standard ℝ^m-valued Wiener process defined on this space.
Consider the stochastic process x_t ∈ ℝ^n solution of the SCS

dx_t = ( f(x_t) + Σ_{i=1}^{p} u_i g_i(x_t) ) dt + Σ_{i=1}^{m} ( h_i(x_t) + Σ_{j=1}^{p} u_j g_{i,j}(x_t) ) dw_t^i, (7)

where u = (u_1, . . ., u_p) is an ℝ^p-valued control law and f, g_i, h_i, and g_{i,j} are locally Lipschitz mappings. The SCS (7) is said to be asymptotically stabilizable in probability at the origin if there exists a feedback law u = k(x) such that (i) the solution of the resulting closed-loop system

dx_t = ( f(x_t) + Σ_{i=1}^{p} k_i(x_t) g_i(x_t) ) dt + Σ_{i=1}^{m} ( h_i(x_t) + Σ_{j=1}^{p} k_j(x_t) g_{i,j}(x_t) ) dw_t^i (9)

is uniquely defined, and (ii) the equilibrium solution x_t ≡ 0 of the closed-loop system (9) is ASP.
Moreover, the SCS (7) is said to be practically stabilizable in probability at the origin by means of the family of feedback laws {k_ε, ε > 0} if, for any sufficiently small ε and for any x_0 near the open sphere S(0, ε) of radius ε around the origin, the corresponding trajectory x_t(k_ε, x_0) of the resulting closed-loop system enters S(0, ε) after some time t = T < +∞ and stays in this region thereafter. The concept of practical stability was first introduced by La Salle and Lefschetz [18].

Denote by D the infinitesimal generator of the stochastic process solution of the uncontrolled part of the SCS (7); that is, D is the second-order differential operator defined for any function Φ in C²(ℝ^n, ℝ) by

DΦ(x) = ∇Φ(x) · f(x) + (1/2) Σ_{i=1}^{m} h_i(x)^T ∇²Φ(x) h_i(x). (10)

For any i ∈ {1, . . ., p}, let D_i be the second-order differential operator defined for any function Φ in C²(ℝ^n, ℝ) by

D_i Φ(x) = ∇Φ(x) · g_i(x) + Σ_{j=1}^{m} h_j(x)^T ∇²Φ(x) g_{j,i}(x), (11)

and, for any i, j ∈ {1, . . ., p}, D_{i,j} denotes the second-order differential operator defined for any function Φ in C²(ℝ^n, ℝ) by

D_{i,j} Φ(x) = (1/2) Σ_{k=1}^{m} g_{k,i}(x)^T ∇²Φ(x) g_{k,j}(x). (12)

We also denote by D_0 the infinitesimal generator of the stochastic process solution of the closed-loop system (9); that is, D_0 is the differential operator defined for any function Φ in C²(ℝ^n, ℝ) by

D_0 Φ(x) = DΦ(x) + Σ_{i=1}^{p} k_i(x) D_i Φ(x) + Σ_{i,j=1}^{p} k_i(x) k_j(x) D_{i,j} Φ(x). (13)

Using these notations, we recall some notions of CLF as follows.
Definition 4. The SCS (7) is said to satisfy a stochastic Lyapunov condition at the origin if there exist a neighborhood N of the origin in ℝ^n, a C² positive definite function Φ : N → ℝ_+, and a positive definite function W such that for all x ∈ N − {0} the following condition holds:

inf_{u ∈ ℝ^p} [ DΦ(x) + Σ_{i=1}^{p} u_i D_i Φ(x) + Σ_{i,j=1}^{p} u_i u_j D_{i,j} Φ(x) ] ≤ −W(x). (15)

Definition 5. A continuously differentiable real function Φ is called a CLF if it is positive definite and satisfies condition (15).

Definition 6. The CLF Φ is said to satisfy the bounded control property if there exists a positive real function ρ : N → ℝ such that ρ is bounded on N and for every x ∈ N − {0} there exists a control u ∈ ℝ^p satisfying the following inequalities:

‖u‖ ≤ ρ(x), (16)

DΦ(x) + Σ_{i=1}^{p} u_i D_i Φ(x) + Σ_{i,j=1}^{p} u_i u_j D_{i,j} Φ(x) ≤ −W(x). (17)

If, in addition, lim_{x→0} ρ(x) = 0, then Φ is said to satisfy the small control property.
A useful tool for studying the stabilizability of SCSs is the stochastic version of Artstein's theorem [19], established by Florchinger [6] and given in the following theorem.

Theorem 7. (i) The SCS (7) satisfies a stochastic Lyapunov condition at the origin if and only if it is asymptotically stabilizable by means of a feedback law u = k(x) which is smooth in a neighborhood of the origin except possibly at x = 0.
(ii) The corresponding CLF Φ satisfies the bounded control property if and only if there exists a stabilizer u = k(x), smooth in a neighborhood N of the origin except possibly at zero, satisfying

‖k(x)‖ ≤ ρ(x),

where ρ is defined in (16). This implies that the function k is bounded in a neighborhood of the origin and, if in addition Φ satisfies the small control property, then k(x) → 0 as x → 0.
In the following we introduce the stochastic version of the Florchinger control law [20] established by Abedi et al. [12], which gives the GASP property for the resulting closed-loop system (9).

Theorem 8. Consider the SCS (7) and its corresponding closed-loop system (9). Let Φ be a CLF associated with the system, and for any x ∈ ℝ^n consider the function defined by (19). Then the feedback law (20), built from (19), guarantees that the resulting closed-loop system (9) is GASP.
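The explicit formulas (19)-(20) do not survive in this copy. For orientation, the deterministic single-input version of the universal formula underlying such laws (Sontag's formula) can be sketched as follows; the names a and b, standing for the drift and control Lie derivatives of Φ, are chosen here purely for illustration.

```python
import math

def sontag_feedback(a, b):
    """Sontag's universal formula (deterministic, single input):
    given a = L_f Phi(x) and b = L_g Phi(x), return u such that
    a + b*u < 0 whenever b != 0; u = 0 on the set {b = 0}."""
    if b == 0.0:
        return 0.0
    return -(a + math.sqrt(a * a + b ** 4)) / b

# Closed-loop decrease check: a + b*u = -sqrt(a^2 + b^4) < 0 when b != 0.
u = sontag_feedback(1.0, 2.0)
print(1.0 + 2.0 * u)
```

When the CLF satisfies the small control property, this formula is continuous at the origin, which mirrors the smoothness-except-at-zero statements of Theorems 7 and 8.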
Under a slight change of assumption on the small control family property introduced by Tsinias [14], we give the stochastic definitions of the small control family as follows.
Definition 9. The SCS (7) is said to satisfy a stochastic practical Lyapunov condition at the origin if there exist compact neighborhoods N and N′ of the origin in ℝ^n such that S(0, ε) ⊂ N′, a family of C² positive definite functions Φ_ε : N′ → ℝ, ε > 0, and a positive definite function W such that, for any sufficiently small ε, condition (15) holds with Φ = Φ_ε and

min {Φ_ε(x), x ∈ ∂N′} > max {Φ_ε(x), x ∈ S(0, ε)}, (22)

where ∂N′ is the boundary of N′. Moreover, for any sequences {ε_k} ⊂ ℝ_+ and {x_k} ⊂ N with lim ε_k = 0 and lim x_k = x as k → +∞, condition (23) holds.

Definition 10. A continuously differentiable real function Φ_ε is regarded as a member of a control Lyapunov family if it is positive definite and satisfies the conditions given in Definition 9.
Definition 11. The control Lyapunov family {Φ_ε} is said to satisfy the stochastic bounded control property if there exist a positive real function ρ : N → ℝ such that ρ is bounded on N and a family of nonnegative real numbers ν_ε, ε > 0, with ν_ε → 0 as ε → 0, such that for any sufficiently small ε conditions (17), (22), and (23) hold with Φ = Φ_ε, x ∈ N_2 − N_1, and for some u, where

‖u‖ < ρ(x) + ν_ε. (24)

If, in addition, ρ(x) → 0 as x → 0, then the control Lyapunov family {Φ_ε} is said to satisfy the stochastic small control property.
The following theorem presents sufficient conditions for practical stabilization. Its proof is similar to that of Theorem 7.
Theorem 12. (i) If the SCS (7) satisfies the stochastic practical Lyapunov condition, then it is practically stabilizable in probability at the origin by means of a family of smooth feedback laws u = {k_ε(x)}.
(ii) If the control Lyapunov family {Φ_ε} satisfies the stochastic bounded (small) control property, then for any ε the corresponding stabilizer k_ε satisfies the following inequality:

‖k_ε(x)‖ ≤ ρ(x) + ν_ε,

where ρ and ν_ε are those given in Definition 11.
Remark 13. A direct consequence of Definitions 4 and 9 is that if the SCS (7) satisfies the stochastic Lyapunov condition, then it also satisfies the stochastic practical Lyapunov condition.
In the methodology used in this paper, we will consider the SCS (7) in the differential form (26), where the coefficient functionals are C^∞ and vanish at the origin, and we will derive sufficient conditions for the existence of a CLF guaranteeing stabilization (Theorem 14) in the form of (26). We extend the decomposition used by Tsinias [14] in the deterministic case to the stochastic case. Thus, the study of asymptotical and practical stability in probability at the equilibrium state of the SCS (7) can be replaced by the study of asymptotical and practical stability in probability at the equilibrium state of the SCS (26). Special emphasis is given to asymptotical and practical stabilization in probability for the SCS (26) whose control terms are constants; that is, the system has the form (27), where x = (x_1, x_2) ∈ ℝ^{n_1} × ℝ^{n_2} and u is an ℝ^{n_2}-valued control law.

Main Results
The aim of this section is to derive sufficient conditions for the existence of CLFs, which play a leading role in the existence of stabilizing feedback laws that are smooth except possibly at the equilibrium state of the system (Theorem 14). The formulas for the special case of the SCS (7) with unity intensity noise and vanishing g_{i,j}(x), without the term h_i(x), were given by Abedi et al. [13], and those for system (7) affine in the noise and control u, where h_i(x) may be nonvanishing at the origin, without the term g_{i,j}(x)u_j, were given by Deng et al. [8]. In [12], Abedi et al. derived necessary and sufficient conditions for nonuniform in time GASP for system (7) in the case where h_i(x) and g_{i,j}(x) may be nonvanishing at the origin. Here, we extend Propositions 1.5 and 2.3 and Theorems 2.4 and 2.6 of Tsinias [14] in the deterministic case and obtain a computable stabilizing feedback law in Propositions 15 and 16 and Theorems 18 and 19, respectively, for the broader class of SCS (26).
Consider the lower-dimensional subsystems (28) and (29) of the SCS (26). Then we can present the following result, in which the stochastic Lyapunov condition for the SCS (26) is characterized in terms of appropriate positive functions for the subsystems (28) and (29).
Theorem 14. Suppose that there exist neighborhoods N and N_1 of the origin in ℝ^n and ℝ^{n_1}, respectively, and mappings k : N → ℝ^p, φ : N_1 → ℝ^{n_2}, and μ : N → ℝ such that φ(0) = 0, where μ is continuous and φ is continuously differentiable, and denote, for x in a neighborhood of the origin in ℝ^n, the function Φ accordingly; then the function Φ satisfies the small control property.
Proof. Part (i): since the equilibrium solution x_1 ≡ 0 of the closed-loop system is asymptotically stable, the converse Lyapunov theorem of Kushner [15] asserts that there exists a smooth Lyapunov function V : ℝ^{n_1} → ℝ such that V(0) = 0, V(x_1) > 0, and HV(x_1) < 0 for x_1 ≠ 0 near zero, where H denotes the infinitesimal generator of the stochastic process solution of the closed-loop system (33). Obviously, by the assumption of the theorem, the function Φ is positive definite. For any x ≠ 0 at which the derivative of Φ with respect to x_2 annihilates the control terms, it follows from the assumption that x_2 = φ(x_1). Hence GΦ(x) < 0, where G is the infinitesimal generator of the stochastic process solution of the uncontrolled part of the SCS (26). Therefore, the SCS (26) satisfies a stochastic Lyapunov condition at the origin, and GΦ(x) < 0 for all x ≠ 0 near zero. Inequality (46), in conjunction with the condition ∇_2 V(x_1, φ(x_1)) = 0, gives the required bound. Since k is bounded and φ is continuous with φ(0) = 0, it follows that Φ satisfies the small control property.
Proposition 15. If the SCS (7) is asymptotically stabilizable in probability by means of a feedback u = k(x) which is continuous in a neighborhood of the origin, then it is practically stabilizable in probability by means of a family of smooth feedbacks u = {k_ε(x)}.
Proof. The proof of this proposition is a direct consequence of Theorems 7(i) and 12 and Remark 13. Indeed, suppose that the SCS (7) is asymptotically stabilizable in probability; then, according to Theorem 7(i), the SCS (7) satisfies the stochastic Lyapunov condition at the origin. Therefore, by Remark 13, the SCS (7) satisfies the stochastic practical Lyapunov condition at the origin. Hence, the desired practical stability in probability for the SCS (7) is assured by Theorem 12.
In the following, we customize Theorem 14 for a particular case of the SCS (27). Assume, without loss of generality, that the corresponding constant control coefficient vanishes. In this case, the SCS (27) takes the form (48), where u is an ℝ^{n_2}-valued control law and the remaining control coefficients are constant. The following proposition, which is an immediate consequence of Theorem 14 and is also an extension of Proposition 2.3 established in [14], provides a CLF for the SCS (48) that satisfies the small control property. In the proof of this proposition we use the stochastic version of the converse Lyapunov theorem established in [15].

Proposition 16. Consider the SCS (48) and suppose that (28) is asymptotically stabilizable in probability by means of the feedback law u = φ(x_1), φ(0) = 0, which is continuously differentiable in a neighborhood of the origin. Then (48) satisfies the stochastic Lyapunov condition and the corresponding CLF satisfies the small control property.
Proof. Since the equilibrium solution x_1 ≡ 0 of the closed-loop system is asymptotically stable, the converse Lyapunov theorem of Kushner [15] asserts that there exists a smooth Lyapunov function V : ℝ^{n_1} → ℝ defined in a neighborhood N of the origin in ℝ^{n_1}. Obviously, V is nonnegative definite and its derivative with respect to x_2 is continuously differentiable in a neighborhood of the origin. Defining a suitable functional W on ℝ^n, we obtain a decomposition in which k is uniformly bounded on ℝ^n and W satisfies the properties in Theorem 14(i). In particular, (32) holds with μ = 1. Therefore, the function Φ = V + W satisfies the small control property.
As in the deterministic case (see Tsinias [14]), we can easily establish the following elementary result, which is a useful tool in proving some of our results (Theorem 18).

Lemma 17. Let φ be a mapping that satisfies the Lipschitz condition on a compact neighborhood N of the origin in ℝ^n, with φ(0) = 0. Then there exist a neighborhood N_1 of the origin in ℝ^n, a constant M > 0, and a family of smooth mappings φ_ε : N_1 → ℝ^n, ε > 0, such that for every sufficiently small ε the stated approximation conditions hold.

The following theorem is an extension of Theorem 2.4 established in [14] and provides a control Lyapunov family for the SCS (48) that depends directly on the dynamics of the SCS (33). The proof of this theorem is the stochastic analogue of the deterministic case of Tsinias [14], where the author used the converse Lyapunov theorem given in Massera [21, 22] to establish his result. In the proof of the following theorem we use the stochastic version of the converse Lyapunov theorem established in Kushner [15].

Theorem 18. Suppose that the SCS (33) is asymptotically stabilizable in probability by means of the feedback law u = φ(x_1) with φ(0) = 0.
(i) If  is continuous at the origin, then SCS (48) satisfies the practical Lyapunov condition.
(ii) If  is Lipschitz continuous, then the SCS (48) satisfies the practical Lyapunov condition and, additionally, the corresponding control Lyapunov family satisfies the stochastic small control property.
Proof. We prove only statement (ii) of this theorem; the proof of the first part can be obtained by a similar argument and is therefore omitted. Suppose that the SCS (33) is stabilizable in probability by means of the feedback law u = φ(x_1) with φ(0) = 0. Then the converse Lyapunov theorem proved by Kushner [15] asserts that there exist a smooth Lyapunov function V defined on a compact neighborhood N of ℝ^{n_1} and a positive definite, strictly increasing continuous function γ : ℝ_+ → ℝ_+ with γ(0) = 0 such that the decrease condition holds for any x_1 ∈ N. Let M > 0 and N_1 ⊂ N be as defined in Lemma 17, where φ is the above Lipschitzian stabilizer defined on N and taking values in ℝ^{n_2}. Consider positive constants M_1 and M_2 such that ‖φ(x_1)‖ < M_1 for x_1 ∈ N_1 and ‖∇_1 V(x)‖ < M_2 for every x in the relevant compact set. For any ε > 0 such that the sphere S(0, ε) is contained in N, consider a positive δ = δ(ε) < 1 such that the corresponding set is contained in S(0, ε). Since φ(0) = 0 and h(0) = 0, there exist nonnegative constants with the required properties. Denoting by D_0 the infinitesimal generator of the stochastic process solution of the resulting closed-loop system deduced from the SCS (48) with the function φ_ε(x) given by (65), the desired estimate follows.
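A smoothing family of the kind asserted by Lemma 17 can be produced by mollification. The sketch below is a discrete, illustrative stand-in (Gaussian smoothing of a Lipschitz map on a grid; all numerical choices are hypothetical): in the continuum limit the smoothed maps are C^∞ and converge uniformly to φ as ε → 0.

```python
import numpy as np

def mollify(phi, eps, grid):
    """Smooth a Lipschitz map phi by convolution with a normalized
    Gaussian of width eps on a uniform grid (a discrete stand-in for
    the family phi_eps of Lemma 17)."""
    dx = grid[1] - grid[0]
    kernel_pts = np.arange(-4 * eps, 4 * eps + dx, dx)
    kernel = np.exp(-kernel_pts**2 / (2 * eps**2))
    kernel /= kernel.sum()
    # 'same'-mode convolution keeps the grid shape; boundary values are
    # only approximate, which is enough for this illustration.
    return np.convolve(phi(grid), kernel, mode="same")

grid = np.linspace(-1.0, 1.0, 2001)
phi = np.abs  # Lipschitz with constant 1 and phi(0) = 0
for eps in (0.1, 0.05, 0.01):
    smooth = mollify(phi, eps, grid)
    err = np.max(np.abs(smooth[500:1501] - phi(grid)[500:1501]))
    print(eps, err)  # uniform error on the interior shrinks with eps
```

The interior slice avoids the convolution's boundary effects, mirroring the fact that Lemma 17 only asserts the approximation on a neighborhood N_1 of the origin.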

Applications
In this section, we illustrate our results by giving three numerical examples.
where (w_t)_{t≥0} is a standard real-valued Wiener process and u is a real-valued measurable control law. Consider the following Lyapunov function candidate. After some simple calculations, we obtain (72). From (72) we get that condition (15) is satisfied, and thus the function Φ(x) is a CLF for the SCS (70). By Theorems 7 and 14(i), the SCS (70) is ASP by means of a feedback law which is smooth in a neighborhood of the origin in ℝ², except possibly at the origin.
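The displayed computations of this example are missing from this copy. As a sketch of how such a verification goes, consider a hypothetical scalar SCS dx_t = (x_t + u) dt + x_t dw_t with the candidate Φ(x) = x²/2 (all choices illustrative, not the system (70)). Using the generator notation of Section 3:

```latex
D\Phi(x) = x \cdot x + \tfrac{1}{2}\,x^{2} = \tfrac{3}{2}\,x^{2}, \qquad
D_{1}\Phi(x) = x, \qquad
D_{0}\Phi(x)\big|_{k(x) = -2x} = \tfrac{3}{2}\,x^{2} - 2x^{2} = -\tfrac{1}{2}\,x^{2} < 0 .
```

Since inf_u [DΦ(x) + u D_1Φ(x)] = −∞ < 0 for x ≠ 0, condition (15) holds, and the explicit law k(x) = −2x makes the closed-loop generator negative definite.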
Finally, we may invoke Theorem 8 to find an explicit formula for a feedback law. Indeed, by (20) and (72) we obtain a feedback law that guarantees that the equilibrium solution of the resulting closed-loop system deduced from the SCS (70) satisfies the ASP property.
Example 21. Let x(t) ∈ ℝ² be the solution of the SCS (75), where (w_t)_{t≥0} is a standard real-valued Wiener process, u is a real-valued measurable control law, and φ_2(x_1) is a smooth functional such that x_1 φ_2(x_1) > 0 for any x_1 ∈ ℝ, x_1 ≠ 0. Obviously, the SCS (75) has the form (26), and the function Φ defined on ℝ² is a CLF for the SCS (75).

Example 22. Let x(t) ∈ ℝ² be the solution of the SCS (82), where (w_t)_{t≥0} is a standard real-valued Wiener process and u is a real-valued measurable control law. Obviously, the SCS (82) has the form of (48), and the stochastic system is stabilized by the continuous law u = −(2x_1)^{1/3}. Therefore, according to Theorem 19(i), the SCS (82) is practically stabilizable in probability at the origin by means of a family of smooth feedback laws.
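The closed-loop behavior claimed in these examples can also be checked by simulation. Since the displayed systems (70)-(82) are not fully legible in this copy, the sketch below uses a stand-in scalar SCS dx = (x + u) dt + x dw with the Lyapunov-based law u = −2x (all choices hypothetical): for V(x) = x², D_0V = 2x(x − 2x) + x² = −x² < 0, so the origin of the closed loop is ASP, while the uncontrolled drift is unstable.

```python
import numpy as np

def closed_loop(x0, k, T=5.0, dt=1e-3, n_paths=100, seed=1):
    """Euler-Maruyama paths of dx = (x + k(x)) dt + x dw: a stand-in
    controlled scalar SDE, not one of the paper's examples."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, float(x0))
    for _ in range(int(T / dt)):
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        x = x + (x + k(x)) * dt + x * dw
    return x

stabilized = closed_loop(1.0, k=lambda x: -2.0 * x)   # D0 V = -x^2 < 0
uncontrolled = closed_loop(1.0, k=lambda x: 0.0 * x)  # open loop is unstable
print(np.mean(stabilized**2), np.mean(uncontrolled**2))
```

Comparing the two terminal second moments gives a quick numerical witness of the stabilizing effect of the feedback.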

Conclusion
In this paper, we have studied the problem of asymptotical and practical stabilization in probability of SCSs when both the drift and diffusion terms are affine in the control. We have used the stochastic version of Artstein's theorem and extended the asymptotical and practical stabilization results proved by Tsinias [14] to a larger class of SCSs driven by a Wiener process. Moreover, sufficient conditions for the existence of a CLF and a control Lyapunov family, which play a leading role in the existence of stabilizing feedback laws, are derived. Several numerical examples are given to validate our results. Finally, we summarize our main results as follows.
(1) The stability results provided in this paper are an extension of the deterministic results stated in Tsinias [14]. Indeed, we obtained a computable stabilizing feedback law in Propositions 15 and 16 and Theorems 18 and 19, which are extensions of Propositions 1.5 and 2.3 and Theorems 2.4 and 2.6, respectively, proved by Tsinias [14] for deterministic control systems, to the larger class of SCSs driven by a Wiener process.
(3) The stabilizability results provided for stochastic control systems in [6, 12, 13] do not permit us to draw a conclusion about practical stability in probability (Theorem 18), whereas the results of this paper remain valid.

Example 20. Denote by x(t) ∈ ℝ³ the solution of the SCS (70).

Theorem 19. Assume that, for any x in a neighborhood of the origin in ℝ^n, μ(x) ≥ 0 and N_2 ⊂ N_1 = N, and that the SCS (28) is ASP by means of the feedback law u = φ(x_1).
(i) If φ is continuous at the origin, then the SCS (48) is practically stabilizable in probability at the origin.
(ii) If φ is Lipschitz continuous, then the SCS (48) is practically stabilizable in probability at the origin, and the corresponding control Lyapunov family satisfies the stochastic small control property.

(Continuation of Example 21.) Using the condition x_1 φ_2(x_1) > 0, the latter equality implies GΦ(x) < 0. Thus, the function Φ is a CLF for the SCS (75). In fact, the feedback law u = −φ_2(x_1) asymptotically stabilizes

dx_1 = (−x_1 + u) dt + x_1 dw_t (81)

at zero. Therefore, by Theorems 7 and 14(i), system (75) is ASP by means of a feedback law which is smooth in a neighborhood of the origin in ℝ², except possibly at the origin.