Controllability of Nonlinear Impulsive Stochastic Evolution Systems Driven by Fractional Brownian Motion

Impulsive differential systems are valuable tools for modelling processes in which states change abruptly at certain moments of time, arising in fields such as engineering, physics, and economics; see [1-3]. It is well known that evolution differential system theory is a generalization of the classical theory, so some partial differential systems can be converted into abstract evolution systems by semigroup techniques. Researchers can then study the properties of these partial differential systems by classical differential theory; for more details, see [4]. The purpose of this paper is to discuss the controllability of impulsive stochastic evolution systems driven by fractional Brownian motion of the following form:
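The display itself did not survive extraction. A hedged sketch of such a system, consistent with the later discussion (a family A(t) generating an evolution system U(t, s), a control term Bu, fBm noise B^H, and impulses at times t_k), is the following; the specific nonlinearity f, noise coefficient σ, and impulse maps I_k are illustrative assumptions rather than the authors' exact equation:

```latex
dx(t) = \bigl[ A(t)x(t) + Bu(t) + f(t, x(t)) \bigr]\,dt + \sigma(t)\,dB^{H}(t),
    \qquad t \in J = [0, T],\ t \neq t_k,
\Delta x(t_k) = I_k\bigl(x(t_k^-)\bigr), \qquad k = 1, 2, \ldots, m,
x(0) = x_0.
```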


Introduction
It is well known that the noise or perturbations of a stochastic system are typically modeled by Brownian motion, a Gauss-Markov process with independent increments. However, many researchers have found that standard Brownian motion is not an effective model for many physical phenomena. A family of processes that appears to have wide physical applicability is fractional Brownian motion (fBm). This process was first introduced by Kolmogorov in 1940, and Mandelbrot and Van Ness studied its applications soon after. Since then, various forms of equations have been studied in different settings. For example, finite-dimensional equations have been studied by Besalú and Rovira [5], Unterberger [6], and Nguyen [7], and infinite-dimensional equations in a Hilbert space have been considered by Boufoussi and Hajji [8], Caraballo et al. [9], and Ahmed [10].
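Because fBm with Hurst index H ≠ 1/2 lacks independent increments, it cannot be simulated by summing i.i.d. Gaussian steps; instead a path can be sampled exactly from its covariance R_H(s, t) = ½(s^{2H} + t^{2H} − |t − s|^{2H}). The following is an illustrative numerical sketch only (the function name, grid size, and Hurst index are our own choices, not from the paper):

```python
import numpy as np

# Illustrative sketch: sample a fractional Brownian motion path exactly from
# its covariance R_H(s, t) = 0.5 * (s^{2H} + t^{2H} - |t - s|^{2H}) via a
# Cholesky factorization.  Grid size n, Hurst index H, and the horizon [0, 1]
# are illustrative choices.

def fbm_cholesky(n=200, H=0.7, seed=0):
    """Return grid points t_1..t_n and one fBm sample path (B^H(t_i))_i."""
    t = np.linspace(1.0 / n, 1.0, n)             # avoid t = 0 (degenerate row)
    s, u = np.meshgrid(t, t, indexing="ij")
    cov = 0.5 * (s**(2 * H) + u**(2 * H) - np.abs(s - u)**(2 * H))
    L = np.linalg.cholesky(cov)                  # cov is positive definite
    rng = np.random.default_rng(seed)
    return t, L @ rng.standard_normal(n)

t, path = fbm_cholesky()
print(path.shape)
```

For H = 1/2 the covariance reduces to min(s, t) and the construction recovers standard Brownian motion; for H ≠ 1/2 the increments are correlated, which is why an independent-increment scheme does not apply.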

Mathematical Problems in Engineering
One of the basic qualitative behaviors of a dynamical system is controllability, first studied by Kalman [11] in 1963. It means that a dynamical control system can be steered from an arbitrary initial state to an arbitrary final state using the set of admissible controls. Many researchers have since paid close attention to the controllability of dynamical systems, and many different methods exist for treating controllability problems for various types of nonlinear stochastic systems. Subalakshmi and Balachandran [12] studied the approximate controllability of nonlinear stochastic impulsive systems in Hilbert spaces by using Nussbaum's fixed point theorem. In [13], sufficient conditions for stochastic controllability are formulated by a stochastic Lyapunov-like approach. Balachandran et al. [14] investigated the controllability of semilinear stochastic integrodifferential systems by Picard-type iteration. By using the contraction mapping principle, Mahmudov and Zorlu [15] studied the controllability of nonlinear stochastic systems. Moreover, some researchers have discussed the controllability of stochastic systems driven by fractional Brownian motion; see, for example, [10,16].
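In the finite-dimensional deterministic setting, Kalman's criterion states that x'(t) = Ax(t) + Bu(t) is completely controllable exactly when the controllability matrix [B, AB, ..., A^(n-1)B] has full rank n. A minimal sketch (the example matrices are our own illustrative choice, not from the paper):

```python
import numpy as np

# Illustrative sketch: Kalman's rank criterion for the finite-dimensional
# deterministic system x'(t) = A x(t) + B u(t).

def kalman_rank(A, B):
    """Rank of the controllability matrix [B, AB, ..., A^(n-1) B]."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.linalg.matrix_rank(np.hstack(blocks))

A = np.array([[0.0, 1.0], [0.0, 0.0]])   # double integrator: position, velocity
B = np.array([[0.0], [1.0]])             # control enters through the velocity
print(kalman_rank(A, B))                 # 2 = full rank, so (A, B) is controllable
```

Steering through a single channel suffices here because the velocity drives the position; dropping that coupling (e.g., B = [1, 0]^T with a diagonal A) leaves the rank deficient and the system uncontrollable.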
However, the above authors consider only the case in which the operator A is the infinitesimal generator of a strongly continuous semigroup, whereas in our system (1) the family A(t) generates an evolution system U(t, s) on a Hilbert space H. When A(t) reduces to a single constant operator A, the works [7,8] are recovered as special cases. We assume only that the associated linear system is completely controllable. By using the Cauchy-Schwarz inequality, the Banach fixed point theorem, and related tools, we prove that the nonlinear system is completely controllable.
The rest of this paper is organized as follows. In the next section, we introduce some useful preliminaries. In Section 3, sufficient conditions are established to guarantee the existence and uniqueness of mild solutions of system (1). In Section 4, we study the complete controllability of nonlinear impulsive stochastic evolution systems. Finally, we present an example to illustrate our main results.

Preliminaries
Now we introduce some basic definitions, preliminaries, and notations which are used throughout this paper.
Then, for T ≥ 0, the stochastic integral of φ with respect to the fBm B^H is defined as ∫_0^T φ(s) dB^H(s) := ∫_0^T (K_H^* φ)(s) dW(s), where W is a Wiener process and K_H^* is the transfer operator associated with the covariance kernel of B^H.
Notice that if the above condition holds, then in particular (15) holds, which follows immediately from (13).
The following lemma is obtained as a simple application of Lemma 2.
In the following, let us give some basic properties of the operator A(t).
Let {A(t) : t ∈ J} be a family of linear operators satisfying the following: (1) The domain D(A(t)) = D of A(t) is dense in H and independent of t, and A(t) is a closed linear operator.
(3) For t, s, τ ∈ J, there exist constants C > 0 and 0 < α ≤ 1 such that ‖(A(t) − A(s))A(τ)⁻¹‖ ≤ C|t − s|^α. To establish the framework for our main controllability results, we introduce the following definitions.
Definition 5 (see [4]). A two-parameter family of bounded linear operators U(t, s), 0 ≤ s ≤ t ≤ T, on H is called an evolution system if the following two conditions are satisfied: (i) U(s, s) = I and U(t, r)U(r, s) = U(t, s) for 0 ≤ s ≤ r ≤ t ≤ T; (ii) the map (t, s) ↦ U(t, s) is strongly continuous for 0 ≤ s ≤ t ≤ T. The following lemmas are of great importance in the proof of our main result.
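The two defining properties of an evolution system can be checked numerically in a simple case. For the illustrative (not from the paper) choice A(t) = (1 + t)A₀ with the fixed rotation generator A₀ = [[0, 1], [−1, 0]], the operators commute and U(t, s) = exp(θ(t, s) A₀) with θ(t, s) = ∫_s^t (1 + r) dr, i.e. a rotation by angle θ(t, s):

```python
import numpy as np

# Illustrative sketch: for A(t) = (1 + t) * A0 with A0 = [[0, 1], [-1, 0]],
# the evolution system is U(t, s) = exp(theta(t, s) * A0), a rotation by
# theta(t, s) = ∫_s^t (1 + r) dr.  Since A0^2 = -I, the exponential has the
# closed form cos(theta) * I + sin(theta) * A0.

def U(t, s):
    theta = (t + t**2 / 2) - (s + s**2 / 2)   # ∫_s^t (1 + r) dr
    c, si = np.cos(theta), np.sin(theta)
    return np.array([[c, si], [-si, c]])      # exp(theta * A0)

# (i) U(s, s) = I;  (ii) cocycle identity U(t, r) U(r, s) = U(t, s)
print(np.allclose(U(0.3, 0.3), np.eye(2)))
print(np.allclose(U(0.9, 0.5) @ U(0.5, 0.2), U(0.9, 0.2)))
```

The cocycle identity holds here because the exponents θ(t, r) + θ(r, s) = θ(t, s) add and all the matrices commute; for a genuinely non-commuting family A(t) no such closed form exists, which is what the evolution-system framework handles.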

Existence Result
In this section, we will give the existence results for system (1), under the conditions assumed in the sequel. By using the Banach contraction mapping principle, we will show that the operator Φ has a unique fixed point. To prove this, we divide the proof into two steps. Step 1. For any x ∈ L^2(Ω, Γ, H), let us show that t ↦ (Φx)(t) is continuous on J in the L^2(Ω, Γ, H)-sense. Let 0 < t < t + h < T, where t, t + h ∈ J \ {t_1, t_2, ..., t_m}, and let h > 0 be sufficiently small. Then we obtain estimate (27). By the strong continuity of U(t, s) and Lebesgue's dominated convergence theorem, the right-hand side of (27) tends to 0 as h → 0. Hence, (Φx)(t) is continuous on J in the L^2(Ω, Γ, H)-sense.
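The Banach contraction mapping principle used here guarantees that a contraction on a complete metric space has a unique fixed point, which is the limit of simple iteration. A toy scalar sketch (the map cos, a contraction on [0, 1], and the starting point and tolerance are our own illustrative choices):

```python
import numpy as np

# Illustrative sketch of the Banach contraction mapping principle: iterate
# x_{k+1} = T(x_k); for a contraction the iterates converge to the unique
# fixed point.  T = cos is a contraction on [0, 1] since |sin| <= sin(1) < 1.

def fixed_point(T, x0, tol=1e-12, max_iter=10_000):
    """Iterate x_{k+1} = T(x_k) until successive iterates are within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = T(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("iteration did not converge")

x_star = fixed_point(np.cos, 0.5)
print(abs(np.cos(x_star) - x_star) < 1e-10)   # True: x_star solves cos(x) = x
```

In the paper the same principle is applied to a solution operator on a closed subset of the function space C(J, L^2(Ω, Γ, H)) rather than to a scalar map, but the mechanism is identical.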
Definition 9. System (1) is said to be completely controllable on the interval J if the reachable set at time T coincides with L^2(Ω, Γ_T, H); that is, all the points in L^2(Ω, Γ_T, H) can be exactly reached from arbitrary initial conditions h and x_0 at time T.
Theorem 10. Assume that hypotheses (1)-(5) hold. Then the impulsive stochastic system (1) is completely controllable on J, provided the corresponding contraction constant is less than 1. Proof. Fix T > 0 and let Q_T = C(J, L^2(Ω, Γ, H)) be the Banach space of all functions from J into L^2(Ω, Γ, H), endowed with the supremum norm ‖·‖_{Q_T}. Let us consider a suitable closed subset of Q_T, which is closed with respect to ‖·‖_{Q_T}. By condition (5), we choose the feedback control function so that the associated operator Φ has a fixed point on this set.
To prove this, we divide the subsequent argument into two steps.