This paper proposes an image-based visual servo (IBVS) controller for the 3D translational motion of quadrotor unmanned aerial vehicles (UAVs). The main purpose of this paper is to provide asymptotic stability for vision-based tracking control of the quadrotor in the presence of uncertainty in the dynamic model of the system. The paper also aims to use the optic flow of image features as velocity information, compensating for the unreliable linear velocity data obtained from accelerometers. For this purpose, the mathematical model of the quadrotor is formulated in terms of the optic flow of image features, which makes it possible to design a velocity-free IBVS controller while accounting for the dynamics of the robot. The image features are defined from a suitable combination of perspective image moments without using a model of the object, which allows the proposed controller to be applied in unknown environments. The controller is robust to uncertainties in the translational dynamics of the system associated with target motion, image depth, and external disturbances. Simulation results and a comparison study demonstrate the effectiveness of the proposed approach.

Potential applications of robotic systems have motivated researchers to design new models and develop robust controllers to improve their reliability. Among robotic systems, unmanned aerial vehicles (UAVs) have received great attention in the last decade. Research generally involves designing reliable controllers, developing efficient actuators, and using precise sensors. The sensory system for these vehicles typically includes a global positioning system (GPS) receiver and an inertial measurement unit (IMU). This sensory unit provides attitude and angular velocity information that is reliable enough for a control process. However, it is difficult to obtain linear velocity information suitable for a tracking task. In addition, GPS provides only coarse position information and is not reliable in indoor environments.

In the last decade, vision sensors have been utilized as complementary sensors to obtain the local position of robots and to estimate linear velocity. Vision has received great attention among researchers in the UAV area, and different applications have been developed, including estimation of the pose and motion of the vehicle [

Controlling UAVs using visual data started in the late 1990s. Two classic approaches are available for vision-based control of robots: position-based visual servoing (PBVS) and image-based visual servoing (IBVS). In the first approach, visual information is used to provide the robot with a 3D understanding of its workspace. The application of this method to aerial robots has been reported in several works, including [

For robots performing high-speed maneuvers, such as aerial robots, the dynamics of the robot should be considered in the design of the vision-based controller. Designing an IBVS controller for underactuated aerial vehicles is more complicated in this case. A solution is given in [

Another problem in designing an IBVS controller for aerial vehicles is the lack of precise linear velocity information. This information is especially important in tracking applications. To overcome this problem, [

In this paper, the authors present an asymptotically stable dynamic IBVS tracking controller for the 3D translational motion of a quadrotor helicopter. Perspective image moments are considered as image features, which are reconstructed on a virtual image plane to make the design of a dynamic IBVS controller possible. These features do not require any information about the model of the target, and the target can have arbitrary bounded motion. The mathematical model of the system is presented in terms of the optic flow of image features; therefore, it is possible to design a velocity-free IBVS controller. The robust integral of the sign of the error (RISE) method [

The paper is organized as follows. Mathematical equations of the quadrotor aerial vehicle are presented in Section

In this section, the aerial vehicle under study is first described, and then its mathematical model is presented.

Figure

Quadrotor helicopter.

The equations of motion of the quadrotor (with a camera attached to its center) are described by two coordinate frames: the inertial frame

The kinematics of the quadrotor can be expressed by
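The kinematic equations referred to above take a standard form for quadrotors. As a sketch (the symbols below are illustrative notation, not necessarily the paper's): let \(\xi\) be the inertial position, \(v\) the body-frame linear velocity, \(R\) the rotation matrix from the body-fixed frame to the inertial frame, \(\eta = (\phi, \theta, \psi)\) the roll-pitch-yaw Euler angles, and \(\Omega\) the body angular velocity. Then

```latex
\dot{\xi} = R\,v, \qquad
\dot{R} = R\,\mathrm{sk}(\Omega), \qquad
\dot{\eta} = W(\eta)\,\Omega,
```

where \(\mathrm{sk}(\cdot)\) denotes the skew-symmetric matrix satisfying \(\mathrm{sk}(a)b = a \times b\), and \(W(\eta)\) is the standard map from body angular rates to Euler-angle rates.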

On the other hand, the dynamics of a 6DOF rigid body in the body-fixed frame are given as follows [
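In the commonly cited Newton-Euler form, with \(m\) the mass, \(J\) the inertia matrix, \(g\) the gravitational acceleration, \(T\) the total thrust along the body \(z\)-axis, \(\tau\) the applied torques, and \(e_3 = (0,0,1)^{\top}\), these 6DOF body-frame dynamics read (again as an illustrative sketch of the standard model, not the paper's exact equations):

```latex
m\dot{v} = -m\,\mathrm{sk}(\Omega)\,v + m g\,R^{\top} e_3 - T e_3,
\qquad
J\dot{\Omega} = -\mathrm{sk}(\Omega)\,J\Omega + \tau .
```

The underactuation is visible here: only the scalar thrust \(T\) acts on the translational dynamics, so lateral motion must be produced indirectly through attitude.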

There are two schemes to design a visual servo controller for the quadrotor. In [

Spherical and perspective projections are usually used for vision-based control of aerial vehicles. The authors have proposed a method in [
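A common construction of perspective-image-moment features in the dynamic-IBVS literature combines the zero-order moment (area) and the centroid of the target points into a depth-normalized feature vector. The sketch below illustrates that construction; the function name, the use of the polygon (shoelace) area as the moment \(m_{00}\), and the exact normalization are assumptions for illustration, not the paper's definitions.

```python
import numpy as np

def image_moment_features(points, desired_area):
    """Compute perspective-image-moment features from target points.

    points: sequence of (x, y) image-plane coordinates of the target
            vertices, given in order around the polygon.
    desired_area: polygon area of the target at the desired pose.
    Returns (qx, qy, qz): qz encodes relative depth, while qx and qy
    are depth-normalized centroid coordinates for lateral centering.
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    # Polygon (shoelace) area, used as a surrogate for the moment m00.
    x2, y2 = np.roll(x, -1), np.roll(y, -1)
    area = 0.5 * abs(np.sum(x * y2 - x2 * y))
    # Centroid of the vertex set (first-order moments / m00).
    xg, yg = x.mean(), y.mean()
    qz = np.sqrt(desired_area / area)   # ~ Z / Z*: relative depth
    qx, qy = qz * xg, qz * yg           # depth-normalized centroid
    return qx, qy, qz
```

Because the features depend only on the measured image points, no geometric model of the target is needed, consistent with the model-free property claimed in the paper.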

In this section, a dynamic tracking IBVS controller is presented for the 3D translational motion of the quadrotor helicopter. Using a smooth input, the controller provides an asymptotic stability property and it is robust against the image depth

In the design procedure, it is assumed that the camera frame (attached to the center of projection of the camera) is coincident with the quadrotor body-fixed frame,
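A common way to build the virtual image plane in dynamic-IBVS schemes is to re-project each measured image point using the roll \(\phi\) and pitch \(\theta\) supplied by the IMU, so that the virtual plane remains parallel to the ground regardless of the vehicle's tilt. As a sketch of that construction (not necessarily the paper's exact formula):

```latex
\begin{bmatrix} x_v \\ y_v \\ 1 \end{bmatrix}
\;\propto\;
R_{\theta}\,R_{\phi}
\begin{bmatrix} x \\ y \\ 1 \end{bmatrix},
```

where \((x, y)\) are the normalized coordinates of a target point on the real image plane, \(R_{\phi}\) and \(R_{\theta}\) are the roll and pitch rotations, and the result is rescaled so that its third component equals one.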

To consider the translational dynamics of the robot in designing the IBVS controller, these dynamics should also be written in the virtual frame. Then one has

To regulate the dynamics of the system defined by (

Now, by computing the time derivative of (

The image depth

The independent controller for the yaw angle ensures that

Since the closed-loop system is analyzed through a differential inclusion framework, the following definition is presented.

A vector function

To facilitate the subsequent analysis, filtered tracking errors denoted by
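In the standard RISE framework that this design builds on, the filtered tracking errors are typically chained as follows; the symbols \(e_1\), \(e_2\), \(r\), and the gains \(\alpha_1\), \(\alpha_2\), \(k_s\), \(\beta\) below are illustrative notation rather than the paper's exact definitions:

```latex
% Filtered tracking errors (r is not measurable, since it contains
% the acceleration-level error):
e_2 = \dot{e}_1 + \alpha_1 e_1, \qquad
r = \dot{e}_2 + \alpha_2 e_2 .
% The RISE feedback uses only e_1 and e_2, yet yields a smooth
% (integrated-signum) control input:
u(t) = (k_s + 1)\,e_2(t) - (k_s + 1)\,e_2(0)
     + \int_0^{t} \bigl[(k_s + 1)\,\alpha_2\, e_2(\sigma)
     + \beta\,\mathrm{sgn}\bigl(e_2(\sigma)\bigr)\bigr]\, d\sigma .
```

Because the signum term appears under the integral, the resulting input is continuous, which is what allows the controller to reject bounded disturbances with a smooth control effort.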

Now, the open-loop tracking error system can be developed by premultiplying (

For open-loop dynamics (

Before presenting the main result of this paper, the following lemma is presented with the proof given in [

Let the auxiliary function

Now, the following theorem states the stability result of the proposed controller.

Consider the system dynamics defined by (

Let the auxiliary function

Now, the following Lyapunov function is considered to prove the theorem:

The inequalities in (

The controller (

The controller input

This section provides MATLAB simulations to validate the effectiveness of the developed vision-based controller. In the simulations, the sampling periods for the visual data and for the rest of the system are selected as 20 ms and 10 ms, respectively. The quadrotor is initially considered to be in hover with the target in its field of view. The target is assumed to be rectangular, and its vertices are considered as the available visual information to measure the image features (
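The two sampling periods imply a multirate loop: the controller updates every 10 ms, while the image features are refreshed only every 20 ms and held constant (zero-order hold) between camera frames. The sketch below illustrates that timing structure; `measure_features` and `control_law` are hypothetical placeholders, not the paper's actual simulation code.

```python
import numpy as np

def measure_features(t):
    # Placeholder: in the real simulation this would project the
    # target vertices into the image and compute the moment features.
    return np.array([np.sin(t), np.cos(t), 1.0])

def control_law(features):
    # Placeholder proportional action on the feature error.
    return -1.0 * features

def run_multirate_sim(t_end=2.0, dt_ctrl=0.010, dt_vision=0.020):
    """Run the multirate loop: control every dt_ctrl seconds, vision
    refreshed every dt_vision seconds with zero-order hold in between.
    Returns the number of vision updates performed."""
    n_steps = int(round(t_end / dt_ctrl))
    vision_every = int(round(dt_vision / dt_ctrl))  # = 2 control steps
    features = np.zeros(3)    # last measured features (held between frames)
    vision_updates = 0
    for k in range(n_steps):
        t = k * dt_ctrl
        if k % vision_every == 0:
            features = measure_features(t)  # new camera frame
            vision_updates += 1
        u = control_law(features)           # runs at the fast rate
    return vision_updates
```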

The parameters of the dynamic model of the quadrotor are selected as

The inverse dynamics of the quadrotor [

In the first simulation, the quadrotor’s initial position is assumed to be at

Trajectories of the target points in the virtual image plane are shown in Figure

Simulation 1: trajectories of the target points in the virtual image plane.

Simulation 1: time evolution of the norm of the error signals.

Simulation 1: time evolution of the robot position.

Simulation 1: 3D illustration of the trajectory of the motion of the quadrotor and the moving target.

To show the advantage of the proposed vision-based controller over previous methods, in this section the results are compared with those of the method proposed in [

Norm of the input vector for this work and [

This paper has proposed an IBVS controller for the translational motion of a quadrotor helicopter flying over a moving target. The main purpose of this paper is to decrease the final tracking error of the system in the presence of uncertainty in the model of the system. The controller utilizes the RISE method to achieve a smooth control effort. To compensate for the unreliability of accelerometers in a tracking task, the dynamics of the system are derived based on the optic flow of image features. The optic flow can be obtained from the flow of the target points in the image sequence, or simply computed by numerical differentiation when the visual data are available at a high rate. The proposed controller is robust against parametric and nonparametric uncertainties in the dynamic model of the system. These uncertainties are associated with the motion of the target, the unknown depth information of the image, and unmodelled terms in the translational dynamics. The stability analysis proves that the controller achieves asymptotic tracking performance. Simulation results demonstrate the satisfactory response of the proposed vision-based approach and its advantage over classic methods.
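The numerical-differentiation route mentioned above amounts to a finite difference of the feature vector between consecutive frames. A minimal sketch, assuming features sampled every `dt` seconds (the function name is illustrative):

```python
import numpy as np

def feature_flow(s_prev, s_curr, dt):
    """Estimate the optic flow of the image features by a backward
    finite difference between two consecutive frames dt seconds apart."""
    return (np.asarray(s_curr, dtype=float)
            - np.asarray(s_prev, dtype=float)) / dt
```

At a 20 ms frame period this simple estimate is usable, though in practice it amplifies measurement noise, which is exactly the robustness issue the authors flag as future work.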

The future work of this research is devoted to improving the robustness of the system in the presence of uncertainty in the optic flow measurements.

The authors declare that there are no conflicts of interest regarding the publication of this paper.