Image jitters occur in the video of an autonomous robot moving on a brick road, and they reduce the precision of vision-based robot operation. To compensate for the image jitters, an affine transformation kinematics model was established to obtain the six image motion parameters. A feature point pair detection method was designed based on the eigenvalues of the feature window gradient matrix, and the motion parameter equations were solved with the least-squares method using the matching point pairs obtained from the optical flow. The condition number of the coefficient matrix was proposed to quantitatively analyze the effect of matching errors on the parameter solving errors. A Kalman filter was adopted to smooth the image motion parameters. Computed cases show that more point pairs yield more precise motion parameters. Integrated jitter compensation software was developed with feature point detection in subwindows, and practical experiments were conducted on two mobile robots. The results show that the compensation time is less than the frame sample time and that the Kalman filter is valid for robot vision jitter compensation.
Vision is one of the most important sensors of intelligent mobile robots. In a real environment, surface unevenness causes camera jitters that affect the precision of operation. Electronic video stabilization has been widely used in autonomous robot vision [
The rest of this paper is organized as follows. An image kinematics model is established, the feature point detection and matching methods are designed based on the gradient matrix eigenvalues and the optical flow, and the image motion parameter solving method is given in Section
Coordinates of the pixel
The feature window is defined as a
Using
This paper adopts
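As a concrete sketch of this detection criterion (a minimal NumPy implementation, assuming a grayscale image stored as a 2-D array; the window size is illustrative, not the paper's setting), the smaller eigenvalue of the 2×2 gradient matrix of each feature window can be computed in closed form:

```python
import numpy as np

def min_eigenvalue_map(img, win=3):
    """Minimum eigenvalue of the 2x2 gradient matrix G, summed over a
    (2*win+1) x (2*win+1) feature window, for every interior pixel."""
    img = img.astype(np.float64)
    # Central-difference image gradients (axis 0 = y, axis 1 = x)
    Iy, Ix = np.gradient(img)
    Ixx, Ixy, Iyy = Ix * Ix, Ix * Iy, Iy * Iy

    h, w = img.shape
    lam = np.zeros((h, w))
    for y in range(win, h - win):
        for x in range(win, w - win):
            # Sum the gradient products over the feature window -> G
            gxx = Ixx[y - win:y + win + 1, x - win:x + win + 1].sum()
            gxy = Ixy[y - win:y + win + 1, x - win:x + win + 1].sum()
            gyy = Iyy[y - win:y + win + 1, x - win:x + win + 1].sum()
            # Closed-form smaller eigenvalue of [[gxx, gxy], [gxy, gyy]]
            lam[y, x] = 0.5 * (gxx + gyy - np.hypot(gxx - gyy, 2 * gxy))
    return lam
```

Pixels whose smaller eigenvalue exceeds a threshold are accepted as feature window centers: a large minimum eigenvalue means the window has strong gradients in two directions (a corner), while edges and flat areas score near zero.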
When the robot is moving on a continuous surface, adjacent points have homothetic motions, constant brightness, and tiny motion over continuous time.
The frame constraint equation can be transformed using the Taylor formula [
We establish (
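The Taylor-expanded brightness-constancy constraint, Ix·u + Iy·v + It = 0, can be solved for one feature window by least squares over all its pixels. A minimal Lucas-Kanade-style sketch (assuming small displacements and frames given as NumPy arrays; this is an illustration, not the authors' exact implementation):

```python
import numpy as np

def lk_displacement(prev, curr, y, x, win=7):
    """Estimate the (dx, dy) displacement of the feature at (y, x) by a
    least-squares solution of Ix*dx + Iy*dy = -It over the window."""
    prev = prev.astype(np.float64)
    curr = curr.astype(np.float64)
    Iy, Ix = np.gradient(prev)          # spatial gradients
    It = curr - prev                    # temporal gradient
    sl = np.s_[y - win:y + win + 1, x - win:x + win + 1]
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    # A^T A is exactly the window's gradient matrix G, so windows selected
    # by the eigenvalue criterion give a well-conditioned system here.
    d, *_ = np.linalg.lstsq(A, b, rcond=None)
    return d                            # [dx, dy]
```

Running this per feature point in the previous frame yields the matched point pairs used to solve the motion parameter equations.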
The aim of kinematics parameters solving is to get the image motion parameters
When
Using
According to the coordinate relations of the matching feature points,
Due to the jitters, the motion parameters
In the motion compensation process, (
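A sketch of the least-squares solve for the six affine parameters from matched point pairs (the function and variable names are illustrative, not the paper's notation; the model is x' = a·x + b·y + c, y' = d·x + e·y + f):

```python
import numpy as np

def solve_affine(src, dst):
    """Least-squares estimate of the six affine motion parameters
    (a, b, c, d, e, f) mapping src points (x, y) to dst points (x', y').
    src, dst: (n, 2) arrays of matched feature points, n >= 3."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    n = len(src)
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src; A[0::2, 2] = 1.0   # rows for x' = a*x + b*y + c
    A[1::2, 3:5] = src; A[1::2, 5] = 1.0   # rows for y' = d*x + e*y + f
    b = dst.ravel()                         # interleaved x'0, y'0, x'1, ...
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p
```

With exactly 3 non-collinear pairs the 6×6 system is determined; with more pairs the least-squares solution averages out matching errors, which is why additional pairs improve precision.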
The motion parameters are the basis of compensation, so the parameter solving errors obviously affect the jitter compensation. For error analysis, we adopted the matrix condition number to quantitatively illustrate the effect of the feature point quantity on the parameter solving errors.
In mathematics, the condition number of a matrix is defined as the product of the norm of the matrix and the norm of its inverse; it expresses the sensitivity of matrix computations to errors.
The condition number of the matrix
The following analysis covers the two solution stages of the compensation: solving the image motion kinematics parameters using (
In view of (
First, 3 pairs of feature points,
Then a 1-pixel error in the vertical direction was deliberately added to the 3rd matching feature point, such as
Similarly, 30 pairs of feature points, including the above 3 pairs, were used for solving (
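The qualitative effect of the pair count can be reproduced with a small experiment in the same spirit (the point coordinates here are illustrative, so the numbers will not match the table, only the trend): a 1-pixel vertical error is injected into one match, and the resulting parameter error is compared for 3 versus 30 pairs, alongside the condition number of the coefficient matrix.

```python
import numpy as np

def design_matrix(src):
    """Stack the 2n x 6 coefficient matrix of the affine system."""
    n = len(src)
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src; A[0::2, 2] = 1.0
    A[1::2, 3:5] = src; A[1::2, 5] = 1.0
    return A

ideal = np.array([1, 0, 0, 0, 1, 3.0])       # pure 3-pixel vertical shift
pts3 = np.array([[0, 0], [100, 0], [0, 100]], float)
xg, yg = np.meshgrid([0, 25, 50, 75, 100], [0, 20, 40, 60, 80, 100])
pts30 = np.column_stack([xg.ravel(), yg.ravel()]).astype(float)

results = {}
for src in (pts3, pts30):
    A = design_matrix(src)
    b = (src + [0.0, 3.0]).ravel()
    b[1] += 1.0                              # 1-pixel vertical matching error
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    results[len(src)] = (np.linalg.cond(A), np.abs(p - ideal).max())

for n, (cond, err) in sorted(results.items()):
    print(f"{n:2d} pairs: cond(A) = {cond:9.1f}, max parameter error = {err:.4f}")
```

With 3 pairs the system is exactly determined, so the injected error propagates fully into the parameters; with 30 pairs the least-squares averaging damps it sharply.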
Kinematics parameter error analysis for 3 and 30 pairs of feature points.

| Feature point pairs | Condition number | Kinematics parameters | Max. error | Ideal kinematics parameters |
|---|---|---|---|---|
| 3 pairs | 1563.2 | (1, 0, 0, … | 2.0286 | (1, 0, 0, 0, 1, 3) |
| 30 pairs | 347.5 | (1, 0, 0, … | 0.2713 | (1, 0, 0, 0, 1, 3) |

The bold numbers refer to the parameters with errors.
In view of (
Corresponding point error analysis for 3 and 30 pairs of feature points.

| Feature point pairs | Condition number | Corresponding point | Max. error | Given point | Ideal corresponding point |
|---|---|---|---|---|---|
| 3 pairs | 13.5 | (102, … | 16.01 | (102, 300) | (102, 303) |
| 30 pairs | 12.6 | (102, … | 0.25 | (102, 300) | (102, 303) |

The bold numbers refer to the parameters with errors.
Table
Experiments were conducted on two autonomous mobile robots. Robot.1 (large) is the Voyager-IIA autonomous robot made in China; Robot.2 (small) is the X80-H robot made in Canada. Robot.1 carries many sensors, such as a vision camera, ultrasonic and infrared sensors, and a gyroscope. Robot.2 is equipped with wireless communication equipment. The physical experiment scene is shown in Figure
Two mobile robots used for experiments.
The two autonomous mobile robots are controlled by a personal computer (PC) through a wireless network. Autonomous navigation software on the PC controls the motion of the mobile robots, such as moving forward, turning back, speeding up, and slowing down. The CMOS camera is fixed on Robot.1 and connected to the PC by a USB cable, transferring real-time images to the PC.
The software was developed in Visual C++ 6.0 on the Windows XP operating system, running on an Intel Core 2 Duo 2 GHz CPU with 1 GB of RAM. The whole software consists of three parts: the control software of Robot.1, the control software of Robot.2, and the jitter compensation software. Video was captured using DirectShow. After image stabilization, the smoothed video is displayed on the PC screen. The compensation software procedure is illustrated in Figure
The compensation software.
Software procedure
Software interface
The video sampling frequency during robot motion is 20 Hz; that is, the time interval between adjacent frames is 50 ms. The total jitter compensation time was measured with the GetTickCount() (Windows API) and cvGetTickFrequency() (OpenCV) functions. The measured result is about 24 ms, much less than 50 ms, so the proposed jitter compensation algorithm meets the real-time requirement.
The feature point detection algorithm based on the gradient matrix eigenvalues tends to concentrate the detected feature points on a few objects. In order to distribute the feature points uniformly and accelerate detection, we divided the whole image into many nonoverlapping subwindows with
The comparisons of feature points detecting in whole window and subwindow.
Result of whole-window scan
Result of subwindow scan
Feature points may concentrate on some objects in Figure
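A minimal sketch of the per-subwindow selection (the cell size and threshold are illustrative): given a feature response map such as the minimum-eigenvalue map, keep at most the single strongest response in each nonoverlapping cell, so points spread over the whole image instead of clustering on a few textured objects.

```python
import numpy as np

def subwindow_features(response, cell=32, thresh=0.0):
    """Keep at most one feature point (the strongest response) per
    nonoverlapping cell x cell subwindow of the response map."""
    h, w = response.shape
    pts = []
    for y0 in range(0, h - cell + 1, cell):
        for x0 in range(0, w - cell + 1, cell):
            block = response[y0:y0 + cell, x0:x0 + cell]
            idx = np.unravel_index(np.argmax(block), block.shape)
            if block[idx] > thresh:                 # skip featureless cells
                pts.append((y0 + idx[0], x0 + idx[1]))
    return pts
```

Scanning each cell independently also accelerates detection: each subwindow can be processed on its own instead of ranking candidates over the whole frame.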
The two robots moved linearly forward, about 1.5 m apart. The road is paved with blocks of identical size: the blocks are 19 cm long and 9.4 cm wide, and the slots between blocks are 0.7 cm wide and 0.3 cm deep. Robot.2 moves in front while Robot.1 follows at a higher velocity, so Robot.1 continuously closes in on Robot.2. The jitter compensation test time is 16 s.
The process state variance of the Kalman filter has a key effect on how smoothly the intended-motion parameters are estimated, while the observation variance determines how much of the unintended jitter motion is tracked. If the observation variance is zero, there is no motion compensation effect at all. So the process state variance and the observation variance must be set according to the intended motion and the jitter motion quantities, respectively. The initial process state variance in the Kalman filter is
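A scalar random-walk Kalman filter illustrates the roles of the two variances (the values of q and r below are illustrative, not the paper's settings). Applying it independently to each of the six motion parameter sequences yields the smoothed intended motion; the difference between the observed and smoothed value at frame k is the jitter to compensate.

```python
import numpy as np

def kalman_smooth(z, q=1e-4, r=0.1):
    """Scalar Kalman filter over a parameter sequence z.
    q: process-state variance (how fast intended motion may change);
    r: observation variance (how much jitter the measurements contain).
    Returns the smoothed (intended-motion) sequence."""
    x, p = z[0], 1.0
    out = np.empty_like(np.asarray(z, float))
    out[0] = x
    for k in range(1, len(z)):
        p = p + q                     # predict (random-walk state model)
        kgain = p / (p + r)           # Kalman gain
        x = x + kgain * (z[k] - x)    # correct with measurement z[k]
        p = (1 - kgain) * p
        out[k] = x
    return out
```

A larger q lets the estimate follow the intended motion faster but passes more jitter through; a larger r smooths more aggressively, and r = 0 makes the filter copy the measurements, removing the compensation effect entirely.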
The filter results of matrix
Mean square error (MSE) comparisons before and after filtering are shown in Table
The MSE comparisons of the motion parameters before and after filtering.

| | | | | | | |
|---|---|---|---|---|---|---|
| Before | 0.001434 | 0.004646 | 0.003686 | 0.010688 | 1.525818 | 0.348722 |
| After | 0.000711 | 0.000649 | 0.000874 | 0.001435 | 0.348722 | 0.267184 |
Figure
The frame series before and after image stabilization are shown in Figure
Comparison before and after image stabilization.
Original series
Series after the image stabilization
Figure
Moving over the brick seams causes bidirectional video shake, so Figure
Based on the comparative analysis, the following conclusions can be drawn. The number of feature point pairs has a great effect on the parameter solving precision, and this effect can be quantitatively analyzed through the matrix condition number. Subwindow feature point detection avoids feature points gathering on a few objects. The visual jitter compensation algorithm based on optical flow and the Kalman filter, implemented on a PC with a USB camera, the Microsoft Windows operating system, and VC++, meets the precision and real-time requirements of robot vision.
However, the proposed method cannot compensate for the jitters that occur within the camera exposure time. Further study will focus on making the Kalman filter parameters adapt to different jitter amplitudes and frequencies.
The authors declare that there is no conflict of interest regarding the publication of this paper.
This research was funded by a Grant (no. LQ13E050004) from the Natural Science Foundation of Zhejiang province and a Grant (no. 201210076) from the Research Project of General Administration of Quality Supervision, Inspection and Quarantine of China.