Heterogeneous sensor fusion of a camera and a laser-rangefinder can greatly improve environment perception, and its primary problem is the calibration between depth scans and image information. First, the mapping relationship among the world coordinate system, the camera coordinate system, and the image plane is discussed, and the calibration of the camera intrinsic parameters is achieved. Next, a separated calibration of the intrinsic and extrinsic parameters is presented for a camera and a laser-rangefinder, and feature identification is performed with two calibration boards intersecting at a certain angle for fusion feature extraction. Furthermore, particle swarm optimization is proposed for the extrinsic parameter estimation under different objectives, with Gaussian elimination used to initialize the particle swarm. Simulation and real experimental results show that the standard deviation of the calibration error over the 21-group experiments is decreased by 10.175%, which demonstrates the accuracy and effectiveness of our approach.
1. Introduction
The ability to perceive the environment is an important functionality for a heterogeneous multisensor system [1]. Such a system usually combines cameras and laser-rangefinders so that each sensor compensates for the drawbacks of the other, making the system more reliable. Among these sensors, the camera has been the most popular choice for recognizing objects, but it is sensitive to light and weather and is limited in acquiring depth information, whereas the laser-rangefinder provides more accurate depth information [2] and is also widely used as the main sensor for autonomous navigation [3]. Since a laser-rangefinder and a camera capture the surroundings in complementary ways, combining them improves the environment perception ability of the multisensor system. However, one of the major problems in heterogeneous sensor fusion is matching the data from the different sensors [4]: fusing the information of the two sensors requires knowing the relative pose between the camera and the laser-rangefinder in advance. This paper therefore addresses the separated calibration of the intrinsic and extrinsic parameters, and then estimates the extrinsic parameters by particle swarm optimization (PSO) to decrease the calibration error.
The number of published works on the separated calibration of the intrinsic and extrinsic parameters of a camera and a laser-rangefinder is relatively small, and adopting a particle swarm algorithm to optimize the extrinsic parameters, as in this paper, is also novel. The most classic calibration method was proposed by Zhang and Pless [5]; it describes a practical procedure in which a checkerboard pattern is freely moved in front of the two sensors. As one of the most successful and valid algorithms, part of Zhang's algorithm is used for the intrinsic parameter calibration in this paper. As for the extrinsic calibration, it has been achieved by freely moving a checkerboard pattern to obtain plane poses in camera coordinates and depth information in the LRF reference frame [6]. An external parameter calibration method for multiple cameras with nonoverlapping fields of view using a laser-rangefinder (LRF) was presented in [7], and a modular approach extensively tested during VIAC, which offered a unique chance to weigh the pros and cons of different calibration procedures, was described in [8]. Furthermore, the original nonlinear calibration model of multiple LIDARs on a mobile vehicle platform was reformulated as a second-order cone program (SOCP) in [9], while the nonlinear distortion of the camera was considered and the calibration parameters were determined with least squared error in [10]. In these approaches, however, the intrinsic and extrinsic parameters are mixed into a single parameter matrix and calibrated at the same time, which enlarges the solution space and increases the parameter estimation errors. Therefore, the camera's intrinsic and extrinsic parameters were separated in [11] to improve the calibration accuracy, with nonlinear least squares and the nonlinear Gauss-Newton method used to optimize the parameters.
However, the performance of parameter optimization is limited by the algorithm. A laser-rangefinder calibration method using a genetic algorithm was proposed in [12] to overcome the problem that conventional camera calibration methods cannot correct the misalignment of the rangefinder; its fitness function estimated the difference between the actual and the calculated image outputs. In this paper, we propose particle swarm optimization (PSO) to optimize the extrinsic parameters for the heterogeneous calibration of a laser-rangefinder and a camera. Gaussian elimination is utilized to initialize the particle swarm, and the fitness function is designed to decrease the calibration error. As a result, the method is not only effective in calibrating the separated parameters based on PSO but also decreases the calibration error.
The rest of the paper is organized as follows. Section 2 describes the coordinate transformations from the laser-rangefinder to the optical image plane. Section 3 introduces the calibration of the intrinsic parameters and the separated calibration method for the extrinsic parameters. The extrinsic parameter estimation and optimization based on PSO are designed in Section 4. Section 5 presents a comparative experimental analysis of the laser-rangefinder and camera fusion calibration based on PSO. Finally, Section 6 concludes the paper.
2. Coordinate Transformations from Laser-Rangefinder to Optical Image Plane
The hardware of the information fusion platform mainly consists of a laser-rangefinder and a camera. The laser-rangefinder is a SICK LMS291 [13], a noncontact measurement system that scans its surroundings two-dimensionally. We select a horizontal field of view of 180° with an angular resolution of 1°, and the transmission rate is set to 500 Kbps. Each scan of the laser-rangefinder is therefore a 1 × 181 array, and a median filter is applied to remove noisy data. The camera is an FFMV-03MTC-CS with a resolution of 640 × 480, connected over IEEE 1394, since the 1394 bus satisfies the real-time requirement.
The observations captured by the laser-rangefinder are distance measurements of a depth plane in space, while the camera collects optical information according to the optical principle. Because of this heterogeneity of data acquisition, the camera images and the laser-rangefinder data are heterogeneous, so it is essential to integrate them into the same coordinate system for information fusion. Two main steps map the laser-rangefinder information onto the optical image plane through coordinate transformation [14]. First, a relatively accurate transformation matrix is obtained from the ideal physical model, which maps the data captured by the laser-rangefinder into the optical image coordinate system. Second, since the image coordinates of each pixel are discrete, a gray-scale interpolation is applied after each spatial coordinate mapping so that the transformed points fall on exact pixel positions even when the mapped coordinates are not integers.
2.1. Spatial Coordinate Transformation
Let f(x_t, y_t, z_t) be the laser-rangefinder coordinate system, g(u, v) the coordinates on the optical image plane, and h(x_c, y_c, z_c) the coordinates of the target object in the world coordinate system. The most widely used camera model is the pinhole model [15], whose equation is
(1)
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \frac{f}{z_c}\begin{bmatrix} k_u & 0 & \dfrac{u_0}{f} \\ 0 & k_v & \dfrac{v_0}{f} \\ 0 & 0 & \dfrac{1}{f} \end{bmatrix}\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \frac{1}{z_c}\begin{bmatrix} f_u & 0 & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix},$$
where u, v are the coordinates of the target point in the image plane and x_c, y_c, z_c are the world coordinates of the object (also regarded here as a point in the camera coordinate system). k_u, k_v are the scale factors along the pixel coordinate axes, defined as the ratios of the physical distances between adjacent pixels in the horizontal and vertical directions, u_0, v_0 are the pixel coordinates of the image center, and f is the focal length. The orthonormal rotation matrix [w] = f(θ_1, θ_2, θ_3) and the translation vector [T] = (t_1, t_2, t_3)^T transform the laser-rangefinder depth information into the world (or camera) coordinate system; the conversion expression (2) follows from the coordinate transformation rules:
(2)
$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} w_{11} & w_{12} & w_{13} \\ w_{21} & w_{22} & w_{23} \\ w_{31} & w_{32} & w_{33} \end{bmatrix}\begin{bmatrix} x_t \\ y_t \\ z_t \end{bmatrix} + \begin{bmatrix} t_1 \\ t_2 \\ t_3 \end{bmatrix}.$$
So the laser depth coordinates are transformed into the optical image coordinates, and we obtain

(3)
$$\begin{bmatrix} z_c u \\ z_c v \\ z_c \end{bmatrix} = \begin{bmatrix} f_u & 0 & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} w_{11} & w_{12} & w_{13} \\ w_{21} & w_{22} & w_{23} \\ w_{31} & w_{32} & w_{33} \end{bmatrix}\begin{bmatrix} x_t \\ y_t \\ z_t \end{bmatrix} + \begin{bmatrix} f_u & 0 & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} t_1 \\ t_2 \\ t_3 \end{bmatrix}.$$
In (3), the depth of the object is z_c = w_31 x_t + w_32 y_t + w_33 z_t + t_3, so the laser-rangefinder points are mapped onto the optical image by dividing the first two components by z_c.
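As a minimal sketch of the chain (1)-(3), the projection of one laser-rangefinder point can be written as follows; the intrinsic matrix and the pose below are placeholders for illustration only, not the calibrated values of Section 3.

```python
import numpy as np

def project_laser_point(p_t, K, R, T):
    """Map a laser-rangefinder point p_t = (xt, yt, zt) into pixel
    coordinates via p_c = R @ p_t + T (eq. (2)) and the pinhole model."""
    p_c = R @ np.asarray(p_t, dtype=float) + T  # laser frame -> camera frame
    z_c = p_c[2]                                # depth along the optical axis
    uvw = K @ p_c                               # eq. (1): homogeneous pixel coords
    return uvw[:2] / z_c                        # (u, v) after dividing by z_c

# Placeholder intrinsics and pose (illustrative assumptions).
K = np.array([[750.0,   0.0, 320.0],
              [  0.0, 750.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                    # laser and camera axes aligned
T = np.array([0.0, -0.1, 0.0])   # camera mounted 10 cm above the rangefinder

u, v = project_laser_point((1.0, 0.5, 2.0), K, R, T)
```

With identity rotation the computation reduces to the familiar u = f_u x_c / z_c + u_0, v = f_v y_c / z_c + v_0.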
2.2. Gray-Scale Interpolation for Information Fusion
Since the coordinates of each pixel are discrete, a transformed point may fall on a noninteger pixel position, so interpolation is needed after each coordinate transformation to place it on an exact pixel. We use the nearest-neighbor interpolation method to realize the gray-scale interpolation, as depicted in Figure 1. If the transformed laser-rangefinder coordinates are not integers, the four pixel coordinates surrounding the mapped point are captured first; the distances between the mapped point and these four surrounding pixels are then computed; finally, the mapped point is replaced by the pixel coordinate with the minimum distance.
Flowchart of the nearest-neighbor interpolation method.
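The selection step above can be sketched as follows; choosing the minimum-distance neighbor among the four surrounding pixels is equivalent to rounding each coordinate.

```python
import math

def nearest_pixel(u, v):
    """Snap a noninteger mapped coordinate (u, v) to the closest of the
    four surrounding integer pixels, as in Figure 1."""
    candidates = [(math.floor(u) + du, math.floor(v) + dv)
                  for du in (0, 1) for dv in (0, 1)]
    return min(candidates, key=lambda p: (p[0] - u) ** 2 + (p[1] - v) ** 2)

print(nearest_pixel(12.3, 45.8))  # -> (12, 46)
```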
2.3. The Calibration Parameters Analysis
The camera parameters are classified into intrinsic and extrinsic parameters. The intrinsic parameters are determined by the inherent characteristics of the camera and do not change for the same camera: as long as the focal length and the mechanical structure remain fixed, so do the intrinsic parameters. The extrinsic parameters, by contrast, represent the pose and orientation of the camera in the world coordinate system and can be denoted by an orthonormal rotation matrix and a translation vector. It is therefore necessary to measure the intrinsic and extrinsic parameters of the camera separately, especially when the position and orientation of the target objects are to be recovered from the optical image into spatial coordinates; this process is the calibration of the camera and the laser-rangefinder. Both sets of parameters are indispensable for data fusion [16].
In Figure 2, (X_c, Y_c, Z_c) are the coordinates of the point P_c on the scene plane in the world system. The line between P_c and the camera center intersects the image plane at the ideal image point P_A, whose coordinates are (X_A, Y_A, f_A), while (X_D, Y_D, Z_D) are the coordinates of the actual image point P_D corresponding to P_A in the camera image plane. The rotation matrix R_D and the translation vector T_D describe the coordinate transformation [17] as
(4)
$$\begin{bmatrix} X_D \\ Y_D \\ Z_D \\ 1 \end{bmatrix} = \begin{bmatrix} R_D & T_D \\ 0 & 1 \end{bmatrix}\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = D\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}.$$
Here D is called the camera extrinsic parameter matrix, determined by the pose and orientation of the camera in the world coordinate system. Because of optical distortion, changes of focal length, and the offset of the optical axis, P_A and P_D do not coincide, so a transformation matrix is needed to express their relative position. Let d_xA, d_yA be the physical distances between two adjacent pixels of the optical image in the x-axis and y-axis directions, let (u_A, v_A) be the coordinates on the optical image, and let (u_0, v_0) be the coordinates at which the optical axis intersects the image plane, with each pixel as a unit. From the principle of pinhole imaging we obtain
(5)
$$X_A = \frac{f_A X_D}{Z_D},\qquad Y_A = \frac{f_A Y_D}{Z_D},\qquad u_A = \frac{X_A}{d_{xA}} + u_0,\qquad v_A = \frac{Y_A}{d_{yA}} + v_0.$$
Further, the mapping from the camera coordinate system to the optical image coordinate system follows as
(6)
$$Z_D\begin{bmatrix} u_A \\ v_A \\ 1 \end{bmatrix} = \begin{bmatrix} \dfrac{f_A}{d_{xA}} & 0 & u_0 & 0 \\ 0 & \dfrac{f_A}{d_{yA}} & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}\begin{bmatrix} X_D \\ Y_D \\ Z_D \\ 1 \end{bmatrix} = M_A\begin{bmatrix} X_D \\ Y_D \\ Z_D \\ 1 \end{bmatrix},$$
where MA represents the camera’s intrinsic parameters matrix, which is determined by its inherent characteristics. Consequently, the relationship between the world coordinate system and the optical image coordinate system can be expressed as (7) according to (4) and (6):
(7)
$$Z_D\,(u_A, v_A, 1)^T = M_A \cdot D \cdot (X_C, Y_C, Z_C, 1)^T.$$
Here, D is the camera extrinsic parameters matrix, MA is the camera’s intrinsic parameters matrix, (uA,vA) are the coordinates on the optical image, (XC,YC,ZC) are the coordinates of Pc, and ZD is the coordinate of PD as shown in (4).
A camera spatial model for calibration.
3. The Calibration and Analysis of Camera Intrinsic Parameters
The camera calibration algorithm of [5] is utilized to estimate the intrinsic parameters, which are then substituted into the subsequent spatial coordinate transformation as known parameters for heterogeneous data fusion. Part of Zhang's algorithm is used for the camera intrinsic calibration, together with Jean-Yves Bouguet's "Camera Calibration Toolbox for Matlab" [18]. First, a planar checkerboard with a grid side of 30 mm is used as the calibration board. Second, calibration images are collected from multiple angles, so there is no need to fix positions and orientations for the intrinsic calibration. Third, starting from the top left corner and proceeding clockwise, initial values of each corner point are set through the ratio of the selection box, as shown in Figure 3; the side length of each small grid (here 30 mm) must be supplied before the calculation, and the corner search range is set to five pixels. Fourth, the corner information is input into Zhang's calibration tool to obtain the intrinsic parameters of the camera; the toolbox [18] can also output the extrinsic parameters of the camera in three-dimensional coordinates. After calculation, we obtain the intrinsic parameters as
(8)
$$KK = \begin{bmatrix} f_u & 0 & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 755.49 & 0 & 353.05 \\ 0 & 754.57 & 246.65 \\ 0 & 0 & 1 \end{bmatrix}.$$
4. The Extrinsic Parameters Estimation and Optimization Based on PSO

The calibration of a camera and a laser-rangefinder can be considered an optimization problem, which minimizes the distance between the features of the camera-measured objects and their actual positions. After calibration, any point of the world coordinate system can be connected to the optical center by a straight line that intersects the image plane, and this intersection locates the point's precise coordinates on the optical image. This is crucial for the data fusion of the laser-rangefinder and the camera, since it closely affects the fusion precision and efficiency. Expanding (3) yields the coordinate transform formula (9):

(9)
$$\begin{bmatrix} z_c u \\ z_c v \\ z_c \end{bmatrix} = \begin{bmatrix} (f_u w_{11}+u_0 w_{31})x_t + (f_u w_{12}+u_0 w_{32})y_t + (f_u w_{13}+u_0 w_{33})z_t + f_u t_1 + u_0 t_3 \\ (f_v w_{21}+v_0 w_{31})x_t + (f_v w_{22}+v_0 w_{32})y_t + (f_v w_{23}+v_0 w_{33})z_t + f_v t_2 + v_0 t_3 \\ w_{31}x_t + w_{32}y_t + w_{33}z_t + t_3 \end{bmatrix}.$$
Here the laser-rangefinder data are given in polar coordinates. On the fusion platform the camera is fixed on top of the laser-rangefinder, and the Cartesian coordinates of the laser-rangefinder points are
(10)
$$x_t = \rho_t\cos\theta,\qquad y_t = \rho_t\sin\theta,\qquad z_t = 0,$$
where ρ_t is the measured distance and θ the scan angle of the laser-rangefinder. The rotation matrix [w] is represented as a combination of rotation amounts about the y-axis and the x-axis, so w_31 = w_32 = w_33 = 0, and the coordinate transform formula (9) can be written in the form
(11)
$$\begin{bmatrix} z_c u \\ z_c v \\ z_c \end{bmatrix} = \begin{bmatrix} a_0 x_t + b_0 y_t + c_0 \\ a_1 x_t + b_1 y_t + c_1 \\ a_2 x_t + b_2 y_t + c_3 \end{bmatrix}.$$
In (11), a_0 = f_u w_11, b_0 = f_u w_12, c_0 = f_u t_1 + u_0 t_3, a_1 = f_v w_21, b_1 = f_v w_22, c_1 = f_v t_2 + v_0 t_3, a_2 = b_2 = 0, and c_3 = t_3, so the parameters (a_i, b_i, c_i) to be determined are nonlinear. Here (x_t, y_t) are the discrete laser-rangefinder coordinates and (u, v) the optical image coordinates. Each point (u, v) belonging to the line separating the green plank and the white plank (shown in Figure 4) satisfies the characteristic line equation (12), which is also the intersection line of the two discriminating planes:
(12)
$$A u + B v = 1,$$
where A, B are the characteristic line parameters. The separated estimation of the camera's intrinsic and extrinsic parameters then follows from (9) and (12) as (13):
(13)
$$A f_u x_t w_{11} + A f_u y_t w_{12} + B f_v x_t w_{21} + B f_v y_t w_{22} + (A u_0 + B v_0 - 1)x_t w_{31} + (A u_0 + B v_0 - 1)y_t w_{32} + A f_u t_1 + B f_v t_2 + (A u_0 + B v_0 - 1)t_3 = 0.$$
Extraction of feature line.
According to (13), the separated calibration of the intrinsic and extrinsic parameters requires solving 13 unknown parameters (f_u, f_v, u_0, v_0, w_11, w_12, w_13, w_21, w_22, w_23, t_1, t_2, t_3). By the above analysis, w_31 = w_32 = w_33 = 0 according to (10), and f_u, f_v, u_0, v_0 are the intrinsic parameters obtained in (8). Thus only the extrinsic parameters w_11, w_12, w_13, w_21, w_22, w_23, t_1, t_2, and t_3 remain to be estimated.
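Each calibration scene contributes one instance of the homogeneous equation (13). A sketch of assembling its coefficient row in the nine unknowns appearing in (13) (function and argument names are ours):

```python
def eq13_row(A, B, xt, yt, fu, fv, u0, v0):
    """Coefficients of (13), ordered in the unknowns
    (w11, w12, w21, w22, w31, w32, t1, t2, t3)."""
    k = A * u0 + B * v0 - 1.0
    return [A * fu * xt, A * fu * yt,   # multiplies w11, w12
            B * fv * xt, B * fv * yt,   # multiplies w21, w22
            k * xt, k * yt,             # multiplies w31, w32
            A * fu, B * fv, k]          # multiplies t1, t2, t3
```

The dot product of this row with the unknown vector reproduces the left-hand side of (13) exactly; w_13 and w_23 drop out because z_t = 0 in (10).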
4.1. Identification of the Characteristic Parameters (A, B, x_t, y_t)
Additionally, (A, B, x_t, y_t) are treated as known parameters in (13), because they can be calculated from the characteristic line and the characteristic points on the separating intersection line of (12). A and B are determined from any two distinct characteristic points extracted on the characteristic line Au + Bv = 1. For the laser-rangefinder data, (x_t, y_t) is taken as the maximum-curvature point on the intersection of the scanning plane and the calibration plate, which must fall on the line Au + Bv = 1; extracting the maximum-curvature point from the series of points on this intersection yields (x_t, y_t).
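Given two manually selected image points (u1, v1) and (u2, v2), A and B solve the 2 × 2 system A u_i + B v_i = 1; a sketch by Cramer's rule (names are ours):

```python
def line_params(p1, p2):
    """Solve A*u + B*v = 1 through two distinct image points."""
    (u1, v1), (u2, v2) = p1, p2
    det = u1 * v2 - u2 * v1
    if abs(det) < 1e-12:
        raise ValueError("points coincide or the line passes through the origin")
    A = (v2 - v1) / det
    B = (u1 - u2) / det
    return A, B

A, B = line_params((100.0, 200.0), (300.0, 150.0))
```

Note that the form Au + Bv = 1 cannot represent a line through the pixel origin, which is harmless here since the separating line crosses the image interior.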
Moreover, plenty of experimental data must be substituted into (13) to estimate the remaining parameters. Effective data are obtained by altering the relative pose between the objects and the camera under different experimental conditions, for instance, by changing the inclination of the object or adjusting the distance between the object and the camera. Importantly, the laser-rangefinder and the optical image data must be collected synchronously.
4.2. Extrinsic Parameters Separated Calibration Based on Particle Swarm Optimization
In fact, once f_u, f_v, u_0, v_0 and A, B, x_t, y_t are all solved, the extrinsic parameters w_11, w_12, w_13, w_21, w_22, w_23, t_1, t_2, and t_3 can be estimated from 9 instances of (13) obtained from 9 different scene experiments. Gaussian elimination (also known as row reduction) is an algorithm for solving such systems of linear equations, and in this paper it is used to choose a proper initial solution for the particle swarm. However, this initial solution is not the best estimate of the extrinsic parameters and may introduce considerable calibration error, so the particle swarm optimization algorithm is employed to improve the extrinsic parameter estimation.
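The paper does not state the normalization used when solving the homogeneous system (13); one possible realization (our assumption, valid up to scale) fixes t_3 = 1 and solves the remaining eight unknowns by elimination:

```python
import numpy as np

def initial_extrinsics(rows):
    """Given at least 8 coefficient rows of the homogeneous equation (13),
    ordered (w11, w12, w21, w22, w31, w32, t1, t2, t3), fix the scale by
    setting t3 = 1 and solve the remaining 8 unknowns."""
    M = np.asarray(rows, dtype=float)
    A8, b = M[:8, :8], -M[:8, 8]   # move the t3 column to the right-hand side
    x = np.linalg.solve(A8, b)     # Gaussian elimination under the hood
    return np.append(x, 1.0)       # (w11, ..., t2, t3 = 1)
```

Different 9-scene subsets give different solutions, which is what supplies a diverse initial swarm.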
Furthermore, the extrinsic parameter estimation can be regarded as a parameter optimization process. Since measurement noise enters the coefficient matrix, (13) holds only approximately; adding a random noise term ξ to (13) gives (14), which admits nonzero solutions, and as ξ tends to zero (14) becomes equivalent to (13):
(14)
$$A f_u x_t w_{11} + A f_u y_t w_{12} + B f_v x_t w_{21} + B f_v y_t w_{22} + (A u_0 + B v_0 - 1)x_t w_{31} + (A u_0 + B v_0 - 1)y_t w_{32} + A f_u t_1 + B f_v t_2 + (A u_0 + B v_0 - 1)t_3 = \xi \longrightarrow 0.$$
Therefore, the calibration of a camera and a laser-rangefinder is treated as an optimization problem with several objectives; the major objective is to minimize the distance between the features of the camera-measured objects and their actual positions. This objective is divided into two optimization subobjectives. One minimizes the sum of squared errors:
(15)
$$F_1 = \min\left\{\sum_{i=1}^{n}\left|a_i u + b_i v - 1\right|^2\right\}.$$
The other minimizes the sum of distances from the points to the lines Au + Bv = 1:
(16)
$$F_2 = \min\left\{\sum_{i=1}^{n}\frac{\left|a_i u + b_i v - 1\right|}{\sqrt{a_i^2 + b_i^2}}\right\}.$$
Here,
(17)
$$u = \frac{(f_u w_{11}+u_0 w_{31})x_{ti} + (f_u w_{12}+u_0 w_{32})y_{ti} + f_u t_1 + u_0 t_3}{w_{31}x_{ti} + w_{32}y_{ti} + t_3},\qquad v = \frac{(f_v w_{21}+v_0 w_{31})x_{ti} + (f_v w_{22}+v_0 w_{32})y_{ti} + f_v t_2 + v_0 t_3}{w_{31}x_{ti} + w_{32}y_{ti} + t_3}.$$
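The objectives (15) and (16) can be combined into a single evaluation F = αF_1 + (1 − α)F_2; a sketch under the simplifying assumption of one characteristic line (A, B) shared by the n projected points (names are ours):

```python
import math

def evaluate(points, A, B, alpha=0.5):
    """points: projected (u, v) coordinates of the laser feature points,
    as given by (17). F1 is the squared-residual objective (15) and F2
    the point-to-line distance objective (16) for the line A*u + B*v = 1."""
    F1 = sum((A * u + B * v - 1.0) ** 2 for u, v in points)
    F2 = sum(abs(A * u + B * v - 1.0) for u, v in points) / math.sqrt(A * A + B * B)
    return alpha * F1 + (1.0 - alpha) * F2
```

Points lying exactly on the characteristic line score zero, so minimizing F drives the projected laser features onto the extracted image line.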
In this paper, particle swarm optimization is applied to the extrinsic parameter calibration. The search space is D-dimensional with D = 9, that is, 9 extrinsic parameters to estimate. The ith particle of the swarm is represented by a D-dimensional vector X_i = (x_i1, x_i2, …, x_iD)^T, its velocity by another D-dimensional vector V_i = (v_i1, v_i2, …, v_iD)^T, and its best previously visited position by P_i = (p_i1, p_i2, …, p_iD)^T, with g the index of the best particle in the swarm. The swarm is manipulated according to
(18)
$$v_{id}^{n+1} = \omega v_{id}^{n} + c_1 r_1^{n}\left(p_{id}^{n} - x_{id}^{n}\right) + c_2 r_2^{n}\left(p_{gd}^{n} - x_{id}^{n}\right),$$
(19)
$$x_{id}^{n+1} = x_{id}^{n} + v_{id}^{n+1},$$
where d = 1, 2, …, D, i = 1, 2, …, N, N is the size of the swarm, ω is the inertia weight, c_1 and c_2 are two positive constants called the cognitive and social parameters (here c_1 = c_2 = 2), and r_1^n, r_2^n are random numbers uniformly generated between 0 and 1. Two variants of the PSO algorithm exist, one with a global neighborhood and one with a local neighborhood: in the global variant each particle moves towards its best previous position and towards the best particle in the whole swarm, whereas in the local variant each particle moves towards its best previous position and towards the best particle in its restricted neighborhood. We set the evaluation function F = αF_1 + (1 − α)F_2 and use the Gaussian elimination method to solve the extrinsic parameters and initialize the particle swarm; details are shown in Algorithm 1, and the experiments are illustrated in Section 5.3.
Algorithm 1: The summary of the PSO algorithm.
(1) Initialize.
(1.1) Generate an initial swarm of N particles at random by Gaussian elimination.
(1.2) Generate initial velocities vid, 1≤i≤N and 1≤d≤D, at random.
(2) Repeat until a given maximal number of iterations is achieved.
(2.1) Evaluate the fitness of each particle using (20).
(2.2) Determine the best vector pbest visited so far by each particle.
(2.3) Determine the best vector gbest visited so far by the whole swarm.
(2.4) Update velocities vid using (18) according to the factor model.
(2.5) Update particle vectors xid using (19).
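Algorithm 1 can be sketched compactly as follows; the fitness function is generic, and unlike step (1.1) the initial swarm here is simply passed in by the caller (e.g., the Gaussian-elimination solutions), so the sketch is an illustration rather than the paper's implementation:

```python
import random

def pso(fitness, init, iters=200, w=0.7, c1=2.0, c2=2.0):
    """Minimize `fitness` over D-dimensional vectors, starting from the
    initial swarm `init` (list of D-vectors), per updates (18)-(19)."""
    X = [list(p) for p in init]
    V = [[random.uniform(-1, 1) for _ in p] for p in init]  # step (1.2)
    P = [list(p) for p in X]                                # personal bests
    pf = [fitness(p) for p in P]
    g = min(range(len(P)), key=lambda i: pf[i])             # global best index
    for _ in range(iters):                                  # step (2)
        for i, x in enumerate(X):
            for d in range(len(x)):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - x[d])     # eq. (18)
                           + c2 * r2 * (P[g][d] - x[d]))
                x[d] += V[i][d]                             # eq. (19)
            f = fitness(x)
            if f < pf[i]:                                   # steps (2.2)-(2.3)
                P[i], pf[i] = list(x), f
                if f < pf[g]:
                    g = i
    return P[g], pf[g]
```

For example, minimizing a sphere function with 13 particles converges to near zero within a few hundred iterations.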
5. Data Fusion Experiment Analysis for the Calibration of Laser-Rangefinder and Camera

5.1. Extraction of the Characteristic Parameters A, B of Au + Bv = 1
The characteristic line Au + Bv = 1 in the camera image is the intersection line of the two calibration boards shown in Figure 4. It can be extracted by manually selecting any two distinct points on the image; these two points determine the characteristic line, from which the characteristic parameters A, B are calculated.
5.2. Feature Point Pt(xt,yt) Extraction Based on Laser-Rangefinder
Before extracting the feature point P_t(x_t, y_t), the observational data on the intersection line of the laser-rangefinder scanning plane and the calibration plate must be collected. The laser-rangefinder data of the calibration board appear as an "arrow" pattern, whose orientation can be set manually. The two straight edge lines of the arrow-shaped pattern are extracted, and their intersection is computed to obtain P_t(x_t, y_t). As shown in Figure 5, this two-line intersection is the object feature point P_t(x_t, y_t).
Feature extraction of laser-rangefinder (panels: feature extraction; feature points extraction for one line; feature points extraction for the other lines).
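The edge-fitting and intersection step above can be sketched as follows; the paper does not specify the line-fitting method, so least squares is our assumption:

```python
import numpy as np

def fit_line(pts):
    """Least-squares fit y = m*x + c to the scan points of one edge."""
    x, y = np.asarray(pts, dtype=float).T
    m, c = np.polyfit(x, y, 1)
    return m, c

def edge_intersection(edge1, edge2):
    """Feature point P_t as the intersection of the two fitted edge lines."""
    m1, c1 = fit_line(edge1)
    m2, c2 = fit_line(edge2)
    xt = (c2 - c1) / (m1 - m2)   # solve m1*x + c1 = m2*x + c2
    return xt, m1 * xt + c1
```

Fitting both edges before intersecting averages out the range noise of the individual scan points, which is why the intersection is preferred over simply picking the closest raw return.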
5.3. PSO for the Extrinsic Parameters Separated Calibration and Optimization
Generally, the extrinsic parameters can be calibrated from 9 different scene experiments. Here we design 21 different scene fusion experiments to collect data for the extrinsic parameter calibration and optimization; thus there are $C_{21}^{9}$ possible equation systems solvable by Gaussian elimination for the particle swarm initialization, from which we randomly select 13 solutions to initialize the particle swarm, that is, the number of particles is set to 13, and the weight α of the evaluation function is 0.5. We want the extrinsic parameters not merely to fit one special experiment but to adapt to the majority of fusion experiments, so we set the standard deviation of the evaluation function F over the 21 fusion experiments as the fitness function s to optimize the extrinsic parameters:
(20)
$$s = \left(\frac{1}{n}\sum_{i=1}^{n}\left(p_i - \bar{p}\right)^2\right)^{1/2},\qquad \bar{p} = \frac{1}{n}\sum_{i=1}^{n}p_i,\quad i = 1, 2, \ldots, n,\ n = 21.$$
Here p_i = αF_1i + (1 − α)F_2i with α = 0.5, where F_1i is the sum of squared errors (15) and F_2i the distance error (16) of the ith scene fusion experiment, so s is the standard deviation over the 21 fusion experiments. We then run ten independent experiments to gather statistics on the PSO, testing all 21 scenes in each independent experiment. Table 1 shows the standard deviation of the evaluation function F over the 21 scenes in each independent experiment. The best extrinsic calibration result is obtained in the 10th experiment, where the fitness s is only 0.2981; the average fitness is 0.3381, which is better than the nonlinear least squares and nonlinear Gauss-Newton optimization under different constraints in [11]. Specifically, in [11] the extrinsic calibration is first optimized by the nonlinear least squares method for the first constraint (15) and then reoptimized by the nonlinear Gauss-Newton method for the second constraint (16); the two methods are applied step by step in the calibration process.
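The fitness (20) over the 21 per-scene evaluations can be computed directly; a minimal sketch:

```python
import math

def fitness_std(p):
    """Population standard deviation (20) of the per-scene evaluations p_i."""
    n = len(p)
    mean = sum(p) / n
    return math.sqrt(sum((pi - mean) ** 2 for pi in p) / n)
```

Using the spread rather than the sum of the per-scene costs rewards parameter sets that perform uniformly across all 21 scenes instead of overfitting one of them.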
The performance of PSO for extrinsic parameters calibration.

| Order | s | w11 | w12 | w13 | w21 | w22 | w23 | t1 | t2 | t3 | Time (s) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.3244 | 5.3536 | -0.1320 | -0.2327 | 1.0309 | -0.6892 | -5.3457 | 2.7222 | -109.3953 | 5.3276 | 0.7237 |
| 2 | 0.3404 | 2.5309 | -0.0624 | -0.2304 | 0.4784 | -0.2557 | -2.5310 | 1.3692 | -55.4788 | -7.1949 | 0.7175 |
| 3 | 0.3328 | 1.6392 | -0.0385 | -0.0641 | 0.3289 | -0.2452 | -1.5627 | 0.5244 | -37.1193 | -12.9526 | 0.7059 |
| 4 | 0.3291 | -12.4868 | 0.3137 | 0.7653 | -2.3754 | 1.7716 | 12.2181 | -5.2497 | 241.8434 | 16.5697 | 0.7237 |
| 5 | 0.3432 | -17.5503 | 0.4231 | 1.6248 | -3.2606 | 2.3388 | 17.1477 | -1.6669 | 323.7501 | -4.5083 | 0.7253 |
| 6 | 0.3726 | -29.0181 | 0.6558 | 3.8494 | -5.3153 | 7.3192 | 28.4911 | 1.9166 | 525.3508 | -44.1139 | 0.7117 |
| 7 | 0.3476 | -37.5951 | 0.9284 | 3.0366 | -7.1762 | 6.2046 | 36.1728 | -15.2600 | 730.2179 | 199.5670 | 0.7062 |
| 8 | 0.3694 | -1.6929 | 0.0415 | 0.1844 | -0.3119 | 0.1704 | 1.6258 | -0.5066 | 30.9968 | 10.4553 | 0.7103 |
| 9 | 0.3232 | -5.0241 | 0.1244 | 0.3380 | -0.9736 | 0.4936 | 4.8326 | -1.7687 | 101.7259 | 27.9931 | 0.7144 |
| 10 | 0.2981 | -1.3242 | 0.0331 | 0.0620 | -0.2632 | 0.1257 | 1.2699 | -0.6325 | 29.2756 | 9.3599 | 0.7153 |
| Average | 0.3381 | | | | | | | | | | 0.7154 |
| In [11] | 0.3764 | -2.5745 | 0.0604 | 0.0418 | -0.4989 | 0.3808 | 2.5037 | -0.3860 | 53.7654 | 5.2954 | 0.4966 |
However, the computational cost of the particle swarm optimization algorithm is slightly heavier than that of the methods in [11], though it still meets the requirements for calibration reliability and real-time operation. The intrinsic parameters are closely related to the inherent attributes of the camera, while the extrinsic parameters reflect the relative direction and position between the laser-rangefinder and the camera; once the intrinsic and extrinsic parameters are calibrated, the sensor fusion process does not change them. In other words, as long as the same camera is used and the relative direction and position between the laser-rangefinder and the camera remain unchanged, the calibration parameters also remain unchanged. Hence real-time performance is less significant than accuracy for this calibration, since the parameters can be calibrated offline before sensor fusion; the running time of 0.7154 s in Table 1 is acceptable for the sensor fusion process. The optimized extrinsic parameters of each experiment are also listed in Table 1.
The mixed and the separated intrinsic/extrinsic parameter methods are compared for laser-rangefinder and camera data fusion. Figure 6 shows three examples of the mixed method, with apparent deviation errors in the black panes, whereas the separated method produces fewer errors, as shown in Figure 7; the mapping of the laser-rangefinder points onto the image in Figure 7 is more consistent with the actual scene.
Performance of the mixed intrinsic and extrinsic parameters fusion method (three examples).
Performance of the separated intrinsic and extrinsic parameters fusion method (three examples).
Additionally, the convergence curve of PSO optimizing the extrinsic parameters in the 10th experiment is given in Figure 8; the standard deviation over the 21 groups keeps decreasing as the iterations proceed and finally converges to 0.2981, much lower than the standard deviation of the methods in [11].
The convergence times curve for PSO in the 10th experiment.
Furthermore, the best PSO calibration result, 0.2981 in the tenth experiment, is better than 0.3764. Comparing the PSO with the method in [11] for the optimized extrinsic parameters, the majority of the experiments are better under PSO, except for the 5th, 9th, 10th, 12th, and 15th experiment scenes, as shown in Figure 9; the overall standard deviation of 0.2981 for the PSO extrinsic calibration is better than the 0.3764 of the method in [11], and the standard deviation of the calibration error over the 21-group experiments is decreased by 10.175% compared with [11].
The performance comparison between PSO and [11].
The extrinsic parameter optimization by PSO considers two optimization objectives and translates the multiobjective calibration into a single comprehensive objective. In [11], by contrast, nonlinear least squares is used for the first objective and the nonlinear Gauss-Newton method for the second; the theoretical justification for this split is insufficient, and the resulting calibration performance is worse for sensor fusion.
Finally, the calibration results for the 21 scenes in the 10th independent experiment are shown in Figure 10; these results confirm that the extrinsic parameter calibration based on PSO is valid and effective, especially at the segment boundaries.
The calibration results for the 21-group scenes in the 10th independent experiment.
The 1st scene for experiment
The 2nd scene for experiment
The 3rd scene for experiment
The 4th scene for experiment
The 5th scene for experiment
The 6th scene for experiment
The 7th scene for experiment
The 8th scene for experiment
The 9th scene for experiment
The 10th scene for experiment
The 11th scene for experiment
The 12th scene for experiment
The 13th scene for experiment
The 14th scene for experiment
The 15th scene for experiment
The 16th scene for experiment
The 17th scene for experiment
The 18th scene for experiment
The 19th scene for experiment
The 20th scene for experiment
The 21st scene for experiment
6. Conclusion
According to the principle of heterogeneous calibration and the characteristics of a laser-rangefinder and a camera, the mapping relationship among the world coordinate system, the camera coordinate system, and the image plane is discussed, and the calibration algorithm separates the intrinsic and extrinsic parameters. Zhang's algorithm is adopted to calibrate the camera's intrinsic parameters, after which the inherent properties of the camera are analyzed. Moreover, separated extrinsic parameter calibration and PSO-based estimation are proposed to improve the accuracy and validity of the calibration. A characteristic-line method is designed to estimate the extrinsic parameters using two intersecting calibration boards set at a certain angle, and PSO is applied to optimize the calibration parameters with respect to two different objectives. The separated parameter calibration together with the PSO-based extrinsic parameter optimization ensures the availability and reliability of the camera and the laser-rangefinder. In summary, the proposed separated parameter calibration and particle swarm optimization method for the camera and the laser-rangefinder improves on the traditional mixed calibration of intrinsic and extrinsic parameters and decreases the calibration error; the experimental results and analysis indicate that the proposed calibration method ensures the accuracy and reliability of camera and laser-rangefinder information fusion.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This paper was supported by the National Natural Science Foundation of China (Grant no. 61304253), Natural Science Foundation of Hunan (Grant nos. 13JJ4018 and 13JJ4093), and the Doctoral Program of Higher Education of China (Grant no. 20130162120018).
References
[1] Y. Chen, Z. Chen, and S. G. Wei, "Research on point cloud data segmentation based on extrinsic calibration of laser and CCD," vol. 7, pp. 295–297, 2008.
[2] C. Chen and H. Chien, "Geometric calibration of a multi-layer LiDAR system and image sensors using plane-based implicit laser parameters for textured 3-D depth reconstruction," vol. 25, no. 4, pp. 659–669, 2014, doi: 10.1016/j.jvcir.2013.08.005.
[3] L. Yu, Z. Cai, Z. Zhi, and Z. Feng, "Fault detection and identification for dead reckoning system of mobile robot based on fuzzy logic particle filter," vol. 19, no. 5, pp. 1249–1257, 2012, doi: 10.1007/s11771-012-1136-9.
[4] L. Jiang, X. Hao, and W. Zhang, "A study on the calibration of pitch angle deviation for airborne Lidar system," in Laser Radar Technology and Applications XVIII, vol. 8731 of Proceedings of SPIE, Baltimore, Md, USA, May 2013, doi: 10.1117/12.2014812.
[5] Q. L. Zhang and R. Pless, "Extrinsic calibration of a camera and laser range finder (improves camera calibration)," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '04), Sendai, Japan, October 2004, pp. 2301–2306.
[6] F. Vasconcelos, J. P. Barreto, and U. Nunes, "A minimal solution for the extrinsic calibration of a camera and a laser-rangefinder," vol. 34, no. 11, pp. 2097–2107, 2012, doi: 10.1109/TPAMI.2012.18.
[7] Z. Liu, F. J. Li, and G. J. Zhang, "An external parameter calibration method for multiple cameras based on laser rangefinder," vol. 47, pp. 954–962, 2014, doi: 10.1016/j.measurement.2013.10.029.
[8] L. Mazzei, P. Medici, and M. Panciroli, "A lasers and cameras calibration procedure for VIAC multi-sensorized vehicles," in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '12), Alcala de Henares, Madrid, Spain, June 2012, pp. 548–553, doi: 10.1109/IVS.2012.6232214.
[9] C. Gao and J. R. Spletzer, "On-line calibration of multiple LIDARs on a mobile vehicle platform," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '10), Anchorage, Alaska, USA, May 2010, pp. 279–284, doi: 10.1109/ROBOT.2010.5509880.
[10] D. Gao, J. Duan, X. Yang, and B. Zheng, "A method of spatial calibration for camera and radar," in Proceedings of the 8th World Congress on Intelligent Control and Automation (WCICA '10), Jinan, China, July 2010, pp. 6211–6215, doi: 10.1109/WCICA.2010.5554411.
[11] L. L. Yu, P. Meng, and Z. You, "Separated calibration of a camera and a laser-finder for robotic heterogeneous sensors," vol. 10, p. 112, 2013.
[12] K. Ohtani, L. Li, and M. Baba, "Laser rangefinder calibration based on genetic algorithm," in Proceedings of the 51st Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE '12), Akita, Japan, August 2012, pp. 1234–1237.
[13] LMS Technical Description, http://sicktoolbox.sourceforge.net/docs/sick-lms-technical-description.pdf.
[14] J. Ha, "Extrinsic calibration of a camera and laser range finder using a new calibration structure of a plane with a triangular hole," vol. 10, no. 6, pp. 1240–1244, 2012, doi: 10.1007/s12555-012-0619-7.
[15] Y. Shin, J. Park, J. Bae, and M. Baeg, "A study on reliability enhancement for laser and camera calibration," vol. 10, no. 1, pp. 109–116, 2012, doi: 10.1007/s12555-012-0112-3.
[16] T. Rahman and N. Krouglicof, "An efficient camera calibration technique offering robustness and accuracy over a wide range of lens distortion," vol. 21, no. 2, pp. 626–637, 2012, doi: 10.1109/TIP.2011.2164421.
[17] T. J. Osgood and Y. Huang, "Calibration of laser scanner and camera fusion system for intelligent vehicles using Nelder-Mead optimization," vol. 24, no. 3, 035101, 2013, doi: 10.1088/0957-0233/24/3/035101.
[18] Camera Calibration Toolbox for Matlab, http://www.vision.caltech.edu/bouguetj/calib_doc/.