Indoor Vision/INS Integrated Mobile Robot Navigation Using Multimodel-Based Multifrequency Kalman Filter

In order to further improve positioning accuracy, this paper proposes an indoor vision/INS integrated mobile robot navigation method using a multimodel-based multifrequency Kalman filter. First, to overcome the insufficient accuracy of visual data when the robot turns, a novel multimodel integrated scheme is investigated for mobile robots with Mecanum wheels, which can make fixed-point turns. Second, a multifrequency Kalman filter is used to fuse the position information from the inertial navigation system and the visual navigation system, which overcomes the problem that the filtering period of the integrated navigation system is too long. The proposed multimodel multifrequency Kalman filter gives root mean square errors (RMSE) of 0.0184 m in the east direction and 0.0977 m in the north direction, whereas the RMSE of the visual navigation system alone is 0.8925 m in the east direction and 0.9539 m in the north direction. Experimental results show that the proposed method is effective.


Introduction
In recent decades, the autonomous capabilities of mobile robots have received increasing attention, and mobile robots have been widely applied [1][2][3][4]. However, robot navigation and positioning need to be further studied to address new challenges.
To realize mobile robot self-navigation, many approaches can be chosen [5,6]. The inertial navigation system (INS) and the vision navigation system (VNS) are the two most commonly used schemes and have gradually become standard sensors on mobile robot systems [7,8]. For example, an INS is used to achieve self-navigation in [9], while a visual navigation system is used for the navigation tasks in [10]. The INS measures the navigation information using an inertial measurement unit (IMU), which is composed of a three-axis gyroscope, accelerometer, and magnetometer, without external interference [11][12][13]. The navigation accuracy of the INS is high over a short period; nevertheless, its errors accumulate over time. Compared with the INS, visual sensors have some advantages: they are light, and their power consumption and cost are low. They can also provide a wealth of environmental features and information [14]. However, the processing speed of visual navigation is relatively slow. Moreover, the images from a visual sensor are easily affected by light, rain, snow, and dynamic interference, so it is difficult to provide fully continuous navigation and positioning information when the visual sensor alone is used for robot navigation and positioning.
Data fusion algorithms influence navigation accuracy almost as much as the sensors themselves [15]. Thus, many approaches have been proposed to improve data fusion. For instance, an extended Kalman filter (EKF) was designed for mobile robot localization in indoor environments [8]. However, the stronger the nonlinearity of a problem is, the less negligible the high-order terms of the Taylor series expansion become. In [6,16,17], the unscented Kalman filter (UKF) is investigated for the INS/GPS integrated system. The UKF handles nonlinear functions through the unscented transform (UT) without linearization and preserves the higher-order terms.
This work tries to improve the localization accuracy of indoor vision/INS integrated mobile robot navigation using a multimodel-based multifrequency Kalman filter. First, to overcome the insufficient accuracy of visual data when the robot turns, a novel multimodel integrated scheme has been investigated for mobile robots with Mecanum wheels, which assumes fixed-point turning, that is, the robot position remains unchanged in the process of turning. Second, a multifrequency Kalman filter (MKF) is engaged for the fusion of the positions given by both the inertial navigation and the visual navigation systems, which overcomes the problem that the filtering period of the integrated navigation system is too long. The experiments show that the proposed multimodel-based multifrequency Kalman filter improves the positioning accuracy effectively. The rest of the paper is organized as follows. The indoor integrated navigation scheme employing the vision and INS data is given in Section 2, while Section 3 introduces the multifrequency Kalman filter. Experiments can be found in Section 4. Finally, Section 5 summarizes the main conclusions.

Indoor Integrated Navigation Scheme Employing the Vision and INS Data
In this section, we design a navigation scheme to integrate and fuse the vision and INS data in complex indoor environments. Figure 1 is the diagram of the integrated navigation scheme. The main steps of the integrated navigation strategy are as follows:
(1) The INS employs the inertial measurement unit (IMU) to compute the INS-based robot position Po_I, velocity Ve_I, acceleration Acc, and posture.
(2) The vision navigation system (VNS) employs the binocular vision camera to provide the vision-based robot position Po_V and posture from the images.
(3) The IMU data are used to judge the state of the robot.
(4) The robot state is used to correct the measurement data preliminarily.
(5) The MKF provides the navigation information by fusing the INS and vision data.
(6) The INS position is corrected using the output of the MKF, which yields accurate robot position information.
From the integrated navigation scheme above, we can see that one important step is the judgement of the robot state. This paper focuses on robots with Mecanum wheels, and we assume that the robot can only turn with constant velocity at a fixed point. When the robot turns with constant velocity, the yaw and acceleration measured by the IMU are used to judge the turning state, since the INS has relatively high positioning accuracy over a short period of time.
In this paper, we employ four functions to describe the robot's turning, which are listed in Table 1. When the conditions in equation (1) are satisfied, we consider that the robot begins to turn, where Threshold_i^s, i ∈ [1,5], is the threshold at the beginning of the turn.
When equations (2) and (3) are satisfied, we consider that the robot's turn is over, where Threshold_i^e, i ∈ [1,3], is the threshold at the end of the turn.
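The turn start/end tests above can be sketched in code. The exact functions of Table 1 and the threshold values are not reproduced here, so simple window statistics on yaw and acceleration are assumed as stand-ins:

```python
# Hedged sketch of the turn start/end judgement. The thresholds below are
# illustrative placeholders, not the Threshold_i^s / Threshold_i^e values
# of equations (1)-(3).
import numpy as np

def turn_started(yaw_window, acc_window, thr_yaw=0.05, thr_acc=0.02):
    """Flag a turn start: yaw changes steadily while acceleration stays small."""
    yaw_change = abs(yaw_window[-1] - yaw_window[0])
    mean_acc = np.mean(np.abs(acc_window))
    return bool(yaw_change > thr_yaw and mean_acc < thr_acc)

def turn_ended(yaw_window, thr_yaw=0.01):
    """Flag a turn end: the yaw angle has settled within the window."""
    return bool(abs(yaw_window[-1] - yaw_window[0]) < thr_yaw)
```

In practice both tests would be evaluated over a sliding window of IMU samples, so that a single noisy reading does not toggle the turning flag.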
If the robot is not turning, then, owing to the different sampling periods of the VNS and the INS, the filtering period of a traditional single-rate Kalman filter can only be the slower cycle or the least common multiple of the two sampling periods. This makes the filtering period too long, and the positioning accuracy degrades. One way to solve this issue is to use a multifrequency Kalman filter.

Robot Navigation Coordinate System.
This paper uses two frames: the body frame and the navigation frame.
In the X_R O Y_R Z_R frame, the OY_R axis is perpendicular to the OX_R axis. The navigation frame used in this paper is the E-N-U coordinate system. The origin of the coordinate system is located at the mobile robot center. The X-axis points east, the Y-axis points north, and the Z-axis is perpendicular to the horizontal plane.
During positioning, the robot needs to convert the X_R O Y_R Z_R frame to the navigation frame. Meanwhile, the state information of the camera and the coordinate transformation of the camera positioning data in the X_R O Y_R Z_R frame should be acquired; thus, the state information of the robot in the navigation frame can be obtained. First, we rotate the X_R O Y_R Z_R frame according to the Euler angles so that it coincides with the navigation frame. The Euler angles include the heading angle α, the pitch angle β, and the roll angle γ. The pitch angle is the angle between the carrier's Y-axis and the horizontal plane. The roll angle represents the angle between the horizontal axis and the ground plane. The heading angle refers to the angle between the projection of the Y-axis on the horizontal plane and the direction of north. The transformation matrix C_b^n in equation (4) completes the conversion from the X_R O Y_R Z_R frame to the navigation frame.
2 Mathematical Problems in Engineering
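The conversion described above can be sketched as a composition of three elementary rotations. The Z-Y-X rotation order used here is an assumption; the paper's equation (4) fixes the exact convention:

```python
# Sketch of the body-to-navigation transformation C_b^n built from the
# Euler angles defined above: heading alpha, pitch beta, roll gamma.
# The rotation order (heading, then pitch, then roll) is an assumed
# convention, not taken from equation (4) itself.
import numpy as np

def body_to_nav(alpha, beta, gamma):
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])  # heading about Z
    Rx = np.array([[1, 0, 0], [0, cb, -sb], [0, sb, cb]])  # pitch about X
    Ry = np.array([[cg, 0, sg], [0, 1, 0], [-sg, 0, cg]])  # roll about Y
    return Rz @ Rx @ Ry
```

Because each factor is a rotation, the resulting C_b^n is orthonormal, so the inverse transformation (navigation to body) is simply its transpose.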

Multifrequency Kalman Filter State Equations.
The linear Kalman filter in one filtering period is usually divided into two processes: the time update and the measurement update. The time update propagates the current state variables and error covariance forward to construct an a priori estimate of the state at the next moment, and the measurement update provides feedback, that is, it combines the a priori estimate with new measurements to construct an improved a posteriori estimate. The state equation of the multifrequency Kalman filter is listed in equation (5), where (P_{E,k}^I, P_{N,k}^I) is the INS position at time k, (P_{E,k}^V, P_{N,k}^V) is the VNS position at time k, V(k) is the system noise at time k, W(k) is the measurement noise at time k, and the covariance of W(k) is R(k).
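The two processes above can be sketched as a single filter step. The matrices F, H, Q, and R are placeholders standing in for the concrete model of equation (5):

```python
# Minimal sketch of one linear Kalman filter cycle: time update, then
# measurement update. F, H, Q, R are assumed placeholders for the state
# transition, measurement, and noise-covariance matrices of the paper's
# model; they are not taken from equation (5) itself.
import numpy as np

def kf_step(x, P, z, F, H, Q, R):
    # Time update: propagate the a priori state and covariance.
    x_prior = F @ x
    P_prior = F @ P @ F.T + Q
    # Measurement update: blend the prior with the new measurement z.
    S = H @ P_prior @ H.T + R           # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_post = x_prior + K @ (z - H @ x_prior)
    P_post = (np.eye(len(x)) - K @ H) @ P_prior
    return x_post, P_post
```

Note that as R grows, the gain K shrinks toward zero and the a posteriori estimate stays close to the a priori one, which is exactly the mechanism the multifrequency filter exploits on INS-only steps.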
The sampling times of different sensors differ. For the navigation system integrating INS and vision information, the sampling rate of the IMU is much faster than that of the VNS. At present, the sampling period of the data fusion filter is usually made consistent with the slow subsystem, which may lose some data. To improve the localization accuracy, a multifrequency Kalman filter based on the integrated navigation scheme is proposed in this paper. The pseudocode of the multifrequency Kalman filter based on the multimodel integrated navigation scheme is listed in Algorithm 1. From the algorithm, one can infer that our multifrequency Kalman filter includes three parts. The first part is the judgement of the robot's status (lines 3-5); if it is detected that the robot is turning, the position is held fixed, since the robot turns at a fixed point. Here, T_VNS is the VNS's sampling time and mod(·) is the MATLAB modulo function. We obtain Y(k) to perform the measurement update of the multifrequency Kalman filter (lines 8-12). If there is only the update of the INS's measurement, we consider the INS's error to be huge; thus, both Q and R tend to infinity, so K(k) = 0, and the a posteriori estimate equals the a priori one, i.e., X(k) = X(k|k-1) and P(k) = P(k|k-1) (lines 12-15).
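The multifrequency measurement schedule of Algorithm 1 can be sketched as follows: the filter runs at the fast INS rate, and the VNS measurement is fused only on steps that coincide with a VNS sampling instant (the mod test). The sample periods below are assumed values for illustration:

```python
# Hedged sketch of the multifrequency schedule in Algorithm 1. T_ins and
# T_vns are assumed sample periods, not the paper's actual rates.
def vns_available(k, T_ins=0.01, T_vns=0.1, eps=1e-9):
    """True when step k coincides with a VNS sampling instant."""
    t = k * T_ins
    r = t % T_vns
    return r < eps or (T_vns - r) < eps  # tolerate floating-point residue

def fuse_step(k, x_prior, P_prior, update_fn):
    """Run one filter step: full update when the VNS sample is present."""
    if vns_available(k):
        return update_fn(x_prior, P_prior)  # INS + VNS measurement update
    # INS-only step: R is treated as infinite, so K(k) = 0 and the
    # a posteriori estimate simply keeps the a priori one.
    return x_prior, P_prior
```

This way no IMU sample is discarded waiting for the slow subsystem, which is the data loss the single-rate filter suffers from.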

Experiments
In this section, the experimental verification is presented. First, we introduce the experimental arrangement. Second, the judgement of the robot running state is investigated. Finally, the localization performance is studied. The reference path is marked with red lines in the experimental scene shown in Figure 2. During the experiment, the robot moved three meters forward and then three meters to the left to reach the end point. It can be seen from Figure 2 that there is little texture in the scene at the turning point of the robot, which is not conducive to the positioning of the visual navigation system.

Experimental Arrangement.
To verify the proposed data fusion algorithm, experimental verification was carried out. The test was done in a building of the University of Jinan, as shown in Figure 2. The reference path, starting point, and navigation frame are shown in Figure 2. The testbed used in the test includes a mobile robot, an IMU, a binocular vision camera, and a computer. For the IMU, we employ the I9DOFv3, which includes the ADXRS620, ADXL203, and HMC5983. Among them, the updating rates of the ADXRS620 and ADXL203 range from 4 Hz to 2000 Hz. The output rate of the HMC5983 is 220 Hz, and the size of the IMU is 50 × 30 × 15 mm. The highest frame rate of the MYNTAI binocular camera is 60 FPS, with a baseline of 50.0 mm, a focal length of 3.3 mm, and a size of 124 × 33.3 × 32.5 mm. The INS-based robot position and posture are measured by the IMU.
The MYNTAI binocular camera is used in the test to measure the vision-based position and posture. The computer collects the sensors' data via software. It should be pointed out that the IMU, the binocular vision camera, and the computer are fixed on the mobile robot. For the test, the robot runs along the reference path, which is set up in advance. Then, the IMU and binocular camera measure the mobile robot's position. The robot and the X_R O Y_R Z_R frame used in the experiment are shown in Figure 3. The parameters used by the MKF and the MKF with multimodel in the experiment are listed in Table 2.

Judgement of Robot Running State.
In this section, we investigate the judgement of the robot running state. The state of the robot is divided into the stationary state and the motion state, and the motion state is further divided into turning and going straight. When the robot is stationary, the yaw angle remains unchanged and the accelerometer value is zero; when the robot goes straight, the yaw angle remains unchanged while the accelerometer value is nonzero; when the robot turns at a fixed point, the yaw angle changes continuously while the accelerometer value stays the same. The turning state of the robot is set to 0 when it is not turning and to 1 during turning. From Figure 4, it can be seen that the turning of the robot is easy to judge using the conditions proposed in this paper.
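The three-way state judgement described above can be sketched directly from the yaw and accelerometer behavior. The tolerance values are illustrative assumptions, not the paper's thresholds:

```python
# Sketch of the stationary / straight / turning judgement using the yaw
# angle and accelerometer readings over a short window. yaw_tol and
# acc_tol are assumed tolerances for illustration only.
def robot_state(yaw_window, acc_window, yaw_tol=0.02, acc_tol=0.05):
    yaw_changing = abs(yaw_window[-1] - yaw_window[0]) > yaw_tol
    accelerating = max(abs(a) for a in acc_window) > acc_tol
    if yaw_changing:
        return "turning"      # fixed-point turn: yaw varies continuously
    if accelerating:
        return "straight"     # yaw constant, nonzero acceleration
    return "stationary"       # yaw constant, accelerometer near zero
```

The binary turning flag of Figure 4 then follows immediately: 1 when the state is "turning" and 0 otherwise.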

Localization Error.
The localization performance of the proposed method is investigated in this section. To verify the performance of the MKF with the multimodel method, we compare its localization errors with those of the INS, the vision system, and the plain MKF. The reference trajectory and the trajectories estimated by the INS, vision, MKF, and MKF with the multimodel method are shown in Figure 5. From the figure, we can see that error accumulation appears in the INS trajectory, which indicates that the INS is not suitable for long-term navigation. Compared with the INS trajectory, the trajectory estimated by the vision system is closer to the reference path in the initial stage.
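The per-axis RMSE figures quoted in the abstract (east and north errors against the reference) follow the usual definition, which can be sketched as:

```python
# Sketch of the per-axis root-mean-square error used to score the
# trajectories: the RMS of the east (or north) position error between
# the estimated trajectory and the reference path.
import math

def rmse(estimates, reference):
    errs = [(e - r) ** 2 for e, r in zip(estimates, reference)]
    return math.sqrt(sum(errs) / len(errs))
```

Evaluating this once on the east coordinates and once on the north coordinates of each trajectory yields the paired RMSE values reported for the VNS and the proposed filter.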

Conclusions
In this work, indoor vision/INS integrated mobile robot navigation using a multimodel-based multifrequency Kalman filter has been proposed. To overcome the insufficient accuracy of visual data when the robot turns, a novel multimodel integrated scheme has been investigated for mobile robots with Mecanum wheels, which considers that the position of the robot remains unchanged while turning in place. For the data fusion filter, a multifrequency Kalman filter fuses the position information given by both the inertial navigation and the visual navigation systems, which overcomes the problem that the filtering period of the integrated navigation system is too long. The experiments verify that the proposed multimodel-based multifrequency Kalman filter can effectively improve the positioning accuracy.

Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.