Design, Development, and Deployment of Real-Time Sensor Fusion (CnW + EKF) for a Linux-Based Embedded System Using Qt-Anywhere

This paper describes the design, development, and implementation of a real-time sensor fusion system that uses the classification and weighing plus extended Kalman filter (CnW + EKF) algorithm to derive heading for navigation from inexpensive sensors. The algorithm was previously tested only through postprocessing in MATLAB; it has now been reprogrammed in Qt and deployed on a Linux-based embedded board for real-time operation. Data from inexpensive sensors, namely global positioning system (GPS) devices, an electronic compass (EC), and an inertial measurement unit (IMU), are fused to derive a more reliable and accurate heading value. In the algorithm flow, the GPS values are first evaluated and classified, then fused with the EC heading through classification and weighing, and the result is passed through an EKF to be fused with the IMU data. Real-time tests and trials were conducted to prove the operational capability of the developed process. The complete setup and configuration procedures for development and deployment via Qt are also provided for those interested in replicating the process.


Introduction
The fusion of cheap sensor devices to generate information with performance similar to that of more expensive systems is a continuing and exciting research field. Existing works utilized GPS (global positioning system), EC (electronic compass), IMU (inertial measurement unit), or a combination of them, with varying fusion methods such as numerical discretization [1], the Kalman filter or its variations [2,3], fuzzy logic [4], timing synchronization [5], dead reckoning [6], or ad hoc approaches [7].
Sensor fusion is also utilized in various other sectors, such as increasing the reliability of quality assessment and authentication of food and beverages [8], and maintaining a globally updated map of a plant together with dynamic information about the velocities and positions of all automatic guided vehicles (AGVs) [9]. Another factory application that utilized sensor fusion is the monitoring of machining operations that depend on rotary cutters [10].
It has also been used for the real-time recognition of human actions relying on sensors of diverse modalities (inertial and depth vision) [11], while another work presented a technique for indoor position tracking and localization of pedestrians [12]. There is also the often-researched application of sensor data fusion for the pose estimation of a 3D mobile robot in indoor applications [13]. Other research looked into the performance of sensor fusion systems [14], while another study designed and developed an open-source tool for evaluating data fusion systems, primarily focused on maritime surveillance systems [15].
In the maritime and shipping field, there are systems designed to assist the ship captain in entering or leaving a harbor [16], including the Advanced Sensor Module of the MUNIN project for autonomous and unmanned shipping [17]. There are also related works on collision avoidance, such as those of Flåten and Brekke [18] and Chen et al. [19]. Works similar to ours that used sensor fusion to derive a more accurate heading for navigation were by Hu and Huang [20] and Juang and Lin [21], while others focused on improved position and attitude in addition to heading, such as those by Jaroś et al. [22], Bryne [23], Núñez et al. [24], and Feng-de et al. [25].
Cappello et al. applied sensor fusion to unmanned vehicles, such as their work on integrated navigation and guidance systems (NGS) for small-sized UVs using low-cost off-the-shelf sensors [26]. They also worked on NGS for small- and medium-sized Remotely Piloted Aircraft Systems (RPAS) utilizing a GNSS- (global navigation satellite system-) and microelectromechanical system- (MEMS-) based IMU, a vision-based navigation (VBN) sensor, and the Aircraft Dynamics Model (ADM) treated as a virtual sensor [27,28]. Other research works were implemented in underwater environments, focusing on robot pose estimation [29], towed array shape estimation [30], and passive target tracking [31].
These sensor fusion systems, especially navigation systems, are usually deployed on embedded systems, similar to those that utilized field-programmable gate arrays (FPGA) with digital signal processors (DSP) [32]. The work presented in this paper details the development and deployment of a real-time sensor fusion system on an embedded board based on a previously proposed system of ours that was validated only through postprocessing [33]. Our previous related works include the real-time fusion of several GPS devices with an electronic compass, initially on a notebook, through classification and simplified weighing, followed by the first phase of real-time sensor fusion of three GPS devices and an electronic compass on an embedded board. There were also preceding studies on postprocessing sensor fusion, such as fuzzy logic and simplified classification and weighing. The current system was programmed using Qt-anywhere on a Linux desktop system and then deployed on an embedded board running Linux. This paper is arranged as follows: the theoretical background of the implemented sensor fusion algorithm is given in Section 2, the system design and implementation are detailed in Section 3, and the developed system is presented in Section 4, followed by the concluding remarks.

Theoretical Background
The simplified algorithm for the real-time implementation of the previous theoretical work done via postprocessing is given in Figure 1. The system currently implemented on an embedded board utilizes forward azimuth (FAz), classification and weighing (CnW), and an extended Kalman filter (EKF) to derive the fused heading value from multiple inexpensive GPS devices, an EC, and an IMU.
The overall system design is shown in Figure 2, indicating the inputs from the various devices and the corresponding data utilized in the process, with a 64-bit Ubuntu Linux desktop system acting as the platform for the design and development of the real-time sensor fusion system. The developed program was then deployed on embedded boards running a 32-bit Ubuntu Linux system (both a BeagleBone Black and a FreeScale board). A graphical representation of the operational steps is shown in Figure 3.
2.1. Proposed Method. The GPS and EC were both sampled at 10 Hz while the IMU was sampled at 100 Hz. The data polled from these sensors were used to calculate the heading through FAz (GPS data), classification and weighing-I (FAz data), classification and weighing-II (FAz, EC, and CnW-I data), and the extended Kalman filter (FAz, CnW-I, and CnW-II data). The EKF prediction and update process involves a set of data each from the GPS (FAz, GPSEC_yaw), the EC, and the IMU. The steps are as follows:
(1) Solve for the individual heading values of the GPS devices (GPSi_yaw) through FAz, if there are valid GPS position values.
(2) Evaluate the derived individual GPS heading values through CnW-I.
(3) Fuse the solved individual GPS heading values with the EC heading value through CnW-II to derive GPSEC_yaw.
(4) Assign GPSEC_yaw as the heading value for any GPS device that does not have a valid position value (GPSi_yaw = GPSEC_yaw).
(5) Fuse the GPS and IMU values through the EKF, with the IMU accelerometer value treated as the local gravity vector measurement [9]. The inputs are lat_i, long_i, and GPSi_yaw, while psi_i_EKF, lat_i_EKF, and long_i_EKF are the output values.
(6) Repeat steps 1 to 5 for the next set of IMU, EC, and GPS values.
2.2. Heading Derivation. Forward azimuth is described by the US Army as the angular measurement, in the clockwise direction, of the line created by two points with north set as the reference [34]. The equation for calculating the heading through forward azimuth is given in (1), noting that the initial (lat_1, long_1) and succeeding (lat_2, long_2) position values are set so as to derive the value for the initial heading. Other heading derivations were also tried, such as centroid and direct arc tangent, but their performance was not as good as that of forward azimuth.

2.3. Classification and Weighing. The GPS FIX values are classified as follows:
(vi) Float real-time kinematics is "5."
(vii) Dead reckoning or estimated fix is "6."
(viii) Manual input mode is "7."
(ix) Simulation mode is "8."
In the case of HDOP values, we utilized our own classification as follows:
(i) Greater than 0 and less than or equal to 1 (0 < HDOP ≤ 1): "IDEAL."
(ii) Greater than 1 and less than or equal to 2 (1 < HDOP ≤ 2): "EXCELLENT."
(iii) Greater than 2 and less than or equal to 5 (2 < HDOP ≤ 5): "GOOD."
(iv) Greater than 5 and less than or equal to 10 (5 < HDOP ≤ 10): "MODERATE."
(v) Greater than 10 and less than or equal to 20 (10 < HDOP ≤ 20): "FAIR."
(vi) Greater than 20 (HDOP > 20): "POOR."
The combined FIX and HDOP classification is then given as follows:
(i) If FIX is 2 to 5 and HDOP is 1 to 2, then it is IDEAL, with a weight of "3."
(ii) If FIX is 1 and HDOP is 1 to 2, then it is EXCELLENT, with a weight of "2."
(iii) If FIX is 2 to 5 and HDOP is 3 to 5, then it is EXCELLENT, with a weight of "2."
(iv) If FIX is 1 and HDOP is 3 to 5, then it is GOOD, with a weight of "1."
(v) If FIX is 2 to 5 and HDOP is 6 to 10, then it is GOOD, with a weight of "1."
(vi) Else, it is BAD, with a weight of "0."
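The heading derivation and weighting just described can be sketched in code. Equation (1) did not survive extraction, so the forward-azimuth function below uses the standard great-circle initial-bearing formula, which matches the textual description; the weight function transcribes the combined FIX/HDOP table, with the HDOP ranges read as the classes defined above (our interpretation). All names are illustrative, not the authors' code.

```cpp
#include <cmath>

const double kDeg = 3.14159265358979323846 / 180.0;  // degrees -> radians

// Initial bearing (forward azimuth) between two positions, in degrees [0, 360).
double forwardAzimuth(double lat1, double lon1, double lat2, double lon2) {
    double dLon = (lon2 - lon1) * kDeg;
    double y = std::sin(dLon) * std::cos(lat2 * kDeg);
    double x = std::cos(lat1 * kDeg) * std::sin(lat2 * kDeg)
             - std::sin(lat1 * kDeg) * std::cos(lat2 * kDeg) * std::cos(dLon);
    double deg = std::atan2(y, x) / kDeg;
    return deg < 0.0 ? deg + 360.0 : deg;
}

// Weight from the combined FIX/HDOP table above; "HDOP is 1 to 2" etc. are
// read as the HDOP classes (<= 2, <= 5, <= 10).
int cnwWeight(int fix, double hdop) {
    bool diffFix = (fix >= 2 && fix <= 5);   // DGPS/PPS/RTK-class fixes
    if (diffFix  && hdop <= 2.0)  return 3;  // IDEAL
    if (fix == 1 && hdop <= 2.0)  return 2;  // EXCELLENT
    if (diffFix  && hdop <= 5.0)  return 2;  // EXCELLENT
    if (fix == 1 && hdop <= 5.0)  return 1;  // GOOD
    if (diffFix  && hdop <= 10.0) return 1;  // GOOD
    return 0;                                // BAD
}
```

Note that a production implementation would also validate the HDOP lower bound (HDOP > 0) before classifying.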

Journal of Sensors
The resulting weights are then used as inputs along with the GPS and EC heading values into (2) in order to solve for the CnW-II heading.
where n is the number of GPS devices, idealValue is the weight value given for IDEAL, h_fused is the fused heading from the electronic compass and GPS, w_allGPS is the weight assigned to the calculated GPS heading, w_EC is the weight assigned to the electronic compass, h_allGPS is the fused calculated heading for all the n GPS devices, w_i is the weight value given to the ith GPS, and h_i is the derived heading of the ith GPS; h_fused is assigned as GPSEC_yaw.

2.4. Extended Kalman Filter. The extended Kalman filter is utilized to derive the state estimate through the following general steps: (i) predicting the state and error covariance, (ii) deriving the Kalman gain, (iii) finding the time update of the estimate, and (iv) solving for the time update of the error covariance. The observability and controllability of the Kalman filter or extended Kalman filter have been extensively studied and proved, as can be read in the works of Trzuskowsky et al. [16], Bustamante et al. [15], and Simonetti et al. [14].
The nonlinear dynamic and measurement models are given as

$$x_{k+1} = f(x_k) + w_k, \qquad z_k = h(x_k) + v_k,$$

where $x_k$ and $z_k$ are the state estimate and measurement, respectively, and $R_k$ and $Q_k$ are the measurement noise and process noise covariance matrices, assumed positive definite. It is the nonlinearity of the system that motivated the choice of the EKF for this work. The predicted state $\hat{x}^-_{k+1}$ and the error covariance of the predicted state $P^-_{k+1}$ are derived as

$$\hat{x}^-_{k+1} = f(\hat{x}_k), \qquad P^-_{k+1} = F_k P_k F_k^T + Q_k,$$

where $F_k$ is the Jacobian matrix of the nonlinear dynamic model and $P_k$ is the updated state covariance matrix for the previous time step $k$. The predicted measurement is $\hat{z}^-_{k+1} = h(\hat{x}^-_{k+1})$. The innovation covariance $P^{vv}_{k+1}$ of the residual error between the predicted and observed measurements is derived through

$$P^{vv}_{k+1} = P^{yy}_{k+1} + R_{k+1},$$

where $P^{yy}_{k+1} = H_{k+1} P^-_{k+1} H^T_{k+1}$ is the output covariance, $H_{k+1}$ is the Jacobian of the measurement function $h$ evaluated about the state prediction $\hat{x}^-_{k+1}$, and $R_{k+1}$ is the measurement noise covariance of the sensor at time $k+1$.
The Kalman gain is now derived through

$$K_{k+1} = P^{xz}_{k+1} \left(P^{vv}_{k+1}\right)^{-1},$$

that is, the product of the predicted cross-correlation matrix $P^{xz}_{k+1}$ with the inverse of the innovation covariance matrix. The state distribution in the EKF algorithm is approximated by a Gaussian random variable that is propagated through first-order linearization of the nonlinear functions.
The state vector $x_k$ of the model is composed of the latitude, longitude, and heading as elements, $x_k = [x_N, y_E, \psi]^T$. The measurement model $h(x_k)$ maps this state to the measured values for estimating the state (latitude, longitude, and heading angle), which come from three individual GPS devices and an EC. The process noise covariance ($Q$) and measurement noise covariance ($R$) values are set depending on the measured values.
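The predict/update sequence above can be condensed into code. The actual filter has the 3-element state $[x_N, y_E, \psi]$ with matrix-valued $F$, $H$, $P$, $Q$, and $R$; to keep the sketch short and self-contained, a single-state version with illustrative identity models is shown here, not the paper's implementation.

```cpp
#include <cmath>

// Scalar EKF illustrating the predict/update equations above; f and h are
// example identity models (constant heading, direct measurement).
struct ScalarEKF {
    double x, P, Q, R;
    // Predict: x- = f(x), P- = F P F^T + Q, with F = df/dx.
    void predict() {
        double F = 1.0;              // f(x) = x, so the Jacobian is 1
        P = F * P * F + Q;           // x is unchanged by identity dynamics
    }
    // Update: K = P- H^T (H P- H^T + R)^-1, x = x- + K(z - h(x-)), P = (1 - K H) P-.
    void update(double z) {
        double H = 1.0;              // h(x) = x, direct measurement
        double Pvv = H * P * H + R;  // innovation covariance P^vv
        double K = P * H / Pvv;      // Kalman gain
        x = x + K * (z - H * x);     // state update with the innovation
        P = (1.0 - K * H) * P;       // covariance update
    }
};
```

With the full 3-state filter, the same four lines become matrix operations, which is exactly what the TinyEKF library used in Section 3 provides.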

Design and Implementation
The system was designed and programmed using RBCDWBPA (Rapid By-Customer Demand with Business Process Approach) in tandem with the SFA (Systems Features Analysis) development method on a 64-bit desktop computer running the Ubuntu Linux operating system, using the Qt-anywhere 4.8.5 open-source version. The system was initially developed on the desktop Linux box and then cross-compiled on the same machine for deployment to an ARM-based embedded board. It is not possible to connect the EC we used directly to the serial port of either the Linux box or the embedded board, since the high-power output of the EC's serial port would overload the board's serial port. The EC was therefore connected to a simple Arduino-based system that was in turn polled by the Linux box or the embedded board.
3.1. Data and Devices. Three UIGGUB02-R001 receivers from u-blox were utilized as inexpensive GPS receivers and were polled using the NMEA protocol. The GPS data utilized in the sensor fusion process are the timestamp (time when the fix was taken), latitude, longitude, and COG (course over ground), all extracted from the RMC (Recommended Minimum) sentence. The HDOP (horizontal dilution of precision) and FIX (type of fix) are extracted from the GGA sentence, which is the source of essential fix information. The EC used in this work was a Devantech Ltd. SEN0183 CMPS11 tilt-compensated model, while the IMU was an EBIMU-9DOFV3 heading and attitude referencing system. Two embedded boards were utilized to demonstrate the generic design of the real-time sensor fusion system: a FreeScale i.MX6 and a BeagleBone Black (BBB).
A touchscreen with a cape for the BeagleBone Black from Waveshare was used as the display, while the FreeScale board was connected to an HDMI-compatible display device. All the devices used in the implementation of the real-time sensor fusion system are shown in Figure 4.
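The RMC/GGA extraction described above amounts to splitting comma-delimited NMEA sentences and converting the ddmm.mmmm coordinate format to decimal degrees. A minimal sketch is shown below (checksum validation omitted); the function names are illustrative, not the authors' code.

```cpp
#include <cmath>
#include <sstream>
#include <string>
#include <vector>

// Split a comma-delimited NMEA sentence into fields.
std::vector<std::string> nmeaFields(const std::string& sentence) {
    std::vector<std::string> out;
    std::stringstream ss(sentence);
    std::string field;
    while (std::getline(ss, field, ',')) out.push_back(field);
    return out;
}

// Convert NMEA ddmm.mmmm latitude/longitude to signed decimal degrees.
double nmeaToDegrees(const std::string& value, const std::string& hemisphere) {
    double raw = std::stod(value);
    double deg = std::floor(raw / 100.0);            // degrees part
    double dec = deg + (raw - deg * 100.0) / 60.0;   // plus minutes / 60
    return (hemisphere == "S" || hemisphere == "W") ? -dec : dec;
}
```

For an RMC sentence such as `$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,,`, field 2 is the validity flag, fields 3-6 give the position, and field 8 is the COG.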

3.2. Setting Up the System for Programming and Deployment.
There are several steps needed to set up and configure the desktop system for designing and programming the sensor fusion system, as well as for cross-compiling and then deploying it on an embedded board such as the BBB. These important configurations and settings are broken down into four major substeps as follows: (i) Configure TSLIB for Qt-desktop and Qt-arm on a freshly installed Ubuntu 15.04-LTS.
(iii) Configure fresh BBB for deployment.
(iv) Configure Qt-Creator for writing and deploying the program (both desktop and cross-compile versions).
The Qt utilized in this work is the Qt-4.8.5-anywhere open-source version, while the toolchain used for compiling the fusion algorithm into an ARM-compatible binary is gcc-linaro-arm-linux-gnueabihf. TSLIB was also utilized to enable the touchscreen mode of the LCD display connected to the BBB.
A detailed step-by-step procedure for the given configuration settings is provided in the Appendix. These steps are the result of various experiments and tests done so that the system configuration and settings needed to design, develop, and deploy a system onto an embedded board can be easily replicated.

3.3. TinyEKF Library.
The TinyEKF library by Simon D. Levy, available from GitHub (https://github.com/simondlevy/TinyEKF), was chosen to implement the EKF on the embedded board system. It is a straightforward C/C++ implementation of the EKF that is general enough for use on STM32, Arduino, and other microcontrollers. It is especially suitable because it uses static (compile-time) memory allocation instead of "malloc" or "new." Some customizations were done on the TinyEKF library so it could be integrated with our sensor fusion system. The first was to make sure that the library would be recognized and compiled in a Qt environment: the sources were initially converted to .cpp files, and additional Qt-related header files were included, especially "math.h" and "qmath.h." For the tinyekf.cpp source file, the following changes were implemented: (i) Comment out the typedef struct ekf_t declaration.
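As a sketch of how the converted library might be pulled into the Qt build, the project's .pro file would gain entries along these lines; the directory layout and file names here are hypothetical aside from tinyekf.cpp, which is named in the text.

```
# Hypothetical .pro additions for the converted TinyEKF sources
INCLUDEPATH += $$PWD/tinyekf
HEADERS     += tinyekf/tiny_ekf.h
SOURCES     += tinyekf/tinyekf.cpp
```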

3.4. Operating System Configuration.
It is recommended to assign persistent names to all sensor devices so that the compiled system can be uploaded to similar embedded boards without setting up and configuring each fusion system individually, as long as the same embedded board systems are used. This avoids the problem of setting the correct serial port for each specific sensor, that is, GPS, EC, and IMU. The Ubuntu Linux operating system usually assigns the next available serial port device name to the currently connected serial device, for example, "/dev/ttyACM0," "/dev/ttyACM1," "/dev/ttyACM2," "/dev/ttyACM3," and "/dev/ttyUSB0" for the GPS1, GPS2, GPS3, EC, and IMU devices, respectively. A problem arises when a serial device is disconnected and reconnected, since there is then a distinct probability that it will be assigned a different available serial device name, such as "/dev/ttyACM4," "/dev/ttyACM5," or "/dev/ttyACM6." When this happens, the correct information from the specified sensor cannot be retrieved, for example, if the serial port device name in the fusion system for GPS1 is set as "/dev/ttyACM0" but the device is now recognized as "/dev/ttyACM4" by the embedded board system. This situation is addressed by assigning persistent names to the sensor devices connected via the serial port so that, even if the sensors are disconnected and reconnected, the necessary information can be extracted from the correct sensor. The steps for setting up persistent names in the Ubuntu Linux system, applicable to both desktop and embedded board systems, are as follows.

(1) Open a terminal and run the command "udevadm info -a -n /dev/ttyACM0."

3.5. Multithreading. The multithreading capability of the Ubuntu system and the embedded boards was harnessed to simultaneously retrieve data from the various sensor devices and to run the individual processes such as CnW-I, FAz, CnW-II, and EKF. There are a total of eight (8) threads running simultaneously; these threads and their specific jobs are described as follows:
(1) Thread 1 is the main thread of the fusion system.
(3) Thread 3 polls the data and performs CnW-I and then FAz on GPS2.
(4) Thread 4 polls the data and performs CnW-I and then FAz on GPS3.
(5) Thread 5 polls the data from EC.
(8) Thread 8 performs EKF on the data from GPS1 (those from GPS2 or GPS3 can also be used).
This multithreading design and the interconnection between the threads are graphically represented in Figure 6.
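The thread layout can be sketched as follows. std::thread is used here only to keep the illustration self-contained; the actual system uses Qt's threading facilities, and all names and the single-pass thread bodies are illustrative.

```cpp
#include <atomic>
#include <mutex>
#include <thread>

// Shared store written by the poller threads and read by the fusion thread.
struct SharedHeadings {
    std::mutex m;
    double gps1 = 0, ec = 0, fused = 0;
};

// Role of threads 2-4: poll a GPS, run CnW-I and FAz, publish the heading.
void gpsPoller(SharedHeadings& s, std::atomic<bool>& run) {
    while (run) {
        std::lock_guard<std::mutex> lk(s.m);
        s.gps1 = 90.0;       // stand-in for a freshly derived heading
        break;               // single pass so the sketch terminates
    }
}

// Role of threads 7-8: fuse the published headings (CnW-II, then EKF).
void fuser(SharedHeadings& s, std::atomic<bool>& run) {
    while (run) {
        std::lock_guard<std::mutex> lk(s.m);
        s.fused = 0.5 * (s.gps1 + s.ec);   // placeholder fusion step
        break;
    }
}
```

In the real system each poller runs continuously at its sensor's rate (10 Hz for GPS/EC, 100 Hz for IMU), so the shared data must be protected exactly as the mutex does here.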

The Resulting Real-Time Sensor Fusion System
The sensor fusion was successfully implemented in real time on a desktop Linux system; the result is shown in Figure 7.
The same system was then cross-compiled in Qt for deployment on an ARM-based FreeScale board as well as on a BBB.
The fusion system performed as desired, with the display shown on the LCD (Figure 8) and the output monitored through a remote login console on the embedded board (Figure 9). The main button for starting and stopping the whole sensor fusion process is labeled "Start Sensor Fusion" and switches to "Stop Sensor Fusion" when the fusion process is running. The developed sensor fusion system also allows the user to control the polling of data from each of the sensors (start/stop GPS1, start/stop GPS2, start/stop GPS3, start/stop EC1, and start/stop IMU1). This was done to make it possible to test the performance of the system by artificially introducing a loss of data from one or more of the sensors. It is also possible to start/stop the CnW-I and CnW-II processes as well as start the EKF process.
The system kept running and still performed sensor fusion correctly when the sensor devices were disconnected and then reconnected, or even when the serial ports they were attached to were switched. This is the main reason why it is important to assign persistent names in the Ubuntu Linux systems of the embedded boards. This was tested on both the FreeScale board and the BBB with the same program deployed to them.
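Persistent naming of this kind is typically done with a udev rules file; a sketch is given below, where the vendor/product IDs and symlink names are illustrative only and must be taken from the "udevadm info" output for the actual devices.

```
# /etc/udev/rules.d/99-fusion-sensors.rules (illustrative values)
SUBSYSTEM=="tty", ATTRS{idVendor}=="1546", ATTRS{idProduct}=="01a7", SYMLINK+="gps1"
SUBSYSTEM=="tty", ATTRS{idVendor}=="2341", SYMLINK+="compass"
```

The fusion program then opens "/dev/gps1" and "/dev/compass" regardless of the order in which the kernel enumerates the serial ports.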

Conclusion
The real-time implementation of classification and weighing plus extended Kalman filter sensor fusion for the derivation of a more accurate heading has been described, from the theoretical background of the sensor fusion algorithm to the setup and configuration of the desktop system and the design and development using Qt-anywhere. The target embedded boards were set up with the Ubuntu Linux system and then configured with the necessary libraries and drivers to run the fusion system. The real-time sensor fusion system performed as designed, and the configuration steps given in this paper allowed the same system to be deployed on an embedded board with very minimal changes. Future work should focus on adding a covariance intersection algorithm to the real-time system to further improve its accuracy.
(ii) Click "Build & Run" on the left window pane.
(iii) Select the "Qt Versions" tab of the "Build & Run" window pane.
(iv) Select the "Manual" option and then click the "Add" button on the right. Browse to the location of the Qt-for-embedded compiled earlier; in our case it was "/usr/local/Qt-4.8.5-arm/bin/qmake." Press the "Open" button or double-click the "qmake" file in the browser window. (** There will be a warning: "No compiler can produce code for this Qt version. Please define one or more compilers.")
(v) Leave "Version Name" as it is and then click the "Apply" button.
* Configure the "Build & Run" > "Compilers" option:
(i) Go to the "Tools" -> "Options" menu.
(ii) Click "Build & Run" on the left window pane.
(ii) Click "Build & Run" on the left window pane.
(ii) Click "Devices" on the left window pane.
(iii) Click "Add," select "Generic Linux Device," and then click "Start Wizard." (vi) Click the "Finish" button. (** The connection to the device will be tested; press the "Close" button.)
(ii) Click "Build & Run" on the left window pane.
(iii) Select the "Kits" tab of the "Build & Run" window pane.

Figure 3: Graphical representation of the operational steps.

Figure 7: Real-time on a desktop.

Figure 8: Program display on the embedded board.

Figure 9: Fused heading and position on the embedded board.