├── pic
│   ├── sensors.jpeg
│   ├── vehicle.jpeg
│   ├── 3d_2d_sensor.jpg
│   ├── cam_pixels.png
│   ├── lidar_points.png
│   ├── lidar_imu_calib.png
│   └── sensor-assembling.jpeg
└── README.md
/pic/sensors.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/zouyajing/step-by-step-multi-sensor-fusion/HEAD/pic/sensors.jpeg
--------------------------------------------------------------------------------
/pic/vehicle.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/zouyajing/step-by-step-multi-sensor-fusion/HEAD/pic/vehicle.jpeg
--------------------------------------------------------------------------------
/pic/3d_2d_sensor.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/zouyajing/step-by-step-multi-sensor-fusion/HEAD/pic/3d_2d_sensor.jpg
--------------------------------------------------------------------------------
/pic/cam_pixels.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/zouyajing/step-by-step-multi-sensor-fusion/HEAD/pic/cam_pixels.png
--------------------------------------------------------------------------------
/pic/lidar_points.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/zouyajing/step-by-step-multi-sensor-fusion/HEAD/pic/lidar_points.png
--------------------------------------------------------------------------------
/pic/lidar_imu_calib.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/zouyajing/step-by-step-multi-sensor-fusion/HEAD/pic/lidar_imu_calib.png
--------------------------------------------------------------------------------
/pic/sensor-assembling.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/zouyajing/step-by-step-multi-sensor-fusion/HEAD/pic/sensor-assembling.jpeg
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# step-by-step-multi-sensor-fusion

The aim of this project is to build a multi-sensor localization and mapping system. Its contents cover the following topics:
- `sensor assembling`
- `sensor testing`
- `synchronization and calibration`
- `dataset recording`
- `main code of the multi-sensor localization and mapping system`

## sensor assembling

Four different sensors are utilized:
- `GNSS receiver` ublox M8T
- `IMU` xsens-mti-g-710
- `camera` realsense D435
- `Lidar` VLP-16

`xsens-mti-g-710` is a GNSS/IMU sensor, which outputs both GNSS positions and IMU data. `realsense D435` is a stereo camera, which outputs one RGB image, two IR images and one depth image at the same time, but only one RGB or IR image will be used in this project.

The above sensors are assembled on an aluminum plate with screws.



Their 2D (left) and 3D (right) body coordinate systems are approximately:


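In ROS, such approximate frame relationships can be published as static TF transforms so the sensor frames are visible in rviz. The snippet below is only a minimal sketch: the frame names and the translation/rotation values are placeholders, not the calibrated extrinsics (those are derived in the calibration section below).

```
#!/usr/bin/env python
# Minimal sketch: publish an approximate lidar -> camera extrinsic as a static TF.
# Frame names and numeric values are placeholders, not calibrated values.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node('static_extrinsics')
broadcaster = tf2_ros.StaticTransformBroadcaster()

t = TransformStamped()
t.header.stamp = rospy.Time.now()
t.header.frame_id = 'velodyne'            # lidar body frame (placeholder name)
t.child_frame_id = 'camera_color_frame'   # camera frame (placeholder name)
t.transform.translation.x = 0.04          # placeholder offsets in meters
t.transform.translation.y = -0.14
t.transform.translation.z = -0.03
t.transform.rotation.w = 1.0              # identity rotation as a placeholder

broadcaster.sendTransform(t)
rospy.spin()
```
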
## sensor testing

The ROS drivers of the utilized sensors are installed and tested under the operating system `Ubuntu 18.04 + ROS melodic`.
- [ublox driver](https://github.com/HKUST-Aerial-Robotics/ublox_driver)
- [xsens mti driver](http://wiki.ros.org/xsens_mti_driver)
- [velodyne lidar driver](http://wiki.ros.org/velodyne/Tutorials/Getting%20Started%20with%20the%20Velodyne%20VLP16)
- [realsense camera driver](https://github.com/IntelRealSense/realsense-ros)

There is also another [ublox ROS driver](https://github.com/KumarRobotics/ublox) maintained by [KumarRobotics](https://github.com/KumarRobotics), which is more popular. The output topics from these two drivers are different, but both can be converted to `RINEX` files easily. `xsens-mti-g-710` can provide GNSS positions, but its raw GNSS measurements are not available, so `ublox M8T` is utilized.

## sensor calibration

The calibration includes two parts:
- `time synchronization`
- `space calibration`

Only a coarse time synchronization is performed to sync the clock between the computer and the GNSS receiver by
```
sudo su
source /opt/ros/melodic/setup.bash
source ${YOUR_CATKIN_WORKSPACE}/devel/setup.bash
rosrun ublox_driver sync_system_time
```
The next step is to use the PPS signal from the GNSS receiver to trigger all other sensors.

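Since only the host computer clock is adjusted, it is worth checking how well the sensor streams line up afterwards. Below is a minimal Python sketch (not part of the original tooling) that prints the header-stamp gap between the latest IMU and lidar messages, using the topic names from the recording step later in this README.

```
#!/usr/bin/env python
# Rough sketch: print the header-stamp gap between the latest IMU message and
# each incoming lidar scan, to eyeball the quality of the coarse time sync.
import rospy
from sensor_msgs.msg import Imu, PointCloud2

last_imu_stamp = [None]   # mutable container so the callbacks can share it

def imu_cb(msg):
    last_imu_stamp[0] = msg.header.stamp

def lidar_cb(msg):
    if last_imu_stamp[0] is not None:
        gap = (msg.header.stamp - last_imu_stamp[0]).to_sec()
        rospy.loginfo('lidar - imu stamp gap: %.4f s', gap)

rospy.init_node('sync_check')
rospy.Subscriber('/imu/data', Imu, imu_cb)
rospy.Subscriber('/velodyne_points', PointCloud2, lidar_cb)
rospy.spin()
```
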
The space transformation between the lidar and the camera is computed by a 3D-2D PnP algorithm implemented by [heethesh](https://github.com/heethesh/lidar_camera_calibration). Two example pictures of manually picking 2D pixels and 3D points are shown below.




The RMSE and the transformation matrix are:

```
RMSE of the 3D-2D reprojection errors: 1.194 pixels.
T_cam_lidar:[
-0.05431967, -0.99849605, 0.0074169, 0.04265499,
0.06971394, -0.01120208, -0.99750413, -0.14234957,
0.99608701, -0.05366703, 0.07021759, -0.03513243,
0, 0, 0, 1]
```
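
For reference, the core of this step can be reproduced with OpenCV's `solvePnP`. The sketch below is an assumption-laden illustration, not the tool's actual code: the files holding the manually picked correspondences are hypothetical, and the intrinsics are the ones listed later in this README.

```
# Sketch of the 3D-2D PnP step. The two input files holding the manually
# picked correspondences are hypothetical placeholders.
import cv2
import numpy as np

K = np.array([[611.916727, 0.0, 322.654269],
              [0.0, 612.763638, 244.282743],
              [0.0, 0.0, 1.0]])
D = np.array([0.128050, -0.258338, -0.000188, -0.000001, 0.000000])

pts3d = np.loadtxt('picked_lidar_points.txt', dtype=np.float32)  # Nx3, lidar frame
pts2d = np.loadtxt('picked_pixels.txt', dtype=np.float32)        # Nx2, image pixels

ok, rvec, tvec = cv2.solvePnP(pts3d, pts2d, K, D)
R, _ = cv2.Rodrigues(rvec)              # 3x3 rotation, lidar -> camera
T_cam_lidar = np.eye(4)
T_cam_lidar[:3, :3] = R
T_cam_lidar[:3, 3] = tvec.ravel()

# Reprojection RMSE in pixels, the same metric as reported above.
proj, _ = cv2.projectPoints(pts3d, rvec, tvec, K, D)
rmse = np.sqrt(np.mean(np.sum((proj.reshape(-1, 2) - pts2d) ** 2, axis=1)))
print('RMSE: %.3f pixels' % rmse)
```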

The space transformation between the lidar and the IMU is computed by (a) hand-eye calibration and (b) batch optimization, implemented by [APRIL-ZJU](https://github.com/APRIL-ZJU/lidar_IMU_calib). One example calibration image is shown below.




The transformation matrix and time offset are:
```
T_imu_lidar:[
-0.9935967, 0.1120969, -0.0141367, -0.276365,
-0.1121103, -0.9936957, 0.0001571, -0.0448481,
-0.0140300, 0.0017409, 0.9999000, 0.155901,
0, 0, 0, 1]
time offset: -0.015
```
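
Conceptually, the hand-eye stage (a) solves, for each pair of relative rotations `q_imu` (from IMU integration) and `q_lidar` (from lidar odometry) over the same time interval, the constraint `q_imu ⊗ q = q ⊗ q_lidar`; stacking these constraints gives a linear system solved by SVD. Below is a minimal numpy sketch of this rotation-only initialization, assuming the paired relative rotations are already available as `(w, x, y, z)` quaternions; it is not APRIL-ZJU's actual implementation.

```
# Sketch of quaternion hand-eye rotation alignment: solve q from the stacked
# constraints (L(q_imu) - R(q_lidar)) q = 0 via SVD. Quaternions are (w, x, y, z).
import numpy as np

def left_mat(q):   # L(q1) q2 = q1 * q2 (Hamilton product)
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def right_mat(q):  # R(q2) q1 = q1 * q2
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def hand_eye_rotation(imu_rel_quats, lidar_rel_quats):
    A = np.vstack([left_mat(qa) - right_mat(qb)
                   for qa, qb in zip(imu_rel_quats, lidar_rel_quats)])
    _, _, vt = np.linalg.svd(A)
    q = vt[-1]                       # right singular vector of smallest singular value
    return q / np.linalg.norm(q)     # unit quaternion for R_imu_lidar
```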

Also, the intrinsic calibration parameters of the color camera inside the realsense D435 are

```
height: 480
width: 640
distortion_model: "plumb_bob"
D: [0.128050, -0.258338, -0.000188, -0.000001, 0.000000]
K: [611.916727, 0.0, 322.654269, 0.0, 612.763638, 244.282743, 0.0, 0.0, 1.0]
```
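
These parameters follow the standard pinhole + `plumb_bob` model, so they can be used directly with OpenCV, e.g. to undistort the color images. A minimal sketch (the image filenames are hypothetical):

```
# Undistort a D435 color image with the intrinsics above (plumb_bob model).
import cv2
import numpy as np

K = np.array([[611.916727, 0.0, 322.654269],
              [0.0, 612.763638, 244.282743],
              [0.0, 0.0, 1.0]])
D = np.array([0.128050, -0.258338, -0.000188, -0.000001, 0.000000])

img = cv2.imread('color_frame.png')           # hypothetical 640x480 sample frame
undistorted = cv2.undistort(img, K, D)
cv2.imwrite('color_frame_undistorted.png', undistorted)
```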

The extrinsic matrix between the IMU and the camera is computed by:
```
T_imu_cam = T_imu_lidar * T_cam_lidar.inverse() =
[-0.0580613, -0.0564218, -0.996717, -0.316937,
0.986113, 0.0187904, 0.165011, -0.0784387,
0.00643999, -0.998402, 0.056142, 0.0154766,
0, 0, 0, 1]
```
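
This chain can be checked numerically by composing the two matrices reported above:

```
# Compose T_imu_cam = T_imu_lidar * T_cam_lidar^-1 from the calibrated matrices.
import numpy as np

T_cam_lidar = np.array([[-0.05431967, -0.99849605,  0.0074169,   0.04265499],
                        [ 0.06971394, -0.01120208, -0.99750413, -0.14234957],
                        [ 0.99608701, -0.05366703,  0.07021759, -0.03513243],
                        [ 0.0,         0.0,         0.0,         1.0]])
T_imu_lidar = np.array([[-0.9935967,   0.1120969,  -0.0141367,  -0.276365],
                        [-0.1121103,  -0.9936957,   0.0001571,  -0.0448481],
                        [-0.0140300,   0.0017409,   0.9999000,   0.155901],
                        [ 0.0,         0.0,         0.0,         1.0]])

T_imu_cam = T_imu_lidar.dot(np.linalg.inv(T_cam_lidar))
print(np.round(T_imu_cam, 6))   # matches the T_imu_cam listed above
```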

## dataset recording

Before mounting the sensor platform on a real vehicle, I put it on a trolley to collect datasets with my laptop. The test dataset is stored on [google drive](https://drive.google.com/file/d/1J6ti1XpPPSNJgayWC-zmRKHtfiP7bKc1/view?usp=sharing).
The steps to collect the datasets are:
```
1. open one terminal to launch VLP-16
source ${YOUR_CATKIN_WORKSPACE}/devel/setup.bash
roslaunch velodyne_pointcloud VLP16_points.launch
2. open one terminal to launch the realsense D435 camera
source ${YOUR_CATKIN_WORKSPACE}/devel/setup.bash
roslaunch realsense2_camera rs_camera.launch
3. open one terminal to launch xsens-mti-g-710
source ${YOUR_CATKIN_WORKSPACE}/devel/setup.bash
roslaunch xsens_mti_driver xsens_mti_node.launch
4. open one terminal to launch ublox M8T
source ${YOUR_CATKIN_WORKSPACE}/devel/setup.bash
roslaunch ublox_driver ublox_driver.launch
5. wait until the ublox output is stable and then sync the time
sudo su
source /opt/ros/melodic/setup.bash
source ${YOUR_CATKIN_WORKSPACE}/devel/setup.bash
rosrun ublox_driver sync_system_time
6. use [rviz] or [rostopic echo] to check the relevant messages; if all of them are valid, record the relevant topics
rosbag record /camera/color/image_raw /velodyne_points /gnss /filter/positionlla /filter/quaternion /imu/data /ublox_driver/ephem /ublox_driver/glo_ephem /ublox_driver/iono_params /ublox_driver/range_meas /ublox_driver/receiver_lla /ublox_driver/receiver_pvt /ublox_driver/time_pulse_info
```
These topics are:
- `/camera/color/image_raw` is the color image data from `realsense D435`.
- `/velodyne_points` is the lidar point cloud from `VLP-16`.
- `/imu/data` is the IMU data from `xsens-mti-g-710`.
- `/gnss` is the GNSS output from `xsens-mti-g-710`.
- `/filter/positionlla` is the filtered position from `xsens-mti-g-710`.
- `/filter/quaternion` is the filtered orientation from `xsens-mti-g-710`.
- the others are from `ublox M8T`.

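After recording, the bag can be sanity-checked with the `rosbag` Python API (or simply `rosbag info` on the command line). A small sketch, assuming a hypothetical bag name `dataset.bag`:

```
# Sanity-check a recorded bag: list topics, message types, counts and rates.
import rosbag

bag = rosbag.Bag('dataset.bag')    # hypothetical bag filename
info = bag.get_type_and_topic_info()
for topic, data in info.topics.items():
    print('%-45s %-30s count=%-7d freq=%s' %
          (topic, data.msg_type, data.message_count, data.frequency))
bag.close()
```
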
## multiple sensor fusion

[multiple sensor fusion](https://github.com/zouyajing/multi_sensor_loclization_and_mapping)

The implementation is based on the KITTI dataset and still needs to be adapted to the self-collected dataset.
--------------------------------------------------------------------------------