LiDAR-Inertial-Camera Calibration and SLAM (Tutor: Yunda SUN, WeChat: syd17801034524)

Introduction

The tasks of this project are as follows:

1>Calibrate the extrinsic parameters of the LiDAR-Inertial-Camera sensor system.

2>Based on the calibrated extrinsic parameters, achieve tightly coupled LiDAR-Inertial-Camera SLAM.

Requirements

Basic Requirements:

1> The designed calibration module must support targetless offline calibration (i.e., without a dedicated calibration target).

[Figure: point cloud projected onto the image before (Uncalibrated) and after (Calibrated) extrinsic calibration]
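Targetless calibration methods such as the one referenced below typically evaluate an extrinsic hypothesis by projecting LiDAR points into the image and scoring the alignment (e.g., of edges). The core operation is the LiDAR-to-pixel projection; a minimal sketch is below, assuming a pinhole camera with hypothetical intrinsics `K` and an extrinsic transform `(R_cl, t_cl)` from the LiDAR frame to the camera frame:

```python
import numpy as np

def project_lidar_to_image(points_lidar, R_cl, t_cl, K):
    """Project LiDAR points (N,3) into pixel coordinates using the
    LiDAR-to-camera extrinsics (R_cl, t_cl) and intrinsic matrix K.
    Points behind the camera (z <= 0) are discarded."""
    pts_cam = points_lidar @ R_cl.T + t_cl      # transform into camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 1e-6]     # keep points in front of camera
    uv_h = pts_cam @ K.T                        # homogeneous pixel coordinates
    return uv_h[:, :2] / uv_h[:, 2:3]           # normalize to (u, v)

# Hypothetical intrinsics and identity extrinsics for illustration.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
pts = np.array([[0.0, 0.0, 2.0]])               # a point 2 m straight ahead
uv = project_lidar_to_image(pts, np.eye(3), np.zeros(3), K)
# a point on the optical axis projects to the principal point (320, 240)
```

A targetless calibrator would wrap this projection in an optimization over `(R_cl, t_cl)` that maximizes an alignment score between projected LiDAR edge points and image edges.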
2> The designed SLAM system should tightly couple LiDAR point clouds, IMU motion measurements, and camera images for pose estimation.
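In a tightly coupled system, LiDAR and visual measurements constrain the same state inside a single estimator (e.g., one filter update or one factor-graph optimization), rather than being fused after separate odometries. A minimal sketch of this idea, with hypothetical residual functions stacking a LiDAR point-to-plane error and a visual reprojection error into one residual vector:

```python
import numpy as np

def point_to_plane_residual(p_world, plane_n, plane_d):
    """LiDAR residual: signed distance of a registered point to a map
    plane n·p + d = 0 (n assumed to be unit length)."""
    return np.array([plane_n @ p_world + plane_d])

def reprojection_residual(p_cam, K, uv_meas):
    """Visual residual: pixel error between a projected landmark and its
    measured feature location (pinhole model, intrinsics K)."""
    uv_h = K @ p_cam
    return uv_h[:2] / uv_h[2] - uv_meas

def stacked_residual(p_world, plane_n, plane_d, p_cam, K, uv_meas,
                     w_lidar=1.0, w_cam=1.0):
    """Tightly coupled update: both modalities contribute residuals on the
    same state, weighted by (hypothetical) scalar measurement weights."""
    r_l = w_lidar * point_to_plane_residual(p_world, plane_n, plane_d)
    r_c = w_cam * reprojection_residual(p_cam, K, uv_meas)
    return np.hstack([r_l, r_c])

# Consistent measurements yield a zero stacked residual (illustrative values).
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
r = stacked_residual(p_world=np.array([0.0, 0.0, 2.0]),
                     plane_n=np.array([0.0, 0.0, 1.0]), plane_d=-2.0,
                     p_cam=np.array([0.0, 0.0, 2.0]), K=K,
                     uv_meas=np.array([320.0, 240.0]))
# r is the zero vector [0, 0, 0]
```

In a real system (e.g., the R3LIVE-style package in the references), the estimator additionally propagates the state with IMU measurements between LiDAR/camera updates, which is what makes the coupling with inertial data tight as well.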

3> The localization and mapping results should be stable, with no significant drift.

Advanced Requirements:

4> Optimizing the calibration process to achieve automatic extrinsic calibration is encouraged.

5> Integrating the calibration module into the SLAM system to establish an automatic calibration-then-SLAM pipeline is encouraged.

6> Completing this project using self-collected data is encouraged.

Reference

LiDAR-Camera calibration: Pixel-level Extrinsic Self Calibration of High Resolution LiDAR and Camera in Targetless Environments.

LiDAR-Inertial-Camera SLAM: R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state estimation and mapping package.

Created on: Nov. 06, 2024