MLIOM-AB: Multi-LiDAR-Inertial-Odometry and Mapping for Autonomous Buses

Research output: Contribution to journal › Article › peer-review

Abstract

Light detection and ranging (LiDAR) is a critical sensor for autonomous driving, providing precise 3-D perception of the environment. However, large vehicles such as autonomous buses face unique challenges with single-LiDAR systems, including a limited field of view (FOV) and sparse point cloud data. This article introduces a multi-LiDAR-inertial-wheel odometry and mapping system to overcome these challenges. The proposed system employs a multisensor fusion method that integrates data from multiple LiDARs, an inertial measurement unit (IMU), and wheel encoders. We use B-spline curves for IMU data interpolation and an error state Kalman filter (ESKF) for state estimation, significantly enhancing vehicle localization accuracy and robustness. To validate our approach, we present a comprehensive multisensor simultaneous localization and mapping (SLAM) dataset built from real-world data collected by autonomous buses in urban environments. Experimental results show that our method achieves state-of-the-art localization accuracy and reliability in complex environments. Across all sequences, the root mean square (rms) of the absolute translational error (ATE) for our method is 1.92 m, approximately 80% lower than that of the second-best method (10.21 m). The generated high-precision point cloud maps demonstrate the system's potential for autonomous driving applications, improving perception and navigation.
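The abstract names two core techniques: B-spline interpolation of IMU data (e.g., to evaluate motion at arbitrary LiDAR point timestamps) and an ESKF for state estimation. The sketch below is not the authors' implementation; it is a minimal illustration under assumed rates, noise values, and a toy one-dimensional heading error state, intended only to show how spline-interpolated IMU data and an error-state correction step fit together.

```python
# Hypothetical sketch (not the paper's code): cubic B-spline interpolation of
# IMU yaw-rate samples to per-point LiDAR timestamps, followed by a toy
# error-state Kalman filter (ESKF) update of a single heading-error state.
# All rates, noise values, and signal shapes are illustrative assumptions.
import numpy as np
from scipy.interpolate import make_interp_spline

# Assumed raw IMU gyro stream (z-axis) at ~100 Hz over one LiDAR sweep.
imu_t = np.linspace(0.0, 0.1, 11)                 # IMU timestamps [s]
imu_gyro_z = 0.2 + 0.05 * np.sin(20.0 * imu_t)    # yaw rate [rad/s] (synthetic)

# Cubic B-spline through the IMU samples; evaluate at arbitrary point times.
spline = make_interp_spline(imu_t, imu_gyro_z, k=3)
point_t = np.sort(np.random.uniform(0.0, 0.1, 50))   # per-point LiDAR stamps
gyro_at_points = spline(point_t)                      # interpolated yaw rates

# Toy ESKF: nominal yaw integrated from interpolated gyro, scalar error state.
dt = np.diff(point_t, prepend=point_t[0])
yaw_nom = np.cumsum(dt * gyro_at_points)   # nominal heading [rad]
P = 1e-4    # error-state covariance (assumed)
Q = 1e-6    # process noise per step (assumed)
R = 1e-3    # wheel-encoder yaw "measurement" noise (assumed)

for k in range(len(point_t)):
    # Predict: error mean stays zero, covariance grows with process noise.
    P += Q
    # Update with a hypothetical wheel-odometry heading observation.
    yaw_meas = yaw_nom[k] + np.random.normal(0.0, np.sqrt(R))
    K = P / (P + R)                       # Kalman gain (scalar case)
    delta = K * (yaw_meas - yaw_nom[k])   # estimated heading error
    yaw_nom[k:] += delta                  # inject correction into nominal state
    P = (1.0 - K) * P                     # covariance update

print(f"final yaw estimate: {yaw_nom[-1]:.4f} rad, covariance: {P:.2e}")
```

In the full system described by the abstract, the state would include position, velocity, orientation, and IMU biases, and the ESKF updates would come from multi-LiDAR registration and wheel-encoder measurements rather than the synthetic observation used here.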

Original language: English
Pages (from-to): 28036-28048
Number of pages: 13
Journal: IEEE Sensors Journal
Volume: 24
Issue number: 17
DOIs
State: Published - 2024

Keywords

  • LiDAR odometry
  • light detection and ranging (LiDAR)
  • simultaneous localization and mapping (SLAM)
