EGS-SLAM: RGB-D Gaussian Splatting SLAM With Events

  • Siyu Chen
  • Shenghai Yuan
  • Thien Minh Nguyen
  • Zhuyu Huang
  • Chenyang Shi
  • Jing Jin
  • Lihua Xie*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Gaussian Splatting SLAM (GS-SLAM) offers a notable improvement over traditional SLAM methods by enabling photorealistic 3D reconstruction that conventional approaches often struggle to achieve. However, existing GS-SLAM systems perform poorly under the persistent and severe motion blur commonly encountered in real-world scenarios, leading to significantly degraded tracking accuracy and compromised 3D reconstruction quality. To address this limitation, we propose EGS-SLAM, a novel GS-SLAM framework that fuses event data with RGB-D inputs to simultaneously reduce motion blur in images and compensate for the sparse, discrete nature of event streams, enabling robust tracking and high-fidelity 3DGS reconstruction. Specifically, our system explicitly models the camera's continuous trajectory during exposure, supporting event- and blur-aware tracking and mapping on a unified 3DGS scene. Furthermore, we introduce a learnable camera response function to align the dynamic ranges of events and images, along with a no-event loss to suppress ringing artifacts during reconstruction. We validate our approach on a new dataset comprising synthetic and real-world sequences with significant motion blur. Extensive experimental results demonstrate that EGS-SLAM consistently outperforms existing GS-SLAM systems in both trajectory accuracy and photorealistic 3DGS reconstruction.
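The abstract's core idea, relating sharp latent intensities to a blurry observation via the event stream and aligning the two modalities with a parametric camera response function (CRF), can be sketched for a single pixel. This is a minimal illustration, not EGS-SLAM's actual method: the EDI-style event integration, the toy gamma-form CRF `crf`, and the grid-search fit `fit_gamma` (a stand-in for the paper's learnable CRF) are all illustrative assumptions.

```python
import math

# Assumed event contrast threshold c: each event changes log intensity by +/- c.
CONTRAST = 0.2

def latent_log_intensity(log_i0, events, t):
    """Log intensity at time t: log L(t) = log L(t0) + c * sum of polarities up to t.

    `events` is a list of (timestamp, polarity) pairs with polarity in {-1, +1}.
    """
    return log_i0 + CONTRAST * sum(p for (ts, p) in events if ts <= t)

def blurry_pixel(log_i0, events, exposure_ts):
    """Blurry observation = average of latent intensities sampled over the exposure."""
    vals = [math.exp(latent_log_intensity(log_i0, events, t)) for t in exposure_ts]
    return sum(vals) / len(vals)

def crf(x, gamma):
    """Toy gamma-style CRF mapping linear irradiance to image intensity."""
    return x ** gamma

def fit_gamma(irradiance, observed, candidates):
    """Pick the CRF parameter that best aligns event-derived irradiance with
    observed image intensities (a crude stand-in for a learnable CRF)."""
    return min(
        candidates,
        key=lambda g: sum((crf(x, g) - y) ** 2 for x, y in zip(irradiance, observed)),
    )

# Usage: with no events during exposure, the pixel is sharp (no blur), and the
# grid search recovers the gamma that generated the observations.
sharp = blurry_pixel(0.0, [], [0.0, 0.5, 1.0])          # exp(0) averaged -> 1.0
obs = [x ** 0.5 for x in (0.1, 0.4, 0.9)]
best = fit_gamma((0.1, 0.4, 0.9), obs, [0.25, 0.5, 1.0, 2.0])  # -> 0.5
```

In the paper this alignment is learned jointly with tracking and mapping over the full image; the grid search above merely shows why a shared response model is needed before event-derived and frame-derived intensities can be compared in one loss.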

Original language: English
Pages (from-to): 10003-10010
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 10
Issue number: 10
DOIs
State: Published - 2025

Keywords

  • Gaussian splatting
  • SLAM
  • event camera
