
NID-SLAM: Neural Implicit Representation-based RGB-D SLAM in Dynamic Environments

  • Ziheng Xu
  • Jianwei Niu
  • Qingfeng Li
  • Tao Ren*
  • Chen Chen
  • *Corresponding author of this work
  • Beihang University
  • CAS - Institute of Software

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Neural implicit representations have been explored to enhance visual SLAM algorithms, especially in providing high-fidelity dense maps. Existing methods operate robustly in static scenes but struggle with the disruption caused by moving objects. In this paper, we present NID-SLAM, which significantly improves the performance of neural SLAM in dynamic environments. We propose a new approach to enhance inaccurate regions in semantic masks, particularly in marginal areas. Utilizing the geometric information present in depth images, this method enables accurate removal of dynamic objects, thereby reducing the probability of camera drift. Additionally, we introduce a keyframe selection strategy for dynamic scenes, which enhances camera tracking robustness against large-scale objects and improves the efficiency of mapping. Experiments on publicly available RGB-D datasets demonstrate that our method outperforms competitive neural SLAM approaches in tracking accuracy and mapping quality in dynamic environments.
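The depth-guided refinement of semantic masks described above can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, parameters, and the specific rule (grow the mask into neighbouring pixels whose depth is close to the masked object's median depth) are assumptions made for exposition.

```python
import numpy as np

def refine_mask(mask, depth, depth_tol=0.1, iterations=2):
    """Hypothetical sketch: grow a binary semantic mask into marginal
    pixels whose depth is close to the masked object's median depth,
    so inaccurate mask borders better cover the dynamic object."""
    mask = mask.astype(bool)
    obj_depth = np.median(depth[mask])  # representative object depth
    for _ in range(iterations):
        # 4-neighbour dilation via shifted copies of the mask
        grown = mask.copy()
        grown[1:, :] |= mask[:-1, :]
        grown[:-1, :] |= mask[1:, :]
        grown[:, 1:] |= mask[:, :-1]
        grown[:, :-1] |= mask[:, 1:]
        # keep newly added pixels only if their depth matches the object
        newly = grown & ~mask
        keep = newly & (np.abs(depth - obj_depth) < depth_tol)
        mask |= keep
    return mask

# Toy example: a 3x3 object at depth 1.0 on a background at depth 3.0,
# with an initial mask that only covers part of the object.
depth = np.full((5, 5), 3.0)
depth[1:4, 1:4] = 1.0
mask = np.zeros((5, 5), dtype=bool)
mask[1:3, 1:3] = True
refined = refine_mask(mask, depth)  # now covers the full 3x3 object
```

The depth test is what keeps the dilation from bleeding into the static background: marginal pixels are absorbed only when their geometry agrees with the object, which is the intuition behind using depth images to correct mask boundaries.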

Original language: English
Title of host publication: 2024 IEEE International Conference on Multimedia and Expo, ICME 2024
Publisher: IEEE Computer Society
ISBN (Electronic): 9798350390155
DOI
Publication status: Published - 2024
Event: 2024 IEEE International Conference on Multimedia and Expo, ICME 2024 - Niagara Falls, Canada
Duration: 15 Jul 2024 → 19 Jul 2024

Publication series

Name: Proceedings - IEEE International Conference on Multimedia and Expo
ISSN (Print): 1945-7871
ISSN (Electronic): 1945-788X

Conference

Conference: 2024 IEEE International Conference on Multimedia and Expo, ICME 2024
Country/Territory: Canada
City: Niagara Falls
Period: 15/07/24 → 19/07/24
