
Dynamic RGB-D visual odometry

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The aim of this paper is to estimate the ego-motion of an RGB-D camera in dynamic environments. A semi-direct motion estimation pipeline is adapted to the RGB-D camera. To avoid the impact of dynamic objects, a new mapping method based on a scoring mechanism is proposed, which effectively removes feature points on dynamic objects and yields a map containing only static points. The method is evaluated not only on the TUM RGB-D benchmark but also with an Asus Xtion Pro Live camera in a dynamic office environment. The experimental results show that our method achieves higher accuracy in dynamic environments and comparable accuracy in static environments. In some highly dynamic scenes, the accuracy of our method is more than 7 times higher than that of other RGB-D visual odometry algorithms.
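The scoring-based removal of dynamic feature points described above can be sketched roughly as follows. The update rule, gain/penalty values, and threshold here are illustrative assumptions, not the paper's exact formulation: the idea is that map points whose reprojection is consistent with the estimated camera motion gain score, while inconsistent points (likely on moving objects) are penalized and eventually dropped, leaving only static points in the map.

```python
import numpy as np

def update_point_scores(points, scores, reproj_errors,
                        err_thresh=3.0, gain=1.0, penalty=2.0,
                        drop_below=0.0):
    """Score-based filtering of map points (illustrative sketch).

    points        : (N, 3) array of map-point positions
    scores        : (N,) running score per point
    reproj_errors : (N,) reprojection error of each point in the
                    current frame, in pixels

    Points consistent with the estimated camera motion (small
    reprojection error) gain score; inconsistent points, which are
    likely on dynamic objects, are penalized. Points whose score
    falls to the threshold are removed from the map.
    """
    consistent = reproj_errors < err_thresh
    scores = scores + np.where(consistent, gain, -penalty)
    keep = scores > drop_below
    return points[keep], scores[keep]

# Example: two static points (small error) survive, two points on a
# moving object (large error) are penalized below the threshold.
pts = np.random.rand(4, 3)
scr = np.ones(4)
err = np.array([0.5, 10.0, 1.0, 8.0])
pts, scr = update_point_scores(pts, scr, err)
```

In a full pipeline this update would run once per frame, so a point on a temporarily occluded static surface can recover its score, while a persistently inconsistent point is culled within a few frames.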

Original language: English
Title of host publication: 2017 IEEE International Conference on Robotics and Biomimetics, ROBIO 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1-6
Number of pages: 6
ISBN (Electronic): 9781538637418
DOIs
State: Published - 2 Jul 2017
Event: 2017 IEEE International Conference on Robotics and Biomimetics, ROBIO 2017 - Macau, China
Duration: 5 Dec 2017 – 8 Dec 2017

Publication series

Name: 2017 IEEE International Conference on Robotics and Biomimetics, ROBIO 2017
Volume: 2018-January

Conference

Conference: 2017 IEEE International Conference on Robotics and Biomimetics, ROBIO 2017
Country/Territory: China
City: Macau
Period: 5/12/17 – 8/12/17

Keywords

  • RGB-D camera
  • accuracy
  • dynamic environments
  • visual odometry

