Multi-camera Visual Odometry for Motion-Constrained Environments

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Intelligent mobile robots leverage visual odometry (or SLAM, simultaneous localization and mapping) techniques to achieve localization and planning in environments devoid of a priori map information. Employing multiple sensors can compensate for the shortcomings of individual sensors and thereby enhance the localization performance of the system, at the cost of added complexity in system design. In this paper, we propose a visual odometry method that utilizes multiple monocular cameras with non-overlapping fields of view (FoV) for a mobile robot constrained to move in a two-dimensional horizontal plane. Our approach maximizes the observations available to each monocular camera, resolves the scale ambiguity inherent in monocular methods, and introduces an optimization strategy for online extrinsic calibration tailored to motion-constrained scenarios.

Original language: English
Title of host publication: Advances in Guidance, Navigation and Control - Proceedings of 2024 International Conference on Guidance, Navigation and Control Volume 11
Editors: Liang Yan, Haibin Duan, Yimin Deng
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 34-44
Number of pages: 11
ISBN (Print): 9789819622399
DOIs
State: Published - 2025
Event: International Conference on Guidance, Navigation and Control, ICGNC 2024 - Changsha, China
Duration: 9 Aug 2024 – 11 Aug 2024

Publication series

Name: Lecture Notes in Electrical Engineering
Volume: 1347 LNEE
ISSN (Print): 1876-1100
ISSN (Electronic): 1876-1119

Conference

Conference: International Conference on Guidance, Navigation and Control, ICGNC 2024
Country/Territory: China
City: Changsha
Period: 9/08/24 – 11/08/24

Keywords

  • extrinsic calibration
  • multi-camera
  • visual odometry
