Research on Improving the Robustness and Positioning Accuracy of Visual SLAM Based on Point-Line Feature Matching

  • Ming Li
  • Shumin Gu*
  • Xinghua Hou
*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

To address the failure of SLAM algorithms in low-texture environments, where point features alone provide insufficient information, an indoor visual/inertial localization algorithm based on point-line feature matching is proposed. The LSD line feature extraction algorithm is optimized, visual and IMU data are tightly fused, and an adaptation factor is introduced into the optimization objective to reduce processing time. The method reduces the absolute trajectory error and improves positioning accuracy and system robustness.
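As an illustration of the kind of point-line front end the abstract describes, the following Python sketch (an assumption for illustration, not the authors' implementation) extracts ORB point features together with LSD line segments using OpenCV. The function name, parameter values, and the length threshold are hypothetical choices, and cv2.createLineSegmentDetector is assumed to be available (it is present in recent OpenCV builds, e.g. 4.5.4 and later).

# Illustrative sketch only: point + line feature extraction for a point-line SLAM front end.
import cv2
import numpy as np

def extract_point_line_features(gray: np.ndarray):
    """Detect ORB point features and LSD line segments in a grayscale frame."""
    # Point features: ORB keypoints and descriptors.
    orb = cv2.ORB_create(nfeatures=500)
    keypoints, descriptors = orb.detectAndCompute(gray, None)

    # Line features: LSD (Line Segment Detector).
    lsd = cv2.createLineSegmentDetector()
    lines, _, _, _ = lsd.detect(gray)  # lines: N x 1 x 4 array of (x1, y1, x2, y2)

    # Discard very short segments, which are unstable to match across frames
    # (the 30-pixel threshold is an arbitrary illustrative value).
    if lines is not None:
        lengths = np.hypot(lines[:, 0, 2] - lines[:, 0, 0],
                           lines[:, 0, 3] - lines[:, 0, 1])
        lines = lines[lengths > 30.0]

    return keypoints, descriptors, lines

if __name__ == "__main__":
    frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
    kps, desc, segs = extract_point_line_features(frame)
    print(f"{len(kps)} point features, {0 if segs is None else len(segs)} line segments")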

Original language: English
Title of host publication: 2024 6th International Academic Exchange Conference on Science and Technology Innovation, IAECST 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1151-1154
Number of pages: 4
ISBN (Electronic): 9798331507138
DOIs
State: Published - 2024
Event: 6th International Academic Exchange Conference on Science and Technology Innovation, IAECST 2024 - Hybrid, Guangzhou, China
Duration: 6 Dec 2024 - 8 Dec 2024

Publication series

Name: 2024 6th International Academic Exchange Conference on Science and Technology Innovation, IAECST 2024

Conference

Conference: 6th International Academic Exchange Conference on Science and Technology Innovation, IAECST 2024
Country/Territory: China
City: Hybrid, Guangzhou
Period: 6/12/24 - 8/12/24

Keywords

  • point and line features
  • simultaneous localization and map building
  • tight coupling
  • visual inertia fusion
