Integrated vision/inertial navigation method of UAVs in indoor environment

Research output: Contribution to journal › Article › peer-review

Abstract

A new integrated navigation method based on an inertial sensor, optical flow, and visual odometry is proposed for indoor self-navigation of UAVs in GPS-denied environments. An ORB-feature optical-flow method is also proposed for estimating the real-time three-axis velocity of the UAV. The algorithm improves the traditional pyramid Lucas-Kanade method by using sparse optical flow based on feature points, and the tracking of feature points is made more accurate by applying forward-backward tracking and random sample consensus (RANSAC) strategies. For position estimation, an integrated vision/inertial visual odometry method is adopted, which fuses artificial-icon measurements, visual optical-flow information, and inertial navigation data. Finally, the velocity and position estimates of the proposed method are validated in actual flight tests by comparison with velocity measurements from a PX4Flow module and a Guidance module and with position information from a motion capture system.
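The forward-backward tracking strategy mentioned in the abstract can be sketched in a few lines: features are tracked from frame t to frame t+1 and then back again, and tracks whose round trip does not return close to the starting point are rejected. The sketch below is a minimal NumPy illustration, not the paper's implementation; in practice the forward and backward passes would come from a pyramid Lucas-Kanade tracker (e.g. OpenCV's `cv2.calcOpticalFlowPyrLK`), and the function name `fb_filter` and the 1-pixel threshold are illustrative assumptions.

```python
import numpy as np

def fb_filter(pts0, pts1_fwd, pts0_bwd, max_fb_err=1.0):
    """Keep only tracks whose forward-backward error is small.

    pts0     : (N, 2) feature points in frame t
    pts1_fwd : (N, 2) the same points tracked forward to frame t+1
    pts0_bwd : (N, 2) pts1_fwd tracked backward again to frame t
    """
    # Forward-backward error: how far the round trip lands from the start.
    fb_err = np.linalg.norm(pts0 - pts0_bwd, axis=1)
    keep = fb_err < max_fb_err
    return pts0[keep], pts1_fwd[keep], keep

# Toy example: three tracks, one of which drifts on the backward pass.
pts0 = np.array([[10.0, 10.0], [50.0, 20.0], [30.0, 40.0]])
pts1_fwd = pts0 + np.array([2.0, 0.5])   # all tracked forward by the same flow
pts0_bwd = pts0.copy()
pts0_bwd[1] += 5.0                       # bad track: backward pass misses start
good0, good1, keep = fb_filter(pts0, pts1_fwd, pts0_bwd)
```

The surviving point pairs would then be passed to a RANSAC step (e.g. a homography or fundamental-matrix fit) to remove any remaining outliers before the flow field is converted into a velocity estimate.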

Original language: English
Pages (from-to): 176-186
Number of pages: 11
Journal: Beijing Hangkong Hangtian Daxue Xuebao/Journal of Beijing University of Aeronautics and Astronautics
Volume: 44
Issue number: 1
DOIs
State: Published - Jan 2018

Keywords

  • Multi-sensor fusion
  • ORB features
  • Optical flow
  • UAV
  • Vision navigation

