Abstract
A new integrated navigation method based on inertial sensors, optical flow, and visual odometry is proposed for autonomous indoor navigation of UAVs in GPS-denied environments. An ORB-based optical flow method is also proposed for estimating the real-time three-axis velocity of the UAV. The algorithm improves on the traditional pyramid Lucas-Kanade method by using sparse optical flow based on feature points, and makes feature-point tracking more accurate by applying forward-backward tracking and random sample consensus (RANSAC) strategies. For position estimation, a visual odometry method with integrated vision/inertial navigation is adopted, which combines an artificial-marker method, visual optical flow information, and inertial navigation data. Finally, the velocity and position estimates of the proposed method are validated in actual flight tests: velocities are compared against measurements from a PX4Flow module and a Guidance module, and positions against a motion capture system.
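The two feature-tracking safeguards named in the abstract can be sketched in isolation. The snippet below is a minimal pure-Python illustration, not the paper's implementation: the ORB detector and pyramid Lucas-Kanade tracker are stood in for by synthetic point tracks, and `backward_track`, the sample matches, and all thresholds are assumptions. It shows (1) the forward-backward consistency check, which discards a feature whose backward track does not return near its starting point, and (2) a RANSAC-style estimate of the dominant image translation that rejects outlier flow vectors.

```python
import math
import random

def forward_backward_filter(matches, backward_track, max_fb_error=1.0):
    """Keep matches (p, q) whose backward track of q returns close to p.

    `backward_track` maps a point in frame t+1 back to frame t; a large
    round-trip error marks an unreliable feature track.
    """
    kept = []
    for p, q in matches:
        p_back = backward_track(q)
        fb_err = math.hypot(p_back[0] - p[0], p_back[1] - p[1])
        if fb_err <= max_fb_error:
            kept.append((p, q))
    return kept

def ransac_translation(matches, iters=200, inlier_tol=2.0, seed=42):
    """Estimate a dominant 2D image translation from point matches via RANSAC."""
    rng = random.Random(seed)
    best = []
    for _ in range(iters):
        p, q = rng.choice(matches)          # minimal sample: one match
        dx, dy = q[0] - p[0], q[1] - p[1]   # candidate translation
        inliers = [(a, b) for a, b in matches
                   if math.hypot(b[0] - a[0] - dx, b[1] - a[1] - dy) <= inlier_tol]
        if len(inliers) > len(best):
            best = inliers
    # Refine on the inlier set: mean displacement.
    dx = sum(b[0] - a[0] for a, b in best) / len(best)
    dy = sum(b[1] - a[1] for a, b in best) / len(best)
    return (dx, dy), best

# Synthetic forward tracks: true flow is (+3, +1) pixels, plus two bad tracks.
good = [((x, y), (x + 3.0, y + 1.0)) for x, y in [(0, 0), (10, 5), (20, 20), (7, 3)]]
bad = [((5, 5), (50.0, 50.0)), ((2, 8), (2.0, 30.0))]
matches = good + bad

# Simulated backward tracker: good tracks round-trip exactly, bad ones drift.
def backward_track(q):
    return (q[0] - 3.0, q[1] - 1.0) if q[0] < 40 and q[1] < 25 else q

filtered = forward_backward_filter(matches, backward_track)
(dx, dy), inliers = ransac_translation(filtered)
print(len(filtered), dx, dy)  # → 4 3.0 1.0
```

In a real pipeline the matches would come from ORB keypoints tracked by pyramid Lucas-Kanade, and the recovered image translation, scaled by height and camera intrinsics, feeds the velocity estimate.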
| Original language | English |
|---|---|
| Pages (from-to) | 176-186 |
| Number of pages | 11 |
| Journal | Beijing Hangkong Hangtian Daxue Xuebao/Journal of Beijing University of Aeronautics and Astronautics |
| Volume | 44 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 2018 |
Keywords
- Multi-sensor fusion
- ORB features
- Optical flow
- UAV
- Vision navigation
Title: Integrated vision/inertial navigation method of UAVs in indoor environment