Abstract
Visual-inertial odometry is a key technology for autonomous robot localization. As an asynchronous vision sensor, the event camera is complementary to the traditional frame-based camera. For scenes with low illumination, high dynamic range, or high-speed motion, the output of the event camera is fused with traditional images. A real-time visual-inertial odometry using point and line features is proposed, combined with an inertial measurement unit (IMU). An algorithm for generating an event image from the event stream is proposed, a point-line feature detection method combined with events is designed, and a back-end sliding-window optimization algorithm is designed based on the idea of visual-inertial tight coupling. Dataset tests and a UAV flight test are conducted. The results on the dataset show that, compared with a visual-inertial odometry using point and line features on traditional images only, the proposed odometry reduces the positioning error by more than 22% on average in high-speed-motion scenes, and by more than 59% on average in scenes with low illumination and high dynamic range.
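The abstract mentions generating an event image from the asynchronous event stream. A common baseline for this step is to accumulate signed event polarities per pixel over a fixed time window; the sketch below illustrates that idea only, and is not the paper's specific algorithm (the function name and event tuple layout are assumptions for illustration).

```python
# Hypothetical sketch: build a simple event image by summing signed
# polarities per pixel over a time window [t_start, t_end). This is a
# generic baseline, not the exact event-image method of the paper.

def events_to_image(events, width, height, t_start, t_end):
    """events: iterable of (x, y, t, polarity) with polarity in {-1, +1}."""
    img = [[0] * width for _ in range(height)]
    for x, y, t, p in events:
        # Keep only events inside the time window and the sensor frame.
        if t_start <= t < t_end and 0 <= x < width and 0 <= y < height:
            img[y][x] += p  # positive and negative events cancel out
    return img

# Example: three events inside the window, one outside it.
evts = [(1, 0, 0.01, +1), (1, 0, 0.02, +1), (2, 1, 0.03, -1), (0, 0, 0.2, +1)]
image = events_to_image(evts, width=3, height=2, t_start=0.0, t_end=0.1)
```

The resulting 2D array can then be treated like a conventional intensity image, so that standard point and line feature detectors can be applied to it alongside the frame-based image.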
| Translated title of the contribution | Event-combined Visual-inertial Odometry Using Point and Line Features |
|---|---|
| Original language | Chinese (Traditional) |
| Pages (from-to) | 3926-3937 |
| Number of pages | 12 |
| Journal | Binggong Xuebao/Acta Armamentarii |
| Volume | 45 |
| Issue number | 11 |
| DOIs | |
| State | Published - 30 Nov 2024 |