TY - GEN
T1 - Inertial monocular visual odometry based on RUPF algorithm
AU - Hou, Juanrou
AU - Wang, Zhanqing
AU - Zhang, Yanshun
N1 - Publisher Copyright:
© 2019 Technical Committee on Control Theory, Chinese Association of Automation.
PY - 2019/7
Y1 - 2019/7
N2 - Accurate autonomous positioning of an unmanned vehicle is an important prerequisite for autonomous navigation. The combination of visual and inertial sensors is a cheap, compact and complementary solution. In this paper, to improve the positioning accuracy of a monocular vision/IMU integrated system, a visual inertial odometry based on the random sample consensus unscented Kalman particle filter (RUPF) is proposed. For feature point tracking, optical flow tracking is used to reduce the time required for feature matching. Meanwhile, a random sample consensus algorithm is used to eliminate mismatched feature points during three-view feature point tracking; the epipolar geometry constraint and the three constraints formed by the trifocal tensor are then integrated into the observation equations, and unscented particle filtering is used to fuse the IMU and monocular visual information. The fusion algorithm is verified on the KITTI driving dataset. The experimental results show that the visual inertial odometry based on the RUPF algorithm achieves accurate positioning, with the final positioning error controlled at around 0.19%.
AB - Accurate autonomous positioning of an unmanned vehicle is an important prerequisite for autonomous navigation. The combination of visual and inertial sensors is a cheap, compact and complementary solution. In this paper, to improve the positioning accuracy of a monocular vision/IMU integrated system, a visual inertial odometry based on the random sample consensus unscented Kalman particle filter (RUPF) is proposed. For feature point tracking, optical flow tracking is used to reduce the time required for feature matching. Meanwhile, a random sample consensus algorithm is used to eliminate mismatched feature points during three-view feature point tracking; the epipolar geometry constraint and the three constraints formed by the trifocal tensor are then integrated into the observation equations, and unscented particle filtering is used to fuse the IMU and monocular visual information. The fusion algorithm is verified on the KITTI driving dataset. The experimental results show that the visual inertial odometry based on the RUPF algorithm achieves accurate positioning, with the final positioning error controlled at around 0.19%.
KW - Inertial system
KW - Integrated navigation
KW - Multiple view geometry
KW - Unscented Kalman particle filter
KW - Visual odometry
UR - https://www.scopus.com/pages/publications/85074419468
U2 - 10.23919/ChiCC.2019.8865490
DO - 10.23919/ChiCC.2019.8865490
M3 - Conference contribution
AN - SCOPUS:85074419468
T3 - Chinese Control Conference, CCC
SP - 3885
EP - 3891
BT - Proceedings of the 38th Chinese Control Conference, CCC 2019
A2 - Fu, Minyue
A2 - Sun, Jian
PB - IEEE Computer Society
T2 - 38th Chinese Control Conference, CCC 2019
Y2 - 27 July 2019 through 30 July 2019
ER -