Combined visual and inertial navigation for an unmanned aerial vehicle

Jonathan Kelly, Srikanth Saripalli, Gaurav S. Sukhatme

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

38 Scopus citations

Abstract

We describe a UAV navigation system that combines stereo visual odometry with inertial measurements from an IMU. Our approach fuses the motion estimates from both sensors in an extended Kalman filter to determine vehicle position and attitude. We present results using data from a robotic helicopter, in which the visual and inertial system produced a final position estimate within 1% of the measured GPS position, over a flight distance of more than 400 meters. Our results show that the combination of visual and inertial sensing reduced overall positioning error by nearly an order of magnitude compared to visual odometry alone.
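The fusion scheme described in the abstract can be illustrated with a minimal extended Kalman filter that predicts with IMU accelerations and corrects with visual-odometry position measurements. This is a sketch under simplifying assumptions, not the authors' implementation: the state here is 1-D position and velocity only (the paper's filter also estimates attitude), and the noise values `q` and `r` are placeholders.

```python
import numpy as np

def ekf_predict(x, P, accel, dt, q=0.1):
    """Propagate a [position, velocity] state with a measured acceleration.

    Illustrative sketch: q is an assumed process-noise scale, not a value
    from the paper.
    """
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])        # constant-velocity transition model
    B = np.array([0.5 * dt**2, dt])   # maps acceleration into the state
    x = F @ x + B * accel
    Q = q * np.outer(B, B)            # process noise driven by accel noise
    P = F @ P @ F.T + Q
    return x, P

def ekf_update(x, P, z_pos, r=0.05):
    """Correct the state with a visual-odometry position measurement."""
    H = np.array([[1.0, 0.0]])        # we observe position only
    y = z_pos - H @ x                 # innovation
    S = H @ P @ H.T + r               # innovation covariance (scalar)
    K = P @ H.T / S                   # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Because the vision and inertial errors are largely independent, the filter's corrections bound the drift that pure visual odometry accumulates, which is the effect the abstract quantifies.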

Original language: English (US)
Title of host publication: Field and Service Robotics
Subtitle of host publication: Results of the 6th International Conference
Editors: Christian Laugier, Roland Siegwart
Pages: 255-264
Number of pages: 10
DOI: 10.1007/978-3-540-75404-6_24
State: Published - Aug 18 2008

Publication series

Name: Springer Tracts in Advanced Robotics
Volume: 42
ISSN (Print): 1610-7438
ISSN (Electronic): 1610-742X

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Artificial Intelligence


Cite this

    Kelly, J., Saripalli, S., & Sukhatme, G. S. (2008). Combined visual and inertial navigation for an unmanned aerial vehicle. In C. Laugier, & R. Siegwart (Eds.), Field and Service Robotics: Results of the 6th International Conference (pp. 255-264). (Springer Tracts in Advanced Robotics; Vol. 42). https://doi.org/10.1007/978-3-540-75404-6_24