TY - GEN
T1 - Computer vision based general object following for GPS-denied multirotor unmanned vehicles
AU - Pestana, Jesus
AU - Sanchez-Lopez, Jose Luis
AU - Saripalli, Srikanth
AU - Campoy, Pascual
PY - 2014
Y1 - 2014
N2 - The motivation of this research is to show that vision-based object tracking and following are reliable using a cheap GPS-denied multirotor platform such as the AR Drone 2.0. Our architecture allows the user to specify an object in the image that the robot has to follow from an approximately constant distance. At the current stage of our development, in the event of image tracking loss the system starts to hover and waits for image tracking recovery or a second detection, which requires the use of odometry measurements for self-stabilization. During the following task, our software uses the forward-facing camera images and part of the IMU data to compute the references for the four on-board low-level control loops. To obtain stronger wind-disturbance rejection and improved navigation performance, a yaw heading reference based on the IMU data is internally kept and updated by our control algorithm. We validate the architecture using an AR Drone 2.0 and the OpenTLD tracker in outdoor suburban areas. The experimental tests have shown robustness against wind perturbations, target occlusion, and illumination changes, as well as the system's capability to track a great variety of objects present in suburban areas, for instance: walking or running people, windows, AC units, static and moving cars, and plants.
KW - Object Following
KW - Quadrotor Control
KW - UAV vision-based control
KW - Visual Servoing
UR - http://www.scopus.com/inward/record.url?scp=84905717349&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84905717349&partnerID=8YFLogxK
U2 - 10.1109/ACC.2014.6858831
DO - 10.1109/ACC.2014.6858831
M3 - Conference contribution
AN - SCOPUS:84905717349
SN - 9781479932726
T3 - Proceedings of the American Control Conference
SP - 1886
EP - 1891
BT - 2014 American Control Conference, ACC 2014
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2014 American Control Conference, ACC 2014
Y2 - 4 June 2014 through 6 June 2014
ER -