Vision based collaborative localization for swarms of aerial vehicles

Sai Vemprala, Srikanth Saripalli

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

We present a framework for localizing a swarm of multirotor micro aerial vehicles (MAV) through collaboration using vision based sensing. For MAVs equipped with monocular cameras, this technique, built upon a relative pose estimation strategy between two or more cameras, enables the MAVs to share information of a common map and thus estimate accurate metric poses between each other even through fast motion and changing environments. Synchronized feature detection, matching and robust tracking enable the use of multiple view geometry concepts for performing the estimation. Furthermore, we present the implementation details of this technique followed by a set of results which involves evaluation of the accuracy of the pose estimates through test cases in both simulated and real experiments. Our test cases involve a group of quadrotors in simulation, as well as real world flight tests with two MAVs.
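
The technique described above builds on two-view relative pose estimation between monocular cameras. As a minimal sketch of that building block (not the authors' implementation), the Python/OpenCV snippet below matches ORB features across one synchronized image pair from two MAVs and recovers their relative rotation and translation from the essential matrix. The intrinsics matrix K, the image file names, and the relative_pose helper are illustrative assumptions, and the translation comes out only up to scale, so the metric scale recovery and shared-map collaboration described in the paper are not shown.

# Illustrative sketch only: two-view relative pose between two monocular
# cameras observing a common scene, using ORB features, RANSAC-based
# essential matrix estimation, and pose recovery. Assumes synchronized,
# undistorted images and known, identical intrinsics K.
import cv2
import numpy as np

def relative_pose(img_a, img_b, K):
    """Estimate rotation R and unit-scale translation t of camera B
    relative to camera A from one synchronized image pair."""
    orb = cv2.ORB_create(2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    # Brute-force Hamming matching with cross-check to reject weak matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

    # RANSAC-based essential matrix estimation discards outlier matches.
    E, inliers = cv2.findEssentialMat(pts_a, pts_b, K,
                                      method=cv2.RANSAC, threshold=1.0)
    # Decompose E, keeping the (R, t) pair that places triangulated points
    # in front of both cameras. t is only defined up to scale here.
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=inliers)
    return R, t

if __name__ == "__main__":
    # Hypothetical intrinsics and image files, for illustration only.
    K = np.array([[500.0, 0.0, 320.0],
                  [0.0, 500.0, 240.0],
                  [0.0, 0.0, 1.0]])
    img_a = cv2.imread("mav_a.png", cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread("mav_b.png", cv2.IMREAD_GRAYSCALE)
    R, t = relative_pose(img_a, img_b, K)
    print("Relative rotation:\n", R)
    print("Relative translation (up to scale):\n", t)

In the paper's setting, the up-to-scale translation would still need a common metric reference (for example, the shared map mentioned in the abstract) before the MAVs can exchange metric poses; the sketch stops at the relative geometry step.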

Original language: English (US)
Pages (from-to): 2980-2985
Number of pages: 6
Journal: Annual Forum Proceedings - AHS International
State: Published - 2017

Fingerprint

Micro air vehicle (MAV)
Antennas
Cameras
Geometry
Experiments

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Vision based collaborative localization for swarms of aerial vehicles. / Vemprala, Sai; Saripalli, Srikanth.

In: Annual Forum Proceedings - AHS International, 2017, p. 2980-2985.

Research output: Contribution to journal › Article

@article{2793bf65f02641baae42a4e3284791af,
title = "Vision based collaborative localization for swarms of aerial vehicles",
abstract = "We present a framework for localizing a swarm of multirotor micro aerial vehicles (MAV) through collaboration using vision based sensing. For MAVs equipped with monocular cameras, this technique, built upon a relative pose estimation strategy between two or more cameras, enables the MAVs to share information of a common map and thus estimate accurate metric poses between each other even through fast motion and changing environments. Synchronized feature detection, matching and robust tracking enable the use of multiple view geometry concepts for performing the estimation. Furthermore, we present the implementation details of this technique followed by a set of results which involves evaluation of the accuracy of the pose estimates through test cases in both simulated and real experiments. Our test cases involve a group of quadrotors in simulation, as well as real world flight tests with two MAVs.",
author = "Sai Vemprala and Srikanth Saripalli",
year = "2017",
language = "English (US)",
pages = "2980--2985",
journal = "Annual Forum Proceedings - AHS International",
issn = "1552-2938",
publisher = "American Helicopter Society",

}
