Automated multiple target detection and tracking in UAV videos

Hongwei Mao, Chenhui Yang, Glen P. Abousleman, Jennie Si

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

14 Citations (Scopus)

Abstract

In this paper, a novel system is presented to detect and track multiple targets in unmanned air vehicle (UAV) video sequences. Since the output of the system is based on target motion, we first segment foreground moving areas from the background in each video frame using background subtraction. To stabilize the video, a multi-point-descriptor-based image registration method is performed, in which a projective model describes the global transformation between frames. For each detected foreground blob, an object model describes its appearance and motion. Rather than immediately classifying the detected objects as targets, we track them for a period of time, and only those with qualified motion patterns are labeled as targets. In the subsequent tracking process, a Kalman filter is assigned to each tracked target to dynamically estimate its position in each frame. Blobs detected at a later time are used as observations to update the states of the tracked targets with which they are associated. The proposed overlap-rate-based data association method accounts for splitting and merging of the observations, and therefore maintains tracks more consistently. Experimental results demonstrate that the system performs well on real-world UAV video sequences. Moreover, the careful design of each system component makes the proposed system feasible for real-time applications.
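
The abstract outlines the detection front end: a projective (homography) registration step stabilizes each frame pair before background subtraction isolates independently moving blobs. The snippet below is a minimal sketch of that idea, assuming OpenCV; the ORB descriptors, RANSAC threshold, difference threshold, and minimum blob area are illustrative stand-ins, not the paper's multi-point descriptor or actual settings.

```python
# Sketch only: projective stabilization + background subtraction, in the spirit
# of the pipeline the abstract describes. ORB and all thresholds are assumptions.
import cv2
import numpy as np

def detect_moving_blobs(prev_gray, curr_gray, min_area=50):
    """Register prev_gray to curr_gray with a homography, then difference."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # Projective model for the global (camera-induced) motion between frames.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    h, w = curr_gray.shape
    stabilized_prev = cv2.warpPerspective(prev_gray, H, (w, h))

    # Background subtraction on the stabilized pair: residual differences are
    # candidate foreground (independently moving) regions.
    diff = cv2.absdiff(curr_gray, stabilized_prev)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

    # Group foreground pixels into blobs and return their bounding boxes.
    num, labels, stats, centroids = cv2.connectedComponentsWithStats(mask, connectivity=8)
    return [tuple(stats[i, :4]) for i in range(1, num)
            if stats[i, cv2.CC_STAT_AREA] >= min_area]  # (x, y, width, height)
```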
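For the tracking stage, the abstract assigns a Kalman filter to each confirmed target and associates later detections with tracks by an overlap rate that tolerates blob splitting and merging. The sketch below shows a constant-velocity Kalman filter and a greedy overlap-based association step; the state layout, noise covariances, the use of intersection-over-union as the overlap measure, and the one-to-one matching are assumptions rather than the authors' exact formulation.

```python
# Sketch only: per-target constant-velocity Kalman filter plus a greedy
# overlap-based association step. Noise levels and IoU-as-overlap are assumptions.
import numpy as np

class KalmanTrack:
    """Tracks the (cx, cy) center of one target with a constant-velocity model."""
    def __init__(self, cx, cy, dt=1.0):
        self.x = np.array([cx, cy, 0.0, 0.0])          # state: [cx, cy, vx, vy]
        self.P = np.eye(4) * 10.0                      # state covariance
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float) # constant-velocity motion model
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float) # only position is observed
        self.Q = np.eye(4) * 0.01                      # process noise (assumed)
        self.R = np.eye(2) * 1.0                       # measurement noise (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        """z is the (cx, cy) center of the blob associated with this track."""
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

def overlap_rate(a, b):
    """IoU of two (x, y, w, h) boxes, used here as a stand-in overlap rate."""
    iw = max(0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    ih = max(0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def associate(predicted_boxes, detected_boxes, min_overlap=0.3):
    """Greedy one-to-one matching by overlap. Predicted boxes would pair each
    track's predicted center with its last known blob size."""
    pairs, used = [], set()
    for i, pb in enumerate(predicted_boxes):
        scores = [(overlap_rate(pb, db), j) for j, db in enumerate(detected_boxes)
                  if j not in used]
        if scores:
            best, j = max(scores)
            if best >= min_overlap:
                pairs.append((i, j))
                used.add(j)
    return pairs  # list of (track_index, detection_index)
```

A fuller implementation along the lines of the paper would let one track claim several blobs (or several tracks share one blob) so that splits and merges of the observations do not break a track.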

Original language: English (US)
Title of host publication: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 7668
ISBN (Print): 9780819481320
DOIs: https://doi.org/10.1117/12.849739
State: Published - 2010
Event: Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications VII - Orlando, FL, United States
Duration: Apr 7 2010 - Apr 8 2010

Other

Other: Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications VII
Country: United States
City: Orlando, FL
Period: 4/7/10 - 4/8/10

Keywords

  • data association
  • Kalman filter
  • target detection
  • target tracking
  • UAV

ASJC Scopus subject areas

  • Applied Mathematics
  • Computer Science Applications
  • Electrical and Electronic Engineering
  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics

Cite this

Mao, H., Yang, C., Abousleman, G. P., & Si, J. (2010). Automated multiple target detection and tracking in UAV videos. In Proceedings of SPIE - The International Society for Optical Engineering (Vol. 7668). [76680J] https://doi.org/10.1117/12.849739
