Video activity recognition with varying rhythms

Bulent Ayhan, Chiman Kwan, Bence Budavari, Jude Larkin, David Gribben, Baoxin Li

Research output: Contribution to journal › Article › peer-review


Abstract

Recognizing normal and anomalous events in long and complex videos with multiple sub-activities has received considerable attention in recent years. This task is more challenging than traditional action recognition in short and relatively homogeneous video clips. Beyond the difficulty of recognizing activities in long videos, a further challenge is varying activity rhythms: the rhythm of sub-actions within an activity can differ in nature, which can further degrade the performance of activity recognition methods. In this article, five video activity recognition methods were evaluated on two publicly available datasets of long and complex videos, Breakfast and VIRAT. Extensive experiments and analyses showed that, among these methods, VideoGraph performed distinctly better than the others and maintained high accuracy even when the test videos were exposed to severe rhythm changes. The results indicate that VideoGraph is less sensitive to varying rhythms than the other investigated methods. By changing some of its architecture parameters, we also observed further performance improvements in VideoGraph.
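The abstract does not spell out how rhythm changes are imposed on the test videos; a common way to simulate them is to replay different sub-action segments at different speeds via non-uniform temporal resampling of frames. The sketch below illustrates that idea only; the function name, its parameters, and the resampling scheme are assumptions for illustration, not the paper's protocol.

```python
import numpy as np

def resample_frames(frames, speed_factors, segment_bounds):
    """Simulate a rhythm change by playing each segment of a video at a
    different speed (hypothetical pre-processing, not the paper's exact
    protocol).

    frames         : list of frames, length T
    speed_factors  : per-segment playback speeds, e.g. [0.5, 2.0, 1.0]
    segment_bounds : frame indices delimiting the segments,
                     e.g. [0, 40, 90, len(frames)]
    """
    out = []
    segments = zip(segment_bounds[:-1], segment_bounds[1:])
    for (start, end), speed in zip(segments, speed_factors):
        n_src = end - start
        # A speed of 2.0 keeps roughly half the frames (fast sub-action);
        # a speed of 0.5 repeats frames (slow sub-action).
        n_out = max(1, int(round(n_src / speed)))
        idx = np.linspace(start, end - 1, n_out).round().astype(int)
        out.extend(frames[i] for i in idx)
    return out
```

For instance, `speed_factors=[0.5, 2.0, 1.0]` would stretch the first sub-action, compress the second, and leave the third unchanged, altering the rhythm of the activity while preserving its overall content.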

Original language: English (US)
Pages (from-to): 191997-192008
Number of pages: 12
Journal: IEEE Access
Volume: 8
DOIs
State: Published - 2020

Keywords

  • Activity recognition
  • Event recognition
  • Human action recognition
  • Surveillance
  • Varying rhythm
  • Video

ASJC Scopus subject areas

  • General Computer Science
  • General Materials Science
  • General Engineering
