Detecting phone-related pedestrian distracted behaviours via a two-branch convolutional neural network

Humberto Saenz, Huiming Sun, Lingtao Wu, Xuesong Zhou, Hongkai Yu

Research output: Contribution to journal › Article › peer-review

Abstract

Distracted phone-use behaviours among pedestrians, such as Texting, Game Playing and Phone Calls, have caused increasing numbers of fatalities and injuries. However, phone-related distracted behaviour by pedestrians has not been systematically studied. Automatically detecting phone-related pedestrian distracted behaviours would improve both driving and pedestrian safety. Herein, a new computer-vision-based method is proposed to detect phone-related pedestrian distracted behaviours from the perspective of intelligent and autonomous driving. Specifically, the first end-to-end deep-learning-based Two-Branch Convolutional Neural Network (CNN) is designed for this task. Taking one synchronised image pair from two front on-car GoPro cameras as input, the proposed two-branch CNN extracts features for each camera, fuses the extracted features and performs a robust classification. The method can also be easily extended to video-based classification by confidence accumulation and voting. A new benchmark dataset of 448 synchronised video pairs (53,760 images) collected on a vehicle is introduced for this research. The experimental results show that using two synchronised cameras yields better performance than using a single camera. Finally, the proposed method achieved an overall best classification accuracy of 84.3% on the new benchmark when compared with other methods.
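The pipeline described in the abstract (per-camera feature extraction, feature fusion, joint classification, and video-level voting by confidence accumulation) can be sketched as follows. This is a minimal PyTorch illustration under stated assumptions, not the published architecture: the layer sizes, the 64x64 input resolution, and the three-class label set (Texting / Game Playing / Phone Call) are illustrative choices.

```python
import torch
import torch.nn as nn

class TwoBranchCNN(nn.Module):
    """Sketch of the two-branch idea: one CNN branch per synchronised
    camera, features fused by concatenation, then a shared classifier.
    All layer sizes here are illustrative assumptions."""
    def __init__(self, num_classes: int = 3):
        super().__init__()

        def make_branch() -> nn.Sequential:
            return nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> (N, 32)
            )

        self.branch_a = make_branch()  # front camera 1
        self.branch_b = make_branch()  # front camera 2
        self.classifier = nn.Linear(32 * 2, num_classes)  # fused features

    def forward(self, img_a: torch.Tensor, img_b: torch.Tensor) -> torch.Tensor:
        # Extract per-camera features, concatenate (fuse), then classify.
        fused = torch.cat([self.branch_a(img_a), self.branch_b(img_b)], dim=1)
        return self.classifier(fused)

model = TwoBranchCNN(num_classes=3)

# Frame-level classification of one synchronised image pair.
img_a = torch.randn(1, 3, 64, 64)
img_b = torch.randn(1, 3, 64, 64)
logits = model(img_a, img_b)
print(logits.shape)  # torch.Size([1, 3])

# Video-level extension by confidence accumulation: sum the per-frame
# softmax scores over the clip, then vote for the highest total.
with torch.no_grad():
    totals = torch.zeros(3)
    for _ in range(5):  # five synchronised frame pairs of a toy "video"
        a, b = torch.randn(1, 3, 64, 64), torch.randn(1, 3, 64, 64)
        totals += torch.softmax(model(a, b), dim=1).squeeze(0)
    video_pred = int(totals.argmax())
```

Concatenation is only one plausible fusion choice; the same skeleton accommodates element-wise addition or a learned fusion layer between the branches and the classifier.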

Original language: English (US)
Journal: IET Intelligent Transport Systems
DOIs
State: Accepted/In press - 2020

ASJC Scopus subject areas

  • Transportation
  • Environmental Science(all)
  • Mechanical Engineering
  • Law

