Abstract

Keystroke inference attacks pose an increasing threat to ubiquitous mobile devices. This paper presents EyeTell, a novel video-assisted attack that can infer a victim's keystrokes on his touchscreen device from a video capturing his eye movements. EyeTell explores the observation that human eyes naturally focus on and follow the keys they type, so a typing sequence on a soft keyboard results in a unique gaze trace of continuous eye movements. In contrast to prior work, EyeTell requires neither the attacker to visually observe the victim's inputting process nor the victim device to be placed on a static holder. Comprehensive experiments on iOS and Android devices confirm the high efficacy of EyeTell for inferring PINs, lock patterns, and English words under various environmental conditions.

Original language: English (US)
Title of host publication: Proceedings - 2018 IEEE Symposium on Security and Privacy, SP 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 144-160
Number of pages: 17
Volume: 2018-May
ISBN (Electronic): 9781538643525
DOIs: 10.1109/SP.2018.00010
State: Published - Jul 23 2018
Event: 39th IEEE Symposium on Security and Privacy, SP 2018 - San Francisco, United States
Duration: May 21 2018 - May 23 2018

Other

Other: 39th IEEE Symposium on Security and Privacy, SP 2018
Country: United States
City: San Francisco
Period: 5/21/18 - 5/23/18

Fingerprint

  • Touch screens
  • Eye movements
  • Mobile devices
  • Experiments

Keywords

  • keystroke inference
  • mobile devices
  • security
  • video analysis

ASJC Scopus subject areas

  • Safety, Risk, Reliability and Quality
  • Software
  • Computer Networks and Communications

Cite this

Chen, Y., Li, T., Zhang, R., Zhang, Y., & Hedgpeth, T. (2018). EyeTell: Video-Assisted Touchscreen Keystroke Inference from Eye Movements. In Proceedings - 2018 IEEE Symposium on Security and Privacy, SP 2018 (Vol. 2018-May, pp. 144-160). [8418601] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/SP.2018.00010

EyeTell: Video-Assisted Touchscreen Keystroke Inference from Eye Movements. / Chen, Yimin; Li, Tao; Zhang, Rui; Zhang, Yanchao; Hedgpeth, Terri.

Proceedings - 2018 IEEE Symposium on Security and Privacy, SP 2018. Vol. 2018-May. Institute of Electrical and Electronics Engineers Inc., 2018. pp. 144-160. 8418601.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Chen, Y, Li, T, Zhang, R, Zhang, Y & Hedgpeth, T 2018, EyeTell: Video-Assisted Touchscreen Keystroke Inference from Eye Movements. in Proceedings - 2018 IEEE Symposium on Security and Privacy, SP 2018. vol. 2018-May, 8418601, Institute of Electrical and Electronics Engineers Inc., pp. 144-160, 39th IEEE Symposium on Security and Privacy, SP 2018, San Francisco, United States, 5/21/18. https://doi.org/10.1109/SP.2018.00010
Chen Y, Li T, Zhang R, Zhang Y, Hedgpeth T. EyeTell: Video-Assisted Touchscreen Keystroke Inference from Eye Movements. In Proceedings - 2018 IEEE Symposium on Security and Privacy, SP 2018. Vol. 2018-May. Institute of Electrical and Electronics Engineers Inc. 2018. p. 144-160. 8418601. https://doi.org/10.1109/SP.2018.00010
Chen, Yimin ; Li, Tao ; Zhang, Rui ; Zhang, Yanchao ; Hedgpeth, Terri. / EyeTell: Video-Assisted Touchscreen Keystroke Inference from Eye Movements. Proceedings - 2018 IEEE Symposium on Security and Privacy, SP 2018. Vol. 2018-May. Institute of Electrical and Electronics Engineers Inc., 2018. pp. 144-160
@inproceedings{11416f36bc2b4b2e894fb7e2f76e3f2a,
title = "EyeTell: Video-Assisted Touchscreen Keystroke Inference from Eye Movements",
abstract = "Keystroke inference attacks pose an increasing threat to ubiquitous mobile devices. This paper presents EyeTell, a novel video-assisted attack that can infer a victim's keystrokes on his touchscreen device from a video capturing his eye movements. EyeTell explores the observation that human eyes naturally focus on and follow the keys they type, so a typing sequence on a soft keyboard results in a unique gaze trace of continuous eye movements. In contrast to prior work, EyeTell requires neither the attacker to visually observe the victim's inputting process nor the victim device to be placed on a static holder. Comprehensive experiments on iOS and Android devices confirm the high efficacy of EyeTell for inferring PINs, lock patterns, and English words under various environmental conditions.",
keywords = "keystroke inference, mobile devices, security, video analysis",
author = "Yimin Chen and Tao Li and Rui Zhang and Yanchao Zhang and Terri Hedgpeth",
year = "2018",
month = jul,
day = "23",
doi = "10.1109/SP.2018.00010",
language = "English (US)",
volume = "2018-May",
pages = "144--160",
isbn = "9781538643525",
booktitle = "Proceedings - 2018 IEEE Symposium on Security and Privacy, SP 2018",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}
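As a quick sanity check, the BibTeX entry above can be cited from a minimal LaTeX document. This sketch assumes the entry is saved in a file named `eyetell.bib` (the filename and the `ieeetr` style choice are illustrative, not part of the record):

```latex
% Assumes the @inproceedings entry above is stored in eyetell.bib
\documentclass{article}
\begin{document}
EyeTell~\cite{11416f36bc2b4b2e894fb7e2f76e3f2a} infers touchscreen
keystrokes from video of a victim's eye movements.
\bibliographystyle{ieeetr} % any IEEE-compatible style works
\bibliography{eyetell}
\end{document}
```

Compiling with `pdflatex`, then `bibtex`, then `pdflatex` twice resolves the citation key to a numbered reference.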

TY - GEN

T1 - EyeTell

T2 - Video-Assisted Touchscreen Keystroke Inference from Eye Movements

AU - Chen, Yimin

AU - Li, Tao

AU - Zhang, Rui

AU - Zhang, Yanchao

AU - Hedgpeth, Terri

PY - 2018/7/23

Y1 - 2018/7/23

N2 - Keystroke inference attacks pose an increasing threat to ubiquitous mobile devices. This paper presents EyeTell, a novel video-assisted attack that can infer a victim's keystrokes on his touchscreen device from a video capturing his eye movements. EyeTell explores the observation that human eyes naturally focus on and follow the keys they type, so a typing sequence on a soft keyboard results in a unique gaze trace of continuous eye movements. In contrast to prior work, EyeTell requires neither the attacker to visually observe the victim's inputting process nor the victim device to be placed on a static holder. Comprehensive experiments on iOS and Android devices confirm the high efficacy of EyeTell for inferring PINs, lock patterns, and English words under various environmental conditions.

AB - Keystroke inference attacks pose an increasing threat to ubiquitous mobile devices. This paper presents EyeTell, a novel video-assisted attack that can infer a victim's keystrokes on his touchscreen device from a video capturing his eye movements. EyeTell explores the observation that human eyes naturally focus on and follow the keys they type, so a typing sequence on a soft keyboard results in a unique gaze trace of continuous eye movements. In contrast to prior work, EyeTell requires neither the attacker to visually observe the victim's inputting process nor the victim device to be placed on a static holder. Comprehensive experiments on iOS and Android devices confirm the high efficacy of EyeTell for inferring PINs, lock patterns, and English words under various environmental conditions.

KW - keystroke inference

KW - mobile devices

KW - security

KW - video analysis

UR - http://www.scopus.com/inward/record.url?scp=85051010014&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85051010014&partnerID=8YFLogxK

U2 - 10.1109/SP.2018.00010

DO - 10.1109/SP.2018.00010

M3 - Conference contribution

AN - SCOPUS:85051010014

VL - 2018-May

SP - 144

EP - 160

BT - Proceedings - 2018 IEEE Symposium on Security and Privacy, SP 2018

PB - Institute of Electrical and Electronics Engineers Inc.

ER -