Conveying language through haptics: A multi-sensory approach

Nathan Dunkelberger, Jenny Sullivan, Joshua Bradley, Nickolas P. Walling, Indu Manickam, Gautam Dasarathy, Ali Israr, Frances W.Y. Lau, Keith Klumb, Brian Knott, Freddy Abnousi, Richard Baraniuk, Marcia K. O’Malley

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Citations (Scopus)

Abstract

In our daily lives, we rely heavily on our visual and auditory channels to receive information from others. In the case of impairment, or when large amounts of information are already transmitted visually or aurally, alternative methods of communication are needed. A haptic language offers the potential to provide information to a user when visual and auditory channels are unavailable. Previously created haptic languages include deconstructing acoustic signals into features and displaying them through a haptic device, and haptic adaptations of Braille or Morse code; however, these approaches are unintuitive, slow at presenting language, or require a large surface area. We propose using a multi-sensory haptic device called MISSIVE, which can be worn on the upper arm and is capable of producing brief cues, sufficient in quantity to encode the full English phoneme set. We evaluated our approach by teaching subjects a subset of 23 phonemes, and demonstrated an 86% accuracy in a 50 word identification task after 100 minutes of training.
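The paper itself does not include code; purely as an illustrative sketch of the idea described above, the snippet below maps a few phonemes to brief multi-sensory cue descriptions and renders a word as a sequence of cues. All field names, cue parameters, the phoneme subset, and the play_word interface are assumptions made here for exposition, not the actual MISSIVE encoding or its API.

```python
# Illustrative sketch only: the cue fields and the phoneme-to-cue table are
# hypothetical placeholders, not the encoding reported for MISSIVE.
from dataclasses import dataclass
from typing import List

@dataclass
class HapticCue:
    """One brief multi-sensory cue (assumed fields: vibration site, squeeze, duration)."""
    vibration_site: str   # e.g. "proximal" or "distal" location on the upper arm
    squeeze: bool         # whether the band squeezes during the cue
    duration_ms: int      # cue length in milliseconds

# Hypothetical mapping for three phonemes; a full system would cover the English phoneme set.
PHONEME_TO_CUE = {
    "k":  HapticCue("proximal", squeeze=False, duration_ms=150),
    "ae": HapticCue("distal",   squeeze=True,  duration_ms=150),
    "t":  HapticCue("proximal", squeeze=True,  duration_ms=150),
}

def play_word(phonemes: List[str]) -> None:
    """Render a word as a sequence of cues (printed here in place of device output)."""
    for p in phonemes:
        cue = PHONEME_TO_CUE[p]
        print(f"{p}: vibrate {cue.vibration_site}, squeeze={cue.squeeze}, {cue.duration_ms} ms")

# Example: the word "cat" as the phoneme sequence /k ae t/.
play_word(["k", "ae", "t"])
```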

Original language: English (US)
Title of host publication: ISWC 2018 - Proceedings of the 2018 ACM International Symposium on Wearable Computers
Publisher: Association for Computing Machinery
Pages: 25-32
Number of pages: 8
ISBN (Electronic): 9781450359672
DOI: https://doi.org/10.1145/3267242.3267244
State: Published - Oct 8, 2018
Externally published: Yes
Event: 22nd International Symposium on Wearable Computers, ISWC 2018 - Singapore, Singapore
Duration: Oct 8, 2018 - Oct 12, 2018

Publication series

Name: Proceedings - International Symposium on Wearable Computers, ISWC
ISSN (Print): 1550-4816

Conference

Conference: 22nd International Symposium on Wearable Computers, ISWC 2018
Country: Singapore
City: Singapore
Period: 10/8/18 - 10/12/18

Keywords

  • Haptics
  • Multi-sensory
  • Speech
  • Wearable

ASJC Scopus subject areas

  • Software
  • Hardware and Architecture
  • Computer Networks and Communications

Cite this

Dunkelberger, N., Sullivan, J., Bradley, J., Walling, N. P., Manickam, I., Dasarathy, G., ... O’Malley, M. K. (2018). Conveying language through haptics: A multi-sensory approach. In ISWC 2018 - Proceedings of the 2018 ACM International Symposium on Wearable Computers (pp. 25-32). (Proceedings - International Symposium on Wearable Computers, ISWC). Association for Computing Machinery. https://doi.org/10.1145/3267242.3267244
