Experiments on auditory-visual perception of sentences by users of unilateral, bimodal, and bilateral cochlear implants

Michael Dorman, Julie Liss, Shuai Wang, Visar Berisha, Cimarron Ludwig, Sarah Cook Natale

Research output: Contribution to journal › Article


Abstract

Purpose: Five experiments probed auditory-visual (AV) understanding of sentences by users of cochlear implants (CIs).

Method: Sentence material was presented in auditory (A), visual (V), and AV test conditions to listeners with normal hearing and CI users.

Results: (a) Most CI users report that most of the time, they have access to both A and V information when listening to speech. (b) CI users did not achieve better scores on a task of speechreading than did listeners with normal hearing. (c) Sentences that are easy to speechread provided 12 percentage points more gain to speech understanding than did sentences that were difficult. (d) Ease of speechreading for sentences is related to phrase familiarity. (e) Users of bimodal CIs benefit from low-frequency acoustic hearing even when V cues are available, and a second CI adds to the benefit of a single CI when V cues are available. (f) V information facilitates lexical segmentation by improving the recognition of the number of syllables produced and the relative strength of these syllables.

Conclusions: Our data are consistent with the view that V information improves CI users’ ability to identify syllables in the acoustic stream and to recognize their relative juxtaposed strengths. Enhanced syllable resolution allows better identification of word onsets, which, when combined with place-of-articulation information from visible consonants, improves lexical access.

Original language: English (US)
Pages (from-to): 1505-1519
Number of pages: 15
Journal: Journal of Speech, Language, and Hearing Research
Volume: 59
Issue number: 6
DOIs
State: Published - 2016

ASJC Scopus subject areas

  • Language and Linguistics
  • Linguistics and Language
  • Speech and Hearing
