Article ID: 4355533
Journal: Hearing Research
Published Year: 2010
Pages: 8 Pages
File Type: PDF
Abstract

The present study used eight normal-hearing (NH) subjects, listening to acoustic cochlear implant (CI) simulations, to examine the effects of spectral shifting on speech recognition in noise. Speech recognition was measured using spectrally matched and shifted speech (vowels, consonants, and IEEE sentences), generated by an 8-channel sine-wave vocoder. Measurements were made in quiet and in noise (speech-shaped static noise and speech babble at a 5 dB signal-to-noise ratio). One spectral match condition and four spectral shift conditions were investigated: 2-mm, 3-mm, and 4-mm linear shifts, and a 3-mm shift with compression, expressed in terms of cochlear distance. Results showed that speech recognition scores dropped because of noise and spectral shifting, and that the interactive effects of spectral shifting and background conditions depended on the degree/type of spectral shift, the background conditions, and the speech test materials. There was no significant interaction between spectral shifting and the two noise conditions for any of the speech test materials. However, significant interactions between linear spectral shifts and all background conditions were found for sentence recognition, and significant interactions between spectral shift types and all background conditions were found for vowel recognition. Overall, the results suggest that tonotopic mismatch may affect the performance of CI users in complex listening environments.
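
The processing described in the abstract can be illustrated with a short sketch of an 8-channel sine-wave vocoder in which the output (carrier) bands are shifted basally by a fixed cochlear distance. This is a minimal illustration, not the authors' implementation: the Greenwood (1990) frequency-place constants for a 35-mm human cochlea are standard, but the analysis frequency range, filter orders, envelope cutoff, and sampling rate below are assumptions chosen for the example.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt, hilbert

    def greenwood_freq(place_mm):
        # Frequency (Hz) at a given distance (mm) from the cochlear apex.
        return 165.4 * (10 ** (0.06 * place_mm) - 1)

    def greenwood_place(freq_hz):
        # Distance (mm) from the apex corresponding to a frequency (Hz).
        return np.log10(freq_hz / 165.4 + 1) / 0.06

    def band_edges(lo_hz, hi_hz, n_channels, shift_mm=0.0):
        # Channel edges equally spaced in cochlear distance; a positive
        # shift_mm moves the bands toward the base (higher frequencies).
        places = np.linspace(greenwood_place(lo_hz), greenwood_place(hi_hz),
                             n_channels + 1)
        return greenwood_freq(places + shift_mm)

    def sine_vocoder(x, fs, n_channels=8, lo=200.0, hi=7000.0, shift_mm=0.0):
        # Analyze x into n_channels bands, extract the band envelopes, and
        # re-synthesize with sine carriers centered on the shifted bands.
        analysis = band_edges(lo, hi, n_channels, shift_mm=0.0)
        output = band_edges(lo, hi, n_channels, shift_mm=shift_mm)
        t = np.arange(len(x)) / fs
        y = np.zeros(len(x))
        for ch in range(n_channels):
            sos = butter(4, [analysis[ch], analysis[ch + 1]],
                         btype='bandpass', fs=fs, output='sos')
            band = sosfiltfilt(sos, x)
            env = np.abs(hilbert(band))                  # band envelope
            sos_env = butter(2, 160.0, btype='lowpass', fs=fs, output='sos')
            env = sosfiltfilt(sos_env, env)              # smooth the envelope
            fc = np.sqrt(output[ch] * output[ch + 1])    # carrier at geometric center
            y += env * np.sin(2 * np.pi * fc * t)
        return y / np.max(np.abs(y))                     # normalize output

    # Example: a 3-mm basal shift of a 1-s noise token (placeholder for speech).
    if __name__ == "__main__":
        fs = 32000
        x = np.random.randn(fs)
        y_shifted = sine_vocoder(x, fs, n_channels=8, shift_mm=3.0)

The spectrally matched condition corresponds to shift_mm=0 (carriers at the analysis-band centers); the 3-mm shift with compression condition described in the abstract would additionally warp the output band spacing, which this linear-shift sketch does not attempt to model.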

Research highlights
►Linear spectral shifts interact with background conditions on sentence recognition.
►Spectral shift types interact with background conditions on vowel recognition.
►No interaction between spectral shifts and background conditions for consonants.
►Tonotopic information is more important for vowel recognition in noise.
►Tonotopic information may be important for lexical access/segmentation in noise.

Related Topics
Life Sciences › Neuroscience › Sensory Systems
Authors