Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
931769 | 1474634 | 2015 | 18-page PDF | Free download |

• Lexical context and lip-read context affect early auditory speech processing.
• We used ERPs to investigate whether these two contexts operate together or not.
• Lexical context and lip-read context both affected auditory processing.
• We found no evidence of cross-talk between the two types of speech context.
• Lexical and lip-read context may thus have different neural bases and purposes.
Electrophysiological research has shown that pseudowords elicit more negative Event-Related Potentials (ERPs) than words within 250 ms after the lexical status of a speech token is defined (e.g., after hearing the onset of “ga” in the Spanish word “lechuga”, versus “da” in the pseudoword “lechuda”). Since lip-read context also affects speech sound processing within this time frame, we investigated whether these two context effects on speech perception operate together. We measured ERPs while listeners were presented with auditory-only, audiovisual, or lip-read-only stimuli, in which the critical syllable that determined lexical status was naturally timed (Experiment 1) or delayed by ∼800 ms (Experiment 2). We replicated the electrophysiological effect of stimulus lexicality, and also observed substantial effects of audiovisual speech integration for words and pseudowords. Critically, we found several early time windows (<400 ms) in which both contexts influenced auditory processes, but we never observed any cross-talk between the two types of speech context. The absence of any interaction between the two types of speech context supports the view that lip-read and lexical context mainly function separately, and may have different neural bases and purposes.
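The abstract describes ERP contrasts between words and pseudowords presented in auditory-only, audiovisual, and lip-read-only conditions, with effects emerging in early (<400 ms) time windows. As a purely illustrative sketch of that kind of analysis (not the authors' pipeline, which the abstract does not specify), the MNE-Python snippet below epochs EEG around the lexical-status-defining syllable and computes a word-minus-pseudoword difference in an early window. The file name, event codes, channel labels, and time window are assumptions introduced for the example.

```python
# Illustrative sketch only: file names, event codes, channels, and the analysis
# window are assumptions, not details taken from the paper.
import mne

# Hypothetical preprocessed recording with stimulus triggers on a stim channel.
raw = mne.io.read_raw_fif("subject01_preprocessed_raw.fif", preload=True)
events = mne.find_events(raw)
event_id = {
    "word/audio": 1, "word/audiovisual": 2,
    "pseudoword/audio": 3, "pseudoword/audiovisual": 4,
}

# Epoch relative to the onset of the critical (lexical-status-defining) syllable.
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(-0.2, 0.0), preload=True)

# Condition-average ERPs.
evokeds = {cond: epochs[cond].average() for cond in event_id}

# Lexicality contrast: words vs. pseudowords, collapsed over presentation modality.
word = mne.combine_evoked(
    [evokeds["word/audio"], evokeds["word/audiovisual"]], weights="equal")
pseudo = mne.combine_evoked(
    [evokeds["pseudoword/audio"], evokeds["pseudoword/audiovisual"]], weights="equal")
lexicality = mne.combine_evoked([word, pseudo], weights=[1, -1])

# Mean difference amplitude in an early window over assumed fronto-central channels.
picks = mne.pick_channels(lexicality.info["ch_names"], include=["Fz", "FCz", "Cz"])
mask = (lexicality.times >= 0.25) & (lexicality.times <= 0.40)
mean_amp = lexicality.data[picks][:, mask].mean()
print(f"Word-minus-pseudoword mean amplitude (250-400 ms): {mean_amp:.2e} V")
```

A full analysis of the reported cross-talk question would additionally test the lexicality-by-modality interaction (e.g., comparing the lexicality difference between audio-only and audiovisual conditions), which this sketch omits for brevity.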
Journal: Journal of Memory and Language - Volume 85, November 2015, Pages 42–59