Article code: 925235
Journal code: 1474026
Publication year: 2016
English article: 11-page PDF
Full-text version: Free download
English Title of the ISI Article
Matching heard and seen speech: An ERP study of audiovisual word recognition
Related Subjects
Life Sciences and Biotechnology; Neuroscience; Biological Psychiatry
English Abstract


• Matching silent articulations with heard words elicits N400 and LPC ERP components.
• N400 is larger to incongruent articulations and reflects pre-lexical matching.
• LPC is larger to congruent articulations and indexes articulatory word recognition.
• Only N400 amplitude is predictive of SIN improvement in the AV condition.

Seeing articulatory gestures while listening to speech-in-noise (SIN) significantly improves speech understanding. However, the degree of this improvement varies greatly among individuals. We examined the relationship between two distinct stages of visual articulatory processing and SIN accuracy by combining a cross-modal repetition priming task with ERP recordings. Participants first heard a word referring to a common object (e.g., pumpkin) and then decided whether the subsequently presented silent visual articulation matched the word they had just heard. Incongruent articulations elicited a significantly enhanced N400, indicative of mismatch detection at the pre-lexical level. Congruent articulations elicited a significantly larger LPC, indexing articulatory word recognition. Only the N400 difference between incongruent and congruent trials was significantly correlated with individuals’ improvement in SIN accuracy in the presence of the talker’s face.
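The correlational analysis described in the abstract can be pictured as computing, per participant, the mean N400-window amplitude difference between incongruent and congruent trials and correlating it with that participant's audiovisual SIN gain. The sketch below is only an illustration of that kind of analysis, not the authors' code: the 300–500 ms window, sampling rate, array names, and the random placeholder data are all assumptions.

```python
# Illustrative sketch only (not the study's analysis pipeline).
# Shows how an N400 difference measure could be related to SIN improvement.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_subjects, n_samples, srate = 20, 500, 500      # 1-s epochs at 500 Hz (assumed)
times = np.arange(n_samples) / srate

# Placeholder per-subject average ERPs (subjects x samples), in microvolts.
erp_congruent = rng.normal(size=(n_subjects, n_samples))
erp_incongruent = rng.normal(size=(n_subjects, n_samples))

# Mean amplitude in an assumed N400 window (300-500 ms post-stimulus).
win = (times >= 0.3) & (times <= 0.5)
n400_diff = (erp_incongruent[:, win].mean(axis=1)
             - erp_congruent[:, win].mean(axis=1))

# Placeholder behavioral measure: per-subject SIN accuracy gain when the
# talker's face is visible (audiovisual minus audio-only).
sin_gain = rng.normal(loc=0.1, scale=0.05, size=n_subjects)

r, p = pearsonr(n400_diff, sin_gain)
print(f"N400 difference vs. SIN gain: r = {r:.2f}, p = {p:.3f}")
```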

Publisher
Database: Elsevier - ScienceDirect
Journal: Brain and Language - Volumes 157–158, June–July 2016, Pages 14–24