
We investigated whether and when information conveyed by spoken language impacts the processing of visually presented objects. In contrast to traditional views, grounded cognition posits direct links between language comprehension and perceptual processing. We used a magnetoencephalographic cross-modal priming paradigm to disentangle these views. In a sentence-picture verification task, pictures (e.g., of a flying duck) were paired with three sentence conditions: a feature-matching sentence about a duck in the air, a feature-mismatching sentence about a duck in a lake, and an unrelated sentence. Brain responses to pictures showed enhanced activity in the N400 time window in the left temporal lobe for the unrelated condition compared to both related conditions. The M1 time window revealed more activation in the occipital cortex for the feature-matching condition than for the other two. These dissociable effects on early visual processing and semantic integration support models in which language comprehension engages two complementary systems, a perceptual one and an abstract one.
Journal: Brain and Language - Volume 116, Issue 2, February 2011, Pages 91–96