
Bilinguals have been shown to activate their two languages in parallel, and this process can often be attributed to overlap in input between the two languages. The present study examines whether two languages that do not overlap in input structure, and that have distinct phonological systems, such as American Sign Language (ASL) and English, are also activated in parallel. Hearing ASL-English bimodal bilinguals’ and English monolinguals’ eye-movements were recorded during a visual world paradigm, in which participants were instructed, in English, to select objects from a display. In critical trials, the target item appeared with a competing item that overlapped with the target in ASL phonology. Bimodal bilinguals looked more at competing items than at phonologically unrelated items, and looked more at competing items than monolinguals did, indicating activation of the sign language during spoken English comprehension. The findings suggest that language co-activation is not modality specific, and they provide insight into the mechanisms that may underlie cross-modal language co-activation in bimodal bilinguals, including the role that top-down and lateral connections between levels of processing may play in language comprehension.
► We examine language processing in hearing English-American Sign Language bilinguals.
► We use eye-tracking to measure ASL activation during spoken English comprehension.
► English-ASL bilinguals activated both of their languages when listening to English.
► Bilinguals show high interactivity across languages, modalities, and types of processing.
► Top-down and lateral information influence language activation.
Journal: Cognition - Volume 124, Issue 3, September 2012, Pages 314–324