Article ID: 925852
Journal: Brain and Language
Published Year: 2007
Pages: 12
File Type: PDF
Abstract

The present study investigates whether knowledge about the intentional relationship between gesture and speech influences controlled processes when the two modalities are integrated during comprehension. Thirty-five adults watched short videos of gesture and speech that conveyed semantically congruous or incongruous information. In half of the videos, participants were told that the two modalities were intentionally coupled (i.e., produced by the same communicator); in the other half, they were told that the two modalities were not intentionally coupled (i.e., produced by different communicators). When participants knew that the same communicator produced the speech and gesture, there was a larger bilateral frontal and central N400 effect to words that were semantically incongruous versus congruous with gesture. However, when participants knew that different communicators produced the speech and gesture—that is, when gesture and speech were not intentionally meant to go together—the N400 effect was present only in right-hemisphere frontal regions. The results demonstrate that pragmatic knowledge about the intentional relationship between gesture and speech modulates controlled neural processes during the integration of the two modalities.

Related Topics
Life Sciences; Neuroscience; Biological Psychiatry