Article ID: 911759
Journal: Journal of Neurolinguistics
Published Year: 2014
Pages: 21
File Type: PDF
Abstract

• We examine how gesture modulates the congruency effect for spatial demonstrative use.
• Reaction times show similar congruency effects for English and Japanese speakers.
• Both subject groups show N400 responses to trials with a co-speech pointing gesture.
• An English-specific P600 congruency effect is found in the no-gesture trials.
• Therefore, brain processing of co-speech gesture is shaped by the language system.

This electrophysiological study investigated the relationship between language and nonverbal socio-spatial context in demonstrative use during speech communication. Adult participants from an English language group and a Japanese language group were asked to make congruency judgments on the simultaneous presentation of an auditory demonstrative phrase in their native language and a picture that included two human figures, a speaker and a hearer, along with a referent object in different spatial arrangements. The demonstratives ("this" and "that" in English; "ko," "so," and "a" in Japanese) were paired with the visual scenes to produce expected and unexpected combinations for referring to an object based on its relative spatial distance to the speaker and the hearer. Half of the trials included an accompanying pointing gesture in the picture, and the other half did not. Behavioral data showed robust congruency effects, with longer reaction times for incongruent trials in both subject groups, irrespective of the presence or absence of the pointing gesture. Both subject groups also showed a significant N400-like congruency effect in the event-related potential responses for the gesture trials, a finding predicted from previous work (Stevens & Zhang, 2013). In the no-gesture trials, only the English data showed a P600 congruency effect preceded by a negative deflection. These results provide evidence for shared brain mechanisms for processing demonstrative expression congruency, as well as language-specific neural sensitivity to encoding the co-expressivity of gesture and speech.

Related Topics: Life Sciences; Neuroscience; Cognitive Neuroscience