Article ID: 5041627
Journal: Cognition
Published Year: 2017
Pages: 7
File Type: PDF
Abstract

• Visual articulatory information affects word processing in infants.
• Infants recognize mispronunciations when lip movements are consistent with a word.
• Infants do not recognize the same mispronunciations in an auditory-only context.
• Implications for understanding the architecture of early lexical processing.

What do infants hear when they read lips? In the present study, twelve- to thirteen-month-old infants viewed a talking face producing familiar and unfamiliar words. The familiar words were of three types: in Experiment 1, they were produced correctly (e.g., “bottle”); in Experiment 2, infants saw and heard mispronunciations in which the altered phoneme either visually resembled the original phoneme (visually consistent, e.g., “pottle”) or did not visually resemble the original phoneme (visually inconsistent, e.g., “dottle”). Infants in the correct and consistent conditions differentiated the familiar and unfamiliar words, but infants in the inconsistent condition did not. Experiment 3 confirmed that infants were sensitive to the consistent-condition mispronunciations when the words were presented in an auditory-only format. Thus, although infants recognized the consistent mispronunciations when they saw a face articulating the words, they did not with the auditory information alone. These results provide the first evidence that visual articulatory information affects word processing in infants.

Related Topics
Life Sciences Neuroscience Cognitive Neuroscience