Article ID: 7302625
Journal: Neuroscience & Biobehavioral Reviews
Published Year: 2017
Pages: 9
File Type: PDF
Abstract
Artificial grammar learning is a popular paradigm to study syntactic ability in nonhuman animals. Subjects are first trained to recognize strings of tokens that are sequenced according to grammatical rules. Next, to test if recognition depends on grammaticality, subjects are presented with grammar-consistent and grammar-violating test strings, which they should discriminate between. However, simpler cues may underlie discrimination if they are available. Here, we review stimulus design in a sample of studies that use particular sounds as tokens, and that claim or suggest their results demonstrate a form of sequence rule learning. To assess the extent of acoustic similarity between training and test strings, we use four simple measures corresponding to cues that are likely salient. All stimulus sets contain biases in similarity measures such that grammatical test stimuli resemble training stimuli acoustically more than do non-grammatical test stimuli. These biases may contribute to response behaviour, reducing the strength of grammatical explanations. We conclude that acoustic confounds are a blind spot in artificial grammar learning studies in nonhuman animals.
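The abstract does not name the four similarity measures, so the sketch below is only an illustrative assumption of how such a confound check could look: it uses shared-bigram overlap between token sequences as a stand-in low-level cue, with an invented grammar, token labels, and stimuli. It is a minimal sketch of the comparison logic, not the authors' actual analysis.

```python
# Hypothetical confound check: do grammatical test strings resemble the
# training strings more than non-grammatical ones do on a simple cue?
# The measure (shared-bigram overlap) and the stimuli are invented here
# purely for illustration; the paper's four measures are not specified
# in this abstract.

from statistics import mean


def bigrams(tokens):
    """Set of adjacent token pairs (bigrams) in a stimulus string."""
    return set(zip(tokens, tokens[1:]))


def similarity_to_training(test_string, training_strings):
    """Mean proportion of the test string's bigrams that also occur in
    each training string (one hypothetical low-level similarity cue)."""
    test_bg = bigrams(test_string)
    if not test_bg:
        return 0.0
    return mean(
        len(test_bg & bigrams(train)) / len(test_bg)
        for train in training_strings
    )


# Invented (AB)^n-style stimuli built from sound-token labels.
training = [["A1", "B1", "A2", "B2"], ["A1", "B2", "A2", "B1"]]
grammatical_tests = [["A2", "B2", "A1", "B1"]]
nongrammatical_tests = [["A1", "A2", "B1", "B2"]]

gram_sim = mean(similarity_to_training(s, training) for s in grammatical_tests)
nongram_sim = mean(similarity_to_training(s, training) for s in nongrammatical_tests)

# If grammatical test strings score systematically higher on measures like
# this, discrimination could reflect surface similarity to the training
# stimuli rather than learning of the sequencing rule.
print(f"grammatical test similarity:     {gram_sim:.2f}")
print(f"non-grammatical test similarity: {nongram_sim:.2f}")
```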
Related Topics
Life Sciences, Neuroscience, Behavioral Neuroscience