Article ID: 10458487
Journal: Consciousness and Cognition
Published Year: 2013
Pages: 11
File Type: PDF
Abstract
Previous research has established that people can implicitly learn chunks, which (in terms of formal language theory) do not require a memory buffer to process. The present study explores the implicit learning of nonlocal dependencies generated by grammars higher than finite-state: specifically, Chinese tonal retrogrades (i.e. centre embeddings generated from a context-free grammar) and inversions (i.e. cross-serial dependencies generated from a mildly context-sensitive grammar), both of which require buffers to process (last-in first-out and first-in first-out, respectively). People were asked to listen to and memorize artificial poetry instantiating one of the two grammars; after this training phase, they were informed of the existence of rules and asked to classify new poems while providing attributions of the basis of their judgments. People acquired unconscious structural knowledge of both tonal retrogrades and inversions. Moreover, inversions were implicitly learnt more easily than retrogrades, a result that constrains the nature of the memory buffer in computational models of implicit learning.
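To make the buffer distinction concrete, the following minimal Python sketch (not taken from the paper; the tone labels 'P'/'Z' and identity matching are illustrative assumptions) shows why checking a centre embedding needs only a last-in first-out stack, while checking a cross-serial dependency needs a first-in first-out queue.

from collections import deque

# Illustrative sketch: each half of a poem is a string of tone labels,
# e.g. 'P' (level) and 'Z' (oblique). The matching relation (identity)
# is an assumption made only to expose the buffer difference.

def is_retrograde(first_half, second_half):
    """Centre embedding (context-free): the second half must mirror the
    first half's tone sequence, so a LIFO buffer (stack) suffices."""
    stack = list(first_half)                  # push tones in order
    for tone in second_half:
        if not stack or stack.pop() != tone:  # last in, first out
            return False
    return not stack                          # every tone must be consumed

def is_inversion(first_half, second_half):
    """Cross-serial dependency (mildly context-sensitive): the second half
    must repeat the first half's tones in the same order, which a stack
    cannot check but a FIFO buffer (queue) can."""
    queue = deque(first_half)                      # enqueue tones in order
    for tone in second_half:
        if not queue or queue.popleft() != tone:   # first in, first out
            return False
    return not queue                               # every tone must be consumed

print(is_retrograde("PPZZ", "ZZPP"))  # True: mirror-image (nested) match
print(is_inversion("PPZZ", "PPZZ"))   # True: same-order (crossed) match
print(is_retrograde("PPZZ", "PPZZ"))  # False: a same-order copy is not nested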
Related Topics
Life Sciences; Neuroscience; Cognitive Neuroscience