Abstract
Artificial grammar learning is a popular paradigm for studying syntactic ability in nonhuman animals. Subjects are first trained to recognize strings of tokens that are sequenced according to grammatical rules. Next, to test whether recognition depends on grammaticality, subjects are presented with grammar-consistent and grammar-violating test strings, which they should discriminate between. However, if simpler cues are available, discrimination may rely on those instead. Here, we review stimulus design in a sample of studies that use particular sounds as tokens and that claim or suggest their results demonstrate a form of sequence rule learning. To assess the extent of acoustic similarity between training and test strings, we use four simple measures corresponding to cues that are likely salient. All stimulus sets contain biases in similarity measures such that grammatical test stimuli resemble training stimuli acoustically more than non-grammatical test stimuli do. These biases may contribute to response behaviour, weakening the case for grammatical explanations. We conclude that acoustic confounds are a blind spot in artificial grammar learning studies in nonhuman animals.
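The abstract does not enumerate the four similarity measures, but the underlying logic can be illustrated with a minimal sketch. Suppose one hypothetical cue is the overlap in token bigrams between a test string and the training strings (the function, token alphabet, and example strings below are illustrative assumptions, not taken from the paper): if grammatical test strings share more such low-level structure with training strings than non-grammatical ones do, a subject could discriminate between them without having learned any sequencing rule.

```python
def bigram_overlap(test, training_set):
    """Mean proportion of the test string's token bigrams that also occur
    in each training string -- a simple, hypothetical similarity cue."""
    test_bigrams = set(zip(test, test[1:]))
    if not test_bigrams:
        return 0.0
    scores = []
    for train in training_set:
        train_bigrams = set(zip(train, train[1:]))
        scores.append(len(test_bigrams & train_bigrams) / len(test_bigrams))
    return sum(scores) / len(scores)

# Illustrative (AB)^n-style strings; tokens stand in for particular sounds.
training = [("A", "B", "A", "B"), ("A", "B", "A", "B", "A", "B")]
grammatical = ("A", "B", "A", "B", "A", "B", "A", "B")
nongrammatical = ("B", "A", "A", "B", "B", "A")  # violates the trained order

print(bigram_overlap(grammatical, training))     # 1.0  (high overlap)
print(bigram_overlap(nongrammatical, training))  # 0.5  (lower overlap)
```

If such a bias runs through a stimulus set, above-chance discrimination is ambiguous between rule learning and similarity matching, which is precisely the confound the review documents.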
Original language | English |
---|---|
Pages (from-to) | 238-246 |
Journal | Neuroscience and Biobehavioral Reviews |
Volume | 81 |
Issue number | Part B |
DOIs | |
Publication status | Published - Oct 2017 |
Keywords
- Animal cognition
- Artificial grammar learning
- Auditory memory
- Biolinguistics
- Bird
- Primate
- Rule learning
- Syntax