Abstract
Indexed constraints (like cophonologies) increase a grammar’s fit to seen data, but do they hurt the grammar’s ability to generalize to unseen data? We focus on French schwa deletion, an optional process whose rate of application is modulated by both phonological and lexical factors, and we propose three indexed constraint learners in the Maximum Entropy (MaxEnt) framework. Using data from Racine (2008), we test the ability of four learners to capture existing patterns and generalize to unseen data: the three indexed constraint learners and a control MaxEnt learner without indexed constraint induction. The indexed constraint learners indeed lead to a better fit to the training data than the control. The resulting grammars are then tested on a different schwa deletion dataset from Smith & Pater (2020). We show that indexed constraints do not lead to a drop in generalization to these data, and that one of the indexation learners produces a grammar that predicts Smith & Pater’s data quite closely. We conclude that indexed constraints do not necessarily hurt a grammar’s ability to generalize to unseen data, while allowing the grammar to achieve a closer fit to training data.
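For readers unfamiliar with the framework, the sketch below illustrates how a MaxEnt grammar with a lexically indexed constraint can assign a higher deletion rate to one word than the general grammar predicts. It is a minimal illustration only, not the authors’ implementation; the constraint names, violation profiles, and weights are hypothetical.

```python
import math

def maxent_prob(candidates, weights):
    """Return P(candidate) proportional to exp(-sum of weighted violations)."""
    harmonies = {
        cand: math.exp(-sum(weights[c] * v for c, v in viols.items()))
        for cand, viols in candidates.items()
    }
    total = sum(harmonies.values())
    return {cand: h / total for cand, h in harmonies.items()}

# Hypothetical weights: a general faithfulness constraint, a general
# markedness constraint against schwa, and an indexed copy of the latter
# that is active only for specific lexical items.
weights = {
    "Max-V": 2.0,       # penalizes schwa deletion
    "*Schwa": 1.0,      # penalizes retained schwa
    "*Schwa_lex": 2.5,  # indexed copy, applies only to indexed words
}

# A word not indexed to *Schwa_lex: deletion rate set by general constraints.
plain_word = {
    "retention": {"*Schwa": 1},
    "deletion": {"Max-V": 1},
}

# A word indexed to *Schwa_lex: the indexed constraint adds pressure to delete.
indexed_word = {
    "retention": {"*Schwa": 1, "*Schwa_lex": 1},
    "deletion": {"Max-V": 1},
}

print(maxent_prob(plain_word, weights))    # retention favored (~0.73)
print(maxent_prob(indexed_word, weights))  # deletion favored (~0.82)
```

Fitting such weights to observed deletion rates, and deciding which words receive an indexed constraint, is the learning problem the paper addresses.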
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 2022 Annual Meeting on Phonology |
| Editors | Noah Elkin, Bruce Hayes, Jinyoung Jo, Jian-Leat Siah |
| Publisher | Linguistic Society of America |
| Pages | 1-12 |
| Number of pages | 12 |
| DOIs | |
| Publication status | Published - 13 May 2023 |
Keywords
- indexed constraints
- French
- schwa deletion
- optionality
- variation
- lexically specific processes
- generalizability