Generalizing French Schwa Deletion: the Role of Indexed Constraints

Aleksei Nazarov, Brian Smith

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic

Abstract

Indexed constraints (like cophonologies) increase a grammar’s fit to seen data, but do they hurt the grammar’s ability to generalize to unseen data? We focus on French schwa deletion, an optional process whose rate of application is modulated by both phonological and lexical factors, and we propose three indexed constraint learners in the Maximum Entropy (MaxEnt) framework. Using data from Racine (2008), we test the ability of four learners to capture existing patterns and generalize to unseen data: the three indexed constraint learners and a control MaxEnt learner without indexed constraint induction. The indexed constraint learners indeed fit the training data more closely than the control. The resulting grammars are then tested on a different schwa deletion dataset from Smith & Pater (2020). Indexed constraints do not lead to a drop in generalization to these data, and one of the indexation learners produces a grammar that predicts Smith & Pater’s data quite closely. We conclude that indexed constraints do not necessarily hurt a grammar’s ability to generalize to unseen data, while allowing the grammar to achieve a closer fit to training data.
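As a rough illustration of the MaxEnt framework the abstract refers to (this is a generic sketch, not the authors’ actual model or constraint set): a candidate’s probability is proportional to the exponential of its harmony, i.e. the negative weighted sum of its constraint violations. The constraint names, weights, and violation counts below are hypothetical.

```python
import math

def maxent_probs(candidates, weights):
    """Generic MaxEnt grammar sketch (not the paper's model).

    candidates: {candidate_name: {constraint_name: violation_count}}
    weights:    {constraint_name: nonnegative weight}
    Returns {candidate_name: probability}, where probability is
    proportional to exp(harmony) and harmony = -sum(weight * violations).
    """
    harmonies = {
        name: -sum(weights.get(c, 0.0) * v for c, v in viols.items())
        for name, viols in candidates.items()
    }
    z = sum(math.exp(h) for h in harmonies.values())  # normalizing constant
    return {name: math.exp(h) / z for name, h in harmonies.items()}

# Hypothetical example: schwa retention vs. deletion as two candidates,
# with made-up faithfulness ("Max-schwa") and markedness ("*Schwa") weights.
weights = {"Max-schwa": 2.0, "*Schwa": 1.0}
candidates = {
    "retention": {"*Schwa": 1},    # keeps schwa, violates *Schwa once
    "deletion": {"Max-schwa": 1},  # deletes schwa, violates Max-schwa once
}
probs = maxent_probs(candidates, weights)
```

With these illustrative weights, retention (harmony −1) is preferred over deletion (harmony −2), yielding a deletion rate of roughly 27% rather than a categorical outcome, which is how MaxEnt grammars model optional processes.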
Original language: English
Title of host publication: Proceedings of the 2022 Annual Meeting on Phonology
Editors: Noah Elkin, Bruce Hayes, Jinyoung Jo, Jian-Leat Siah
Publisher: Linguistic Society of America
Pages: 1-12
Number of pages: 12
Publication status: Published - 13 May 2023

Keywords

  • indexed constraints
  • French
  • schwa deletion
  • optionality
  • variation
  • lexically specific processes
  • generalizability
