The Credit Problem in parametric stress: A probabilistic approach

Aleksei Nazarov, Gaja Jarosz

Research output: Contribution to journal › Article › Academic › peer-review


In this paper, we introduce a novel domain-general, statistical learning model for P&P grammars: the Expectation Driven Parameter Learner (EDPL). We show that the EDPL provides a mathematically principled solution to the Credit Problem (Dresher 1999). We present the first systematic tests of the EDPL and of an existing, closely related model, the Naïve Parameter Learner (NPL), on a full stress typology: the one generated by Dresher & Kaye’s (1990) stress parameter framework. This framework has figured prominently in the debate about the necessity of domain-specific mechanisms for the learning of parametric stress. The essential difference between the two learning models is that the EDPL incorporates a mechanism that directly tackles the Credit Problem, while the NPL does not. We find that the NPL fails to cope with the ambiguity of this stress system both in terms of learning success and in terms of data complexity, while the EDPL performs well on both metrics. Based on these results, we argue that probabilistic inference provides a viable domain-general approach to parametric stress learning, but only when learning involves an inferential process that directly addresses the Credit Problem. We also present in-depth analyses of the learning outcomes, showing how they depend crucially on the structural ambiguities posited by a particular phonological theory, and how these learning difficulties correspond to typological gaps.
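The Credit Problem the abstract refers to can be illustrated with a toy reward-penalty learner. The sketch below is purely illustrative and assumes a simplified setup: the parameter names, the toy "grammar", the success criterion, and the learning rate are all invented for this example and are not taken from Nazarov & Jarosz's models or test suite. It shows the NPL-style update in which every sampled parameter value receives the same credit after a successful parse, whether or not it contributed to the success.

```python
import random

# Learning rate for the linear reward-penalty update (assumed value).
GAMMA = 0.05

def sample_grammar(probs):
    """Sample a binary setting for each parameter from its current probability."""
    return {p: random.random() < pr for p, pr in probs.items()}

def parses(grammar, datum):
    """Toy success criterion: the sampled grammar parses the datum iff its
    settings match the settings the datum requires (for specified parameters)."""
    return all(grammar[p] == v for p, v in datum.items())

def npl_update(probs, grammar, success):
    """NPL-style linear reward-penalty update: on success, nudge every
    parameter toward its sampled value; on failure, nudge away. All
    parameters get equal credit -- this blind credit assignment is the
    Credit Problem that the EDPL's expectation-based updates address."""
    for p, value in grammar.items():
        target = 1.0 if value else 0.0
        if success:
            probs[p] += GAMMA * (target - probs[p])
        else:
            probs[p] += GAMMA * ((1.0 - target) - probs[p])

random.seed(0)
# Hypothetical parameters; only one is actually constrained by the data.
probs = {"ExtrametricalityOn": 0.5, "TrochaicFeet": 0.5}
data = [{"TrochaicFeet": True}] * 2000

for datum in data:
    g = sample_grammar(probs)
    npl_update(probs, g, parses(g, datum))

print(probs)
```

In this toy run the informative parameter (`TrochaicFeet`) converges toward 1, while the uninformative one (`ExtrametricalityOn`) is rewarded or penalized essentially at random whenever it happens to co-occur with a successful parse, which is the ambiguity-driven failure mode the paper reports for the NPL.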
Original language: English
Pages (from-to): 1-26
Issue number: 1
Publication status: Published - 2021


  • phonology
  • word stress
  • learnability
  • Principles & Parameters
  • probabilistic learning
  • domain-general learning


