The Role of Pauses and Entropy in Learning Nonadjacent Dependencies

Areti Kotsolakou*, Frank Wijnen, Sergey Avrutin

*Corresponding author for this work

Research output: Contribution to conference › Poster › Academic

Abstract

Acquiring language entails inferring rules from limited input. The Entropy Model (EM) states that abstract generalizations are gradually attained as the entropy of the input (its complexity) exceeds channel capacity (CC; the brain's encoding power). Radulescu et al. (2019) confirmed this prediction using an AAB rule. We tested the EM's predictions on nonadjacent dependencies (NADs), which frequently reflect morphosyntactic rules.
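
For concreteness, the input entropy at issue can be read as the Shannon entropy of the distribution of element types in the familiarization stream. A minimal Python sketch, with hypothetical frequency counts rather than the study's actual stimuli:

    import math

    def shannon_entropy(counts):
        """Shannon entropy (in bits) of a frequency distribution."""
        total = sum(counts)
        return -sum(c / total * math.log2(c / total) for c in counts if c > 0)

    # Hypothetical illustration: a uniform distribution over more types
    # yields higher entropy, i.e. more complex input relative to CC.
    print(shannon_entropy([10, 10, 10]))  # 3 types  -> ~1.58 bits
    print(shannon_entropy([10] * 24))     # 24 types -> ~4.58 bits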
Adults were exposed online to an artificial language containing NADs (aXb: a predicts b; X varies) in two entropy conditions (Low/High). The presence or absence of pauses segmenting the stream into aXb items was also manipulated. Pauses reduce entropy, but they also direct attention to individual aXb items, decreasing the attentional resources, and thus the CC, available for encoding distributional/structural information. A grammaticality judgment task with grammatical and ungrammatical test items containing familiar or unfamiliar X-elements tested NAD detection and NAD generalization to novel X-elements, respectively. Higher entropy and lower CC were predicted to promote NAD generalization, whereas lower entropy was predicted to promote NAD detection.
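
A minimal sketch of how such a familiarization stream could be assembled, purely for illustration (the aXb frames, X inventory sizes, and pause marker below are hypothetical placeholders, not the actual stimuli):

    import random

    # Hypothetical aXb frames: each a-element predicts its b-element.
    FRAMES = [("tep", "lut"), ("sot", "jik")]

    def build_stream(n_items, n_x_types, pauses):
        """Concatenate n_items aXb strings drawn from n_x_types middle elements.

        A larger n_x_types stands in for the High-entropy condition;
        pauses=True inserts a silence marker between aXb items.
        """
        x_pool = [f"x{i}" for i in range(n_x_types)]  # placeholder X-elements
        items = []
        for _ in range(n_items):
            a, b = random.choice(FRAMES)
            items.append(f"{a} {random.choice(x_pool)} {b}")
        return (" <pause> " if pauses else " ").join(items)

    # Low-entropy segmented stream vs. high-entropy continuous stream
    print(build_stream(n_items=4, n_x_types=3, pauses=True))
    print(build_stream(n_items=4, n_x_types=24, pauses=False))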
NAD detection and NAD generalization (the rate of preference for grammatical over ungrammatical test items) were higher in the pause than in the no-pause condition, as predicted. NAD detection was higher than NAD generalization, showing that generalization is gradual. Entropy, however, did not affect generalization. This may be due to the absence of cues enhancing generalization that were used in studies that did find an entropy effect. Furthermore, online testing allowed only limited control over participants. Results are discussed in relation to the EM.
Original language: English
Publication status: Published - 1 Jun 2022

Bibliographical note

Funding Information:
This work was supported by the Netherlands Organization for Scientific Research (NWO PGW.20.001).

Publisher Copyright:
Copyright © 2022 Kotsolakou, Wijnen, and Avrutin

Keywords

  • entropy
  • nonadjacent dependencies
  • artificial grammar learning
  • rule induction
  • linguistic generalizations
  • category formation
