The role of entropy in frame-based lexical categorization

Areti Kotsolakou*, Frank Wijnen, Sergey Avrutin

*Corresponding author for this work

Research output: Contribution to conference › Poster › Academic

Abstract

Words in natural languages are organized into grammatical categories. Mintz (2002) suggested that frames (frequently occurring word pairs that span an intermediate target word) facilitate the categorization of that word. Artificial-language studies have demonstrated that dense overlap of distributional cues (frames and adjacent dependencies) across target words enhances category learning (Reeder et al., 2013). However, category learning was tested using only trained intermediate target words, whereas in natural languages category learning requires abstraction away from individual items.
We use the entropy model (Radulescu et al., 2020) to investigate generalization in frame-based categorization. This model provides a quantitative measure of input complexity (entropy) and argues that abstract generalizations are gradually attained as entropy exceeds the learner's processing capacity. We suggest that abstract category learning (generalization) requires high-entropy input.
Adults are exposed to an artificial language in a low-entropy vs a high-entropy condition (sparse vs dense frame/target-word overlap) and tested with grammaticality judgments. Familiar (trained) intervening target words in novel category-conforming vs non-conforming combinations with frames test item-specific category learning. New (untrained) intervening items in category-conforming vs non-conforming combinations with frames test abstract category learning.
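
One way to make the entropy contrast concrete is as Shannon entropy over the distribution of target words within each frame. The sketch below uses made-up frames and target words, not the actual stimuli, and assumes this within-frame formulation; the exact measure in Radulescu et al. (2020) may differ.

import math
from collections import Counter

def shannon_entropy(counts):
    """Shannon entropy (in bits) of a frequency distribution."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def mean_frame_entropy(pairs):
    """Average entropy of the target-word distribution within each frame."""
    by_frame = {}
    for frame, target in pairs:
        by_frame.setdefault(frame, Counter())[target] += 1
    return sum(shannon_entropy(c.values()) for c in by_frame.values()) / len(by_frame)

# Hypothetical (frame, target-word) exposures in the artificial language.
# Low-entropy condition: sparse overlap -- each frame hosts few target words.
low_entropy_input = [("A_B", "x"), ("A_B", "x"), ("A_B", "y"),
                     ("C_D", "z"), ("C_D", "z"), ("C_D", "w")]
# High-entropy condition: dense overlap -- each frame hosts many target words.
high_entropy_input = [("A_B", t) for t in "xyzw"] + [("C_D", t) for t in "xyzw"]

print(mean_frame_entropy(low_entropy_input))   # lower value (~0.92 bits)
print(mean_frame_entropy(high_entropy_input))  # higher value (2.0 bits)

On this construal, denser frame/target-word overlap spreads probability mass over more target words per frame, which is what raises the entropy of the input.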
In line with our predictions, preliminary results suggest that both item-specific and abstract category learning are higher in the high-entropy condition. Furthermore, item-specific category learning is higher than abstract category learning in both conditions; this difference is greater in the low-entropy condition.
Original language: English
Number of pages: 1
Publication status: Published - 5 Jun 2024

Funding

This work was supported by the Netherlands Organization for Scientific Research (NWO PGW.20.001).
