Some properties of mental speech preparation as revealed by self-monitoring

Hugo Quené*, Sieb Nooteboom

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

The main goal of this paper is to improve our insight into the mental preparation of speech, based on speakers' self-monitoring behavior. To this end we re-analyze the aggregated responses from previously published experiments eliciting speech sound errors. The re-analyses confirm or show that (1) “early” and “late” detections of elicited speech sound errors can be distinguished, with a time delay on the order of 500 ms; (2) a main cause of some errors being detected “early”, others “late”, and yet others not at all is the size of the phonetic contrast between the error and the target speech sound; (3) repairs of speech sound errors stem from competing (and sometimes still active) word candidates. These findings lead to some speculative conclusions about the mental preparation of speech. First, there are two successive stages of mental preparation, an “early” and a “late” stage. Second, at the “early” stage of speech preparation, speech sounds are represented as targets in auditory perceptual space; at the “late” stage, as coordinated motor commands necessary for articulation. Third, repairs of speech sound errors stem from response candidates competing for the same slot as the error form, and their activation is often sustained until after articulation.

Original language: English
Article number: 103043
Journal: Speech Communication
Volume: 158
DOIs
Publication status: Published - 9 Feb 2024

Bibliographical note

Publisher Copyright:
© 2024 The Author(s)

Keywords

  • Auditory perceptual targets
  • Motor commands
  • Self-monitoring
  • Speech errors
  • Speech preparation

Fingerprint

  • Mental speech preparation

    Quené, H. & Nooteboom, S., 17 Feb 2024

    Research output: Non-textual form › Data set/Database › Academic

    Open Access