Evaluating discourse annotation: Some recent insights and new approaches

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


Abstract

Annotated data is an important resource for the linguistics community, which is why researchers need to be sure that such data are reliable. However, arriving at sufficiently reliable annotations appears to be an issue within the field of discourse, possibly because coherence is a mental phenomenon rather than a textual one. In this paper, we discuss recent insights and developments regarding annotation and reliability evaluation that are relevant to the field of discourse. We focus on characteristics of coherence that impact reliability scores and examine how different measures are affected by them. We discuss the benefits and disadvantages of these measures, and propose that discourse annotation results be accompanied by a detailed report of the annotation process and data, as well as a careful consideration of the reliability measure that is applied.
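The abstract does not name the specific reliability measures it evaluates; as a generic illustration of the kind of chance-corrected agreement measure commonly reported in annotation studies, the sketch below computes Cohen's kappa for two annotators from scratch. The function name and the example relation labels are assumptions, not taken from the paper.

```python
def cohen_kappa(annotator_a, annotator_b):
    """Chance-corrected agreement between two annotators over the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is the agreement expected by chance from each annotator's label
    distribution.
    """
    assert len(annotator_a) == len(annotator_b), "annotators must label the same items"
    n = len(annotator_a)
    labels = set(annotator_a) | set(annotator_b)

    # Observed agreement: fraction of items both annotators labelled identically.
    p_o = sum(x == y for x, y in zip(annotator_a, annotator_b)) / n

    # Expected agreement: product of each annotator's marginal label probabilities.
    p_e = sum(
        (annotator_a.count(lab) / n) * (annotator_b.count(lab) / n)
        for lab in labels
    )
    return (p_o - p_e) / (1 - p_e)


# Hypothetical coherence-relation labels for four discourse segments:
a = ["cause", "cause", "contrast", "cause"]
b = ["cause", "contrast", "contrast", "cause"]
print(cohen_kappa(a, b))  # 0.5: observed 0.75, chance-expected 0.5
```

Note that kappa is sensitive to skewed label distributions, which is one way the characteristics of a coherence annotation task can depress a reliability score even when raw agreement is high.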
Original language: English
Title of host publication: Proceedings 13th Joint ISO - ACL Workshop on Interoperable Semantic Annotation (isa-13)
Pages: 1-13
Publication status: Published - 2017

