A dataset of continuous affect annotations and physiological signals for emotion analysis

Karan Sharma, Claudio Castellini, E.L. van den Broek, Alin Albu-Schaeffer, Friedhelm Schwenker

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

From a computational viewpoint, emotions continue to be intriguingly hard to understand. In research, a direct, real-time inspection in realistic settings is not possible, so discrete, indirect, post-hoc recordings are the norm. As a result, proper emotion assessment remains a problematic issue. The Continuously Annotated Signals of Emotion (CASE) dataset provides a solution, as it focuses on real-time continuous annotation of emotions as experienced by the participants while watching various videos. For this purpose, a novel, intuitive joystick-based annotation interface was developed that allowed simultaneous reporting of valence and arousal, dimensions that are otherwise often annotated independently. In parallel, eight high-quality, synchronized physiological recordings (1000 Hz, 16-bit ADC) were obtained from ECG, BVP, EMG (3x), GSR (or EDA), respiration and skin temperature sensors. The dataset consists of the physiological and annotation data from 30 participants, 15 male and 15 female, who watched several validated video stimuli. The validity of the emotion induction, as exemplified by the annotation and physiological data, is also presented.
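
As a rough illustration of how such data might be handled, the sketch below pairs the continuous valence/arousal annotations with downsampled physiological signals for one participant. It is not taken from the paper: the file paths, column names (daqtime, valence, arousal, ecg, gsr, rsp, skt) and the 20 Hz target rate are assumptions made purely for the example, not the dataset's documented schema.

import pandas as pd

# Hypothetical file layout: one CSV of 1000 Hz physiological signals and one
# CSV of joystick annotations per participant. Paths and columns are assumed.
physio = pd.read_csv("physiological/sub_1.csv")
annot = pd.read_csv("annotations/sub_1.csv")

# Index both streams by their acquisition time stamps (assumed milliseconds).
physio = physio.set_index(pd.to_timedelta(physio["daqtime"], unit="ms"))
annot = annot.set_index(pd.to_timedelta(annot["daqtime"], unit="ms"))

# Average the physiology into 50 ms (20 Hz) windows so every valence/arousal
# sample can be paired with a summary of the synchronized signals.
physio_ds = physio[["ecg", "gsr", "rsp", "skt"]].resample("50ms").mean()
annot_ds = annot[["valence", "arousal"]].resample("50ms").mean()

merged = physio_ds.join(annot_ds, how="inner")
print(merged.describe())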
Original language: English
Article number: 196
Number of pages: 13
Journal: Scientific Data
Volume: 6
DOIs
Publication status: Published - 9 Oct 2019

Keywords

  • continuous affect annotations
  • physiological signals
  • emotion
  • data set
  • ECG
  • BVP
  • EMG
  • GSR
  • EDA
  • joystick
  • interface
