Computational modeling of induced emotion using GEMS

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

    Abstract

    Most researchers in the field of automatic music emotion recognition
    focus on the two-dimensional valence-arousal model. However, this
    model does not account for the full diversity of emotions that music
    can express. Moreover, in many cases it may be important to model
    induced (felt) emotion rather than perceived emotion. In this paper
    we explore a multidimensional emotional space, the Geneva Emotional
    Music Scales (GEMS), which addresses both of these issues. We
    collected the data for our study using a game with a purpose. We
    exploit a comprehensive set of features from several state-of-the-art
    toolboxes, propose a new set of harmonically motivated features, and
    compare the performance of these feature sets. Additionally, we use
    expert human annotations to explore the dependency between
    musicologically meaningful characteristics of music and the emotional
    categories of GEMS, demonstrating the need for algorithms that can
    better approximate human perception.
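
    As an informal illustration of this kind of pipeline, the sketch below
    extracts harmony-related descriptors (chroma and tonnetz) alongside basic
    timbre and energy features and fits a multi-output regressor to GEMS
    ratings. It uses librosa and scikit-learn as stand-ins; the toolboxes,
    feature set, and learning algorithm actually used in the paper are not
    specified here, and the file list and rating matrix are hypothetical.

        import numpy as np
        import librosa
        from sklearn.ensemble import RandomForestRegressor

        def extract_features(path):
            """Summarize one audio excerpt as a fixed-length feature vector."""
            y, sr = librosa.load(path)                          # resampled to 22.05 kHz by default
            chroma = librosa.feature.chroma_cqt(y=y, sr=sr)     # pitch-class content (harmony)
            tonnetz = librosa.feature.tonnetz(y=y, sr=sr)       # tonal-centroid (harmonic) relations
            mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # timbre
            rms = librosa.feature.rms(y=y)                      # energy
            # Mean and standard deviation over time for each frame-level descriptor
            return np.concatenate([np.r_[f.mean(axis=1), f.std(axis=1)]
                                   for f in (chroma, tonnetz, mfcc, rms)])

        # Hypothetical data: audio_paths is a list of excerpt files, gems_ratings is an
        # (n_excerpts x 9) array of averaged listener ratings on the nine GEMS scales.
        # X = np.vstack([extract_features(p) for p in audio_paths])
        # model = RandomForestRegressor(n_estimators=200, random_state=0)
        # model.fit(X, gems_ratings)              # multi-output regression over GEMS dimensions
        # predicted = model.predict(X[:1])        # predicted GEMS profile for one excerpt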
    Original language: English
    Title of host publication: Proceedings of the 15th Conference of the International Society for Music Information Retrieval (ISMIR 2014)
    Pages: 373-378
    Publication status: Published - 2014
    Event: International Society for Music Information Retrieval, Taiwan, Province of China
    Duration: 27 Oct 2014 - 31 Oct 2014

    Conference

    Conference: International Society for Music Information Retrieval
    Country/Territory: Taiwan, Province of China
    Period: 27/10/14 - 31/10/14
