Melody Retrieval and Classification Using Biologically-Inspired Techniques

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

    Abstract

    Retrieval and classification are at the center of Music Information Retrieval research. Both tasks rely on a method to assess the similarity between two music documents. In the context of symbolically encoded melodies, pairwise alignment via dynamic programming has been the most widely used method. However, this approach fails to scale up well in terms of time complexity and insufficiently models the variance between melodies of the same class. Compact representations and indexing techniques that capture the salient and robust properties of music content are increasingly important. We adapt two existing bioinformatics tools to improve the melody retrieval and classification tasks. On two datasets of folk tunes and cover song melodies, we apply the extremely fast indexing method of the Basic Local Alignment Search Tool (BLAST) and achieve classification performance comparable to exhaustive approaches. We increase retrieval performance and efficiency by using multiple sequence alignment algorithms to locate variation patterns, and profile hidden Markov models to incorporate those patterns into a similarity model.
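To make the baseline concrete: pairwise alignment of symbolically encoded melodies via dynamic programming can be sketched as a Needleman-Wunsch-style global alignment over pitch-interval sequences. The interval encoding, scoring values, and function names below are illustrative assumptions, not the representation or parameters used in the paper.

```python
# Sketch of pairwise melody alignment via dynamic programming
# (Needleman-Wunsch-style global alignment). Encoding and scores
# are illustrative, not taken from the paper.

def to_intervals(pitches):
    """Encode a melody (MIDI pitch numbers) as successive pitch intervals."""
    return [b - a for a, b in zip(pitches, pitches[1:])]

def align_score(seq_a, seq_b, match=1, mismatch=-1, gap=-1):
    """Return the optimal global alignment score of two interval sequences."""
    n, m = len(seq_a), len(seq_b)
    # dp[i][j] = best score aligning seq_a[:i] with seq_b[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if seq_a[i - 1] == seq_b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + sub,   # substitution/match
                           dp[i - 1][j] + gap,       # gap in seq_b
                           dp[i][j - 1] + gap)       # gap in seq_a
    return dp[n][m]

# A tune and its transposition share the same interval sequence, so the
# interval encoding makes the similarity transposition-invariant.
tune = [60, 62, 64, 65, 67]      # C D E F G
variant = [62, 64, 66, 67, 69]   # same contour, transposed up a tone
print(align_score(to_intervals(tune), to_intervals(variant)))  # -> 4
```

Filling the O(nm) dynamic-programming table for every melody pair is exactly the quadratic cost that BLAST-style indexing avoids by first matching short exact seeds.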
    Original language: English
    Title of host publication: EvoMusArt 2017, 6th International Conference on Evolutionary and Biologically Inspired Music and Art
    Subtitle of host publication: Computational Intelligence in Music, Sound, Art and Design
    Editors: João Correia, Vic Ciesielski, Antonios Liapis
    Publisher: Springer
    Pages: 49-64
    ISBN (Electronic): 978-3-319-55750-2
    ISBN (Print): 978-3-319-55749-6
    Publication status: Published - 2017

    Publication series

    Name: Lecture Notes in Computer Science
    Volume: 10198
