Abstract
Musical patterns are salient passages that appear repeatedly in a piece of music. Such passages are vital for compression, classification and prediction tasks in MIR, and algorithms employing different techniques have been proposed to find musical patterns automatically. Human-annotated patterns have been collected and used to evaluate pattern discovery algorithms, e.g., in the MIREX Discovery of Repeated Themes & Sections task. However, state-of-the-art algorithms are not yet able to reproduce human-annotated patterns. To understand what gives rise to the discrepancy between algorithmically extracted patterns and human-annotated patterns, we use jSymbolic to extract features from patterns, visualise the feature space using PCA, and perform a comparative analysis using classification techniques. We show that it is possible to classify algorithmically extracted patterns, human-annotated patterns and randomly sampled passages. This implies that: (a) algorithmically extracted patterns possess different properties from human-annotated patterns; (b) algorithmically extracted patterns have different structures from randomly sampled passages; and (c) human-annotated patterns contain more information than randomly sampled passages, despite the subjectivity involved in the annotation process. We further find that rhythmic features are of high importance in the classification process, which should influence future research on automatic pattern discovery.
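The pipeline described above (feature vectors per passage → PCA projection for visualisation → a classifier over the three groups) can be sketched in miniature. This is an illustrative stand-in, not the paper's actual code: the feature vectors here are synthetic Gaussian data standing in for jSymbolic features, and a nearest-centroid rule stands in for whatever classifier the study used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for jSymbolic feature vectors (e.g. rhythmic
# and pitch statistics) for the three groups compared in the paper:
# algorithmically extracted patterns, human annotations, random passages.
n, d = 60, 8
groups = {
    "algorithm": rng.normal(0.0, 1.0, (n, d)),
    "human": rng.normal(1.5, 1.0, (n, d)),
    "random": rng.normal(-1.5, 1.0, (n, d)),
}

X = np.vstack(list(groups.values()))
labels = np.repeat(list(groups.keys()), n)

# PCA via SVD: centre the features and project onto the top two
# principal components, as in the visualisation step.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X2 = Xc @ Vt[:2].T  # 2-D coordinates, one row per passage

# Nearest-centroid classification: if the three groups occupy
# distinguishable regions of feature space, accuracy is high.
# (No train/test split here; this is only a separability sketch.)
centroids = {k: v.mean(axis=0) for k, v in groups.items()}

def predict(x):
    return min(centroids, key=lambda k: np.linalg.norm(x - centroids[k]))

preds = np.array([predict(x) for x in X])
accuracy = (preds == labels).mean()
```

With well-separated synthetic groups, `accuracy` is near 1.0, mirroring the paper's finding that the three kinds of passages are classifiable; with real jSymbolic features the separation would of course be weaker and a held-out test set would be needed.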
Original language | English |
---|---|
Pages | 539-546 |
Number of pages | 8 |
Publication status | Published - 2018 |
Event | The 19th International Society for Music Information Retrieval Conference, Paris, France, 23 Sept 2018 → 27 Sept 2018 |
Conference
Conference | The 19th International Society for Music Information Retrieval Conference |
---|---|
Country/Territory | France |
City | Paris |
Period | 23/09/18 → 27/09/18 |