Semantic Web Journal, Special Issue on Deep Learning for Knowledge Graphs, IOS Press, 14 September 2021
An important problem in large symbolic music collections is the low availability of high-quality metadata, which is essential for various information retrieval tasks. Traditionally, systems have addressed this by relying either on costly human annotations or on rule-based systems of limited scale. Recently, embedding strategies have been exploited to represent latent factors in graphs of connected nodes. In this work, we propose MIDI2vec, a new approach for representing MIDI files as vectors based on graph embedding techniques. Our strategy consists of representing the MIDI data as a graph, including information about tempo, time signature, programs, and notes. Next, we run and optimise node2vec to generate embeddings using random walks over the graph. We demonstrate that the resulting vectors can be successfully employed to predict the musical genre and other metadata such as the composer, the instrument, or the movement. In particular, we conduct experiments using these vectors as input to a feed-forward neural network and report accuracy scores comparable to those of other approaches relying purely on symbolic music, while avoiding feature engineering and producing highly scalable, reusable models with low dimensionality. Our proposal has real-world applications in automated metadata tagging for symbolic music, for example in digital libraries for musicology, datasets for machine learning, and knowledge graph completion.
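The pipeline described above (MIDI features as a graph, random walks over that graph, walks fed to an embedding model) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the node names and edge list are invented toy data, and plain uniform random walks stand in for node2vec's biased second-order walks (controlled by its p and q parameters).

```python
import random

# Toy graph linking MIDI files to their features (tempo, program, notes).
# Node names are illustrative; MIDI2vec builds a far larger graph from
# real MIDI files, also including time-signature nodes.
edges = [
    ("midi:song1", "tempo:120"), ("midi:song1", "program:piano"),
    ("midi:song1", "note:C4"),   ("midi:song2", "tempo:120"),
    ("midi:song2", "program:violin"), ("midi:song2", "note:E4"),
]

# Undirected adjacency list.
graph = {}
for a, b in edges:
    graph.setdefault(a, []).append(b)
    graph.setdefault(b, []).append(a)

def random_walks(graph, num_walks=10, walk_length=5, seed=42):
    """Uniform random walks from every node.

    node2vec instead uses biased second-order walks (p, q parameters);
    uniform walks are used here only to keep the sketch self-contained.
    """
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in graph:
            walk = [start]
            while len(walk) < walk_length:
                walk.append(rng.choice(graph[walk[-1]]))
            walks.append(walk)
    return walks

walks = random_walks(graph)
# Each walk is a "sentence" of node IDs; feeding these sentences to a
# skip-gram model (e.g. gensim's Word2Vec) yields one low-dimensional
# vector per node. The vectors of the midi:* nodes are then used as
# input features for a downstream classifier.
print(len(walks), walks[0])
```

Songs sharing features (here, `tempo:120`) end up co-occurring in walks, which is what lets the embedding place similar MIDI files close together before any classifier is trained.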
Type: Journal
Date: 2021-09-14
Department: Data Science
Eurecom Ref: 6680
Copyright: © IOS Press. Personal use of this material is permitted. The definitive version of this paper was published in the Semantic Web Journal, Special Issue on Deep Learning for Knowledge Graphs, IOS Press, 14 September 2021, and is available at: http://dx.doi.org/10.3233/SW-210446
See also: