Guitar-Set, a New Dataset for Music Information Retrieval
Juan P. Bello, NYU CDS affiliate, is building a new dataset of high-quality annotated guitar performances
Rock-n-roll. Classical. Country. Blues. Punk. Pop. The guitar can do it all. Its expressive range attracts musicians and listeners alike, and it provides engaging material for music research. But for researchers in the music information retrieval (MIR) community, the guitar’s versatility can also complicate analysis.
To facilitate guitar-related music research, CDS affiliated faculty member Juan P. Bello, Associate Professor of Music and Music Education, is developing a new dataset, called Guitar-Set, that consists of recorded guitar performances with detailed annotations. Many existing methods for automated analysis of guitar recordings depend on high-quality labeled datasets, but labeling these datasets takes significant time and money.
Guitar-Set, created in collaboration with researchers from the NYU Music and Audio Research Lab and the Queen Mary University of London Centre for Digital Music, eases the difficulty and expense of labeling musical datasets by automating as much of the process as possible. The method involves recording guitar performances with both a microphone and a hexaphonic pickup, a device attached to the guitar that magnetically captures each string individually. The researchers recorded 16-bar performances on acoustic and electric guitars.
Based on the hexaphonic recordings, Guitar-Set includes note-level annotations of string and fret position, along with metadata about tempo, key, style, beat and downbeat, and chords. Guitar-Set’s objective is to create these note-level annotations from the individually recorded strings, but the researchers found that fully automating this process is challenging. They encountered a high rate of false positives on the hexaphonic recordings, caused by picked-up string vibrations that were not audible on the microphone-recorded track. While this still requires manually comparing the two types of recordings, Bello and collaborators were able to mitigate some of the false positives by first running the hexaphonic data through a bleed-removal algorithm.
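The cross-checking idea described above can be sketched in a few lines. This is not the authors' actual bleed-removal algorithm; it is a minimal illustration, with hypothetical function and parameter names, of one way false positives might be filtered: an onset detected on a hexaphonic string channel is kept only if the microphone track shows audible energy at that moment.

```python
def filter_bleed_onsets(string_onsets, mic_envelope, hop_s, threshold):
    """Keep hexaphonic onsets that coincide with audible energy in the
    microphone recording; treat the rest as likely bleed (illustrative
    sketch only, not the published algorithm).

    string_onsets: onset times in seconds, detected on one string channel
    mic_envelope:  frame-wise energy of the microphone track
    hop_s:         seconds per envelope frame
    threshold:     minimum mic energy for an onset to count as audible
    """
    kept = []
    for t in string_onsets:
        frame = int(round(t / hop_s))  # map onset time to an envelope frame
        if frame < len(mic_envelope) and mic_envelope[frame] >= threshold:
            kept.append(t)  # audible on the mic track: keep the onset
    return kept

# Toy example: the onset at 0.2 s has no microphone energy behind it,
# so it is discarded as probable bleed.
onsets = filter_bleed_onsets([0.1, 0.2, 0.3], [0.0, 1.0, 0.0, 0.8],
                             hop_s=0.1, threshold=0.5)
```

In practice such a check would operate on spectral features rather than a single energy envelope, but the principle is the same: the microphone recording serves as a reference for what was actually audible.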
Despite the challenges, Guitar-Set will be a useful resource for researchers in the MIR community. Bello and collaborators also anticipate that the new method will aid other tasks such as source separation, understanding sympathetic resonance in the acoustic guitar, and understanding guitarists’ right-hand activity.
By Paul Oliver