P4-01: Performance MIDI-to-score conversion by neural beat tracking
Liu, Lele*, Kong, Qiuqiang, Morfi, Veronica, Benetos, Emmanouil
Subjects (starting with primary): Domain knowledge -> machine learning/artificial intelligence for music ; MIR tasks -> music transcription and annotation ; Musical features and properties -> rhythm, beat, tempo ; MIR fundamentals and methodology -> symbolic music processing
Presented in person in Bengaluru: 10-minute long-format presentation
Rhythm quantisation is an essential part of converting performance MIDI recordings into musical scores. Previous work on rhythm quantisation has been limited to probabilistic or statistical methods. In this paper, we propose a MIDI-to-score quantisation method using a convolutional-recurrent neural network (CRNN) trained on MIDI note sequences to predict whether notes fall on beats. We then extend the CRNN model to predict quantised times for all beat and non-beat notes. Furthermore, we enable the model to predict the key signatures, time signatures, and hand parts of all notes. Our proposed performance MIDI-to-score system significantly outperforms commercial software when evaluated with the MV2H metric. We release the toolbox for converting performance MIDI into MIDI scores at: https://github.com/cheriell/PM2S.
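To make the quantisation step concrete, the sketch below shows one common post-processing idea: once beat times have been predicted (here hard-coded as an assumption), performed note onsets are snapped to the nearest point on a grid of beat subdivisions. This is an illustrative sketch only; the function name, grid construction, and subdivision count are hypothetical and do not reflect the PM2S API, which predicts quantised times directly with the CRNN.

```python
# Hypothetical sketch: snap performed onsets to a grid derived from
# predicted beat times. Not the PM2S implementation.

def quantise_onsets(onsets, beats, subdivisions=4):
    """Snap each onset (in seconds) to the nearest grid point, where the
    grid places `subdivisions` equal steps between consecutive predicted
    beats. Returns quantised positions in beat units."""
    # Build (time_in_seconds, position_in_beats) grid points.
    grid = []
    for i in range(len(beats) - 1):
        step = (beats[i + 1] - beats[i]) / subdivisions
        for k in range(subdivisions):
            grid.append((beats[i] + k * step, i + k / subdivisions))
    grid.append((beats[-1], float(len(beats) - 1)))  # final beat itself
    # Snap every onset to its closest grid point.
    quantised = []
    for t in onsets:
        _, pos = min(grid, key=lambda g: abs(g[0] - t))
        quantised.append(pos)
    return quantised

beats = [0.0, 0.52, 1.01, 1.55]       # assumed predicted beat times (s)
onsets = [0.02, 0.27, 0.50, 1.29]     # performed note onsets (s)
print(quantise_onsets(onsets, beats)) # → [0.0, 0.5, 1.0, 2.5]
```

A grid-snapping rule like this handles expressive timing deviations around a known beat track; the paper's contribution is to learn both the beat locations and the quantised times jointly from note sequences rather than relying on a fixed rule.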