Music-Driven Animation Generation of Expressive Musical Gestures

Document Type

Master Thesis

License

CC-BY-NC-ND

Abstract

While audio-driven synthesis of face and gesture motion has been studied before, to our knowledge no research has yet addressed the automatic generation of musical gestures for virtual humans. Existing work focuses either on the precise 3D finger movements required to play an instrument or on expressive musical gestures derived from 2D video data. In this paper, we propose a music-driven piano performance generation method using 3D motion capture data and recurrent neural networks. Our results show that it is feasible to automatically generate expressive musical gestures for piano playing from various audio and musical features. However, it is not yet clear which features work best for which type of music. In future work we aim to test further with other datasets, deep learning methods, and musical instruments, using both objective and subjective evaluations.
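The core idea described above, a recurrent network (the keywords name an LSTM) that maps per-frame audio features to 3D joint positions, can be sketched as follows. This is a minimal illustration only, not the thesis's actual architecture: the feature dimensions, joint count, and untrained random weights are all hypothetical assumptions, and a real system would be trained on motion capture data.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell: maps one input frame plus state to a new hidden state."""
    def __init__(self, input_dim, hidden_dim):
        self.hidden_dim = hidden_dim
        # One stacked weight matrix for the four gates (input, forget, output, candidate).
        self.W = rng.normal(0.0, 0.1, (4 * hidden_dim, input_dim + hidden_dim))
        self.b = np.zeros(4 * hidden_dim)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, o, g = np.split(z, 4)
        i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
        c = f * c + i * g          # update cell state
        h = o * np.tanh(c)         # new hidden state
        return h, c

def generate_gestures(audio_features, cell, W_out):
    """Run the LSTM over a sequence of audio feature frames and
    project each hidden state to a flattened 3D pose vector."""
    h = np.zeros(cell.hidden_dim)
    c = np.zeros(cell.hidden_dim)
    poses = []
    for x in audio_features:
        h, c = cell.step(x, h, c)
        poses.append(W_out @ h)    # 3 coordinates per joint, flattened
    return np.stack(poses)

# Hypothetical dimensions: 26 audio features per frame (e.g. MFCCs plus
# onset strength), 64 hidden units, 17 upper-body joints * 3 coordinates.
N_FEAT, N_HIDDEN, N_JOINTS = 26, 64, 17
cell = LSTMCell(N_FEAT, N_HIDDEN)
W_out = rng.normal(0.0, 0.1, (N_JOINTS * 3, N_HIDDEN))

frames = rng.normal(size=(100, N_FEAT))   # 100 frames of (random stand-in) audio features
poses = generate_gestures(frames, cell, W_out)
print(poses.shape)  # one 51-dimensional pose per audio frame: (100, 51)
```

In a trained system of this kind, the sequence-to-sequence structure lets each generated pose depend on the musical context accumulated so far, which is what allows expressive, music-dependent motion rather than a fixed per-note mapping.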

Keywords

music-driven animation; audio-driven animation; virtual characters; musical gestures; neural networks; music-driven gestures; gesture animation; expressive gestures; LSTM
