Data-Driven Gaze Animation using Recurrent Neural Networks

Document Type

Master Thesis


License

CC-BY-NC-ND

Abstract

We present a real-time gaze animation system based on recurrent neural networks. The network is trained on both motion capture data and video from a head-mounted camera to predict the motion of the body and the eyes. The system is trained separately on different poses, e.g. standing, sitting, and lying down, and is able to learn per-pose constraints on movement. We also present a simplified version of the network for scenarios that allow lower-detail gaze animation. We compare several neural network architectures and show that our method can learn realistic gaze motion from the data while maintaining performance. Results from a user study conducted among game industry professionals show that our method significantly improves the perceived naturalness of the gaze animation compared to a manually created procedural gaze system.
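To make the core idea concrete, the following is a minimal sketch of a recurrent network that maps a sequence of per-frame pose features to gaze predictions, one frame at a time. This is an illustrative toy, not the thesis's actual architecture: the class name, feature dimensions, and output parameterization (yaw/pitch angles) are all assumptions, and the weights here are random rather than trained.

```python
import numpy as np

class SimpleGazeRNN:
    """Toy vanilla RNN: pose features in, gaze angles out (illustrative only)."""

    def __init__(self, in_dim, hidden_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_dim)
        # Small random weights; a real system would learn these from
        # motion capture and head-mounted camera data.
        self.W_in = rng.uniform(-s, s, (hidden_dim, in_dim))
        self.W_h = rng.uniform(-s, s, (hidden_dim, hidden_dim))
        self.W_out = rng.uniform(-s, s, (out_dim, hidden_dim))
        self.b_h = np.zeros(hidden_dim)
        self.b_out = np.zeros(out_dim)

    def forward(self, frames):
        """frames: (T, in_dim) pose features; returns (T, out_dim) predictions."""
        h = np.zeros(self.W_h.shape[0])  # recurrent state carried across frames
        outputs = []
        for x in frames:
            h = np.tanh(self.W_in @ x + self.W_h @ h + self.b_h)
            outputs.append(self.W_out @ h + self.b_out)
        return np.stack(outputs)

# Example: 60 frames of 12-D pose features -> 2-D gaze angles (yaw, pitch).
net = SimpleGazeRNN(in_dim=12, hidden_dim=32, out_dim=2)
gaze = net.forward(np.zeros((60, 12)))
print(gaze.shape)  # (60, 2)
```

The recurrent state is what lets the model produce temporally coherent gaze motion rather than treating each frame independently; the per-pose training described in the abstract would correspond to fitting separate instances (or conditioning) of such a network per pose category.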

Keywords

Motion Capture; Neural Networks; Animation; Recurrent Neural Networks; Games; Gaze Animation