Taku Komura delivers keynote speech at CVMP 2017

CVMP 2017 - The 14th European Conference on Visual Media Production

11-12th December 2017 - London

Title: Learning representations for character motion synthesis

Abstract: In this talk I will present two different directions in character motion synthesis. I will first briefly describe our earlier exploration of hand-crafted features for synthesising and retargeting motions that involve close interactions. Here we make use of topological/geometric notions such as Gauss linking numbers and Laplacian coordinates as invariant criteria for guiding movements or preserving the spatial relations between body parts when retargeting movements. Next, I will talk about our recent work in which we use neural networks to learn from a large amount of motion capture data, and its application to character motion synthesis. By conducting temporal convolution, we construct an autoencoder of human motion that is useful for motion denoising and editing. We then regress user inputs, such as lines drawn on the ground or end-effector trajectories, to the full-body motion via the autoencoder, which can significantly simplify the production process of character animation. I will also introduce the phase-functioned neural network, in which we regress the geometry of the environment as well as the user inputs to the character motion for real-time character control; this can be applied to computer games and robot control. Finally, I will discuss potential directions to explore for representation learning in human motion.
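
To make the autoencoder idea above concrete, the following is a minimal Python (PyTorch) sketch of a temporal-convolution autoencoder over motion-capture clips; the channel count, kernel width, layer sizes and all names are illustrative assumptions rather than the parameters of the published network.

    import torch
    import torch.nn as nn

    class MotionAutoencoder(nn.Module):
        # Sketch: input is a clip of shape (batch, channels, frames), where the
        # channels are flattened joint parameters (66 is an assumed figure).
        def __init__(self, n_channels=66, hidden=256, kernel=25):
            super().__init__()
            pad = kernel // 2
            # Encoder: temporal (1-D) convolution plus pooling compresses the
            # clip into a lower-dimensional motion representation.
            self.encoder = nn.Sequential(
                nn.Conv1d(n_channels, hidden, kernel, padding=pad),
                nn.ReLU(),
                nn.MaxPool1d(2),
            )
            # Decoder: upsample back to the original frame rate and channel count.
            self.decoder = nn.Sequential(
                nn.Upsample(scale_factor=2, mode="nearest"),
                nn.Conv1d(hidden, n_channels, kernel, padding=pad),
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    # Denoising use: corrupt a clip and train the network to reconstruct the
    # clean version, so projecting through the learned representation removes noise.
    model = MotionAutoencoder()
    clean = torch.randn(8, 66, 240)                 # 8 clips, 66 channels, 240 frames
    noisy = clean + 0.1 * torch.randn_like(clean)
    loss = nn.functional.mse_loss(model(noisy), clean)
    loss.backward()

A separate regression network can then be trained to map sparse user inputs (for example, a line drawn on the ground or an end-effector trajectory) into the same learned representation, with the decoder producing the full-body motion.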
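
The phase-functioned neural network mentioned above can be sketched, under the same caveats, as a network whose weights are not fixed but are generated from the phase of the locomotion cycle; the toy layer below blends four control weight sets with a cyclic Catmull-Rom spline, with all sizes and names assumed purely for illustration.

    import numpy as np

    def cyclic_catmull_rom(w0, w1, w2, w3, t):
        # Cubic Catmull-Rom interpolation between w1 and w2 at t in [0, 1).
        return (w1
                + t * 0.5 * (w2 - w0)
                + t**2 * (w0 - 2.5 * w1 + 2.0 * w2 - 0.5 * w3)
                + t**3 * (1.5 * (w1 - w2) + 0.5 * (w3 - w0)))

    class PhaseFunctionedLayer:
        # Illustrative single dense layer whose weights vary smoothly and
        # cyclically with the phase variable (all sizes are assumptions).
        def __init__(self, n_in, n_out, seed=0):
            rng = np.random.default_rng(seed)
            # Four control weight sets spaced evenly around the phase circle.
            self.W = rng.standard_normal((4, n_out, n_in)) * 0.01
            self.b = np.zeros((4, n_out))

        def __call__(self, x, phase):
            # Map phase in [0, 2*pi) to a position among the four control points.
            p = (4.0 * phase / (2.0 * np.pi)) % 4.0
            k, t = int(p), p - int(p)
            idx = [(k - 1) % 4, k % 4, (k + 1) % 4, (k + 2) % 4]
            W = cyclic_catmull_rom(*[self.W[i] for i in idx], t)
            b = cyclic_catmull_rom(*[self.b[i] for i in idx], t)
            return np.maximum(W @ x + b, 0.0)   # ReLU here purely for brevity

    # Example: regress a (hypothetical) feature vector combining user commands and
    # terrain geometry to pose features, with weights changing as the phase advances.
    layer = PhaseFunctionedLayer(n_in=48, n_out=32)
    pose_features = layer(np.random.standard_normal(48), phase=1.3)

Because the weights are a smooth cyclic function of phase, the control mapping changes continuously over the gait cycle while each forward pass remains a small fixed-cost network, which is what makes this style of controller suitable for real-time use.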

Bio: Taku Komura is a Reader (Associate Professor) at the Institute of Perception, Action and Behaviour, School of Informatics, University of Edinburgh. As the leader of the Computer Animation and Robotics Group, his research has focused on data-driven character animation, physically-based character animation, crowd simulation, cloth animation, anatomy-based modelling, and robotics. Recently, his main research interest has been the application of neural networks to animation synthesis. He is a recipient of the Royal Society Industry Fellowship (2014) and the Google AR/VR Research Award (2017).

His webpage can be viewed at the following link: http://homepages.inf.ed.ac.uk/tkomura/
