Title Automating Expressive Locomotion Generation (Article)
In Transactions on Edutainment VII
Author(s) Yejin Kim, Michael Neff
Keyword(s) Character animation, Locomotion style, Motion transition, Motion path, Motion capture data
Year 2012
Location Chengdu, China
Date May 2011
Volume LNCS 7145
Publisher Springer
Pages 48--61
Abstract This paper introduces a system for expressive locomotion generation that takes as input a set of sample locomotion clips and a motion path. Significantly, the system only requires a single sample of straight-path locomotion for each style modeled and can produce output locomotion for an arbitrary path with arbitrary motion transition points. For efficient locomotion generation, we represent each sample with a loop sequence which encapsulates its key style and utilize these sequences throughout the synthesis process. Several techniques are applied to automate the synthesis: foot-plant detection from unlabeled samples, estimation of an adaptive blending length for a natural style change, and a post-processing step for enhancing the physical realism of the output animation. Compared to previous approaches, the system requires significantly less data and manual labor, while supporting a large range of styles.
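
As a rough illustration of the kind of foot-plant detection the abstract mentions, the sketch below labels frames where a foot joint is both low and nearly stationary. This is a generic heuristic, not the paper's actual method; the function name, the thresholds, and the Y-up coordinate convention are assumptions made for the example.

```python
import numpy as np

def detect_foot_plants(foot_positions, dt, height_thresh=0.03, speed_thresh=0.15):
    """Flag frames where a foot appears planted in unlabeled motion capture data.

    foot_positions : (N, 3) array of one foot joint's world positions per frame.
    dt             : time step between frames, in seconds.
    Thresholds are illustrative values (meters, meters/second), not taken from the paper.
    Returns a boolean array of length N.
    """
    heights = foot_positions[:, 1]                        # assumes a Y-up world frame
    velocities = np.gradient(foot_positions, dt, axis=0)  # finite-difference velocity
    speeds = np.linalg.norm(velocities, axis=1)
    return (heights < height_thresh) & (speeds < speed_thresh)
```

In practice, a post-pass that merges very short gaps and discards very short plants would make the labels more stable; the paper's system derives such plant intervals automatically from the unlabeled sample clips.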