Data-Driven Biped Control
Our data-driven controller allows a physically simulated biped character to reproduce challenging motor skills captured in motion data.
Personnel
Yoonsang Lee, Sungeun Kim, Jehee Lee
Abstract
We present a dynamic controller to physically simulate under-actuated three-dimensional full-body biped locomotion. Our data-driven controller takes motion capture reference data to reproduce realistic human locomotion through real-time physically based simulation. The key idea is to modulate the reference trajectory continuously and seamlessly so that even a simple dynamic tracking controller can follow it while maintaining balance. In our framework, biped control can be facilitated by a large array of existing data-driven animation techniques because our controller can take a stream of reference data generated on the fly at runtime. We demonstrate the effectiveness of our approach through examples that allow bipeds to turn, spin, and walk while steering their direction interactively.
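To make the idea of tracking a continuously modulated reference concrete, the Python sketch below shows a joint-space PD tracking servo whose reference pose is nudged by a balance-feedback term before tracking. The gains, the state layout, the modulate_reference and tracking_torques helpers, and the SIMBICON-style feedback rule are illustrative assumptions for this sketch only, not the formulation used in the paper.

import numpy as np

# Assumed PD and balance-feedback gains (illustrative values only).
KP, KD = 300.0, 30.0
C_D, C_V = 0.2, 0.2

def modulate_reference(q_ref, com_offset, com_vel, swing_hip_idx):
    # Continuously adjust the reference swing-hip angle using center-of-mass
    # position and velocity errors, so the tracking controller is steered
    # back toward balance (a SIMBICON-style rule used as a stand-in here).
    q_mod = q_ref.copy()
    q_mod[swing_hip_idx] += C_D * com_offset + C_V * com_vel
    return q_mod

def tracking_torques(q, dq, q_ref, dq_ref):
    # Simple joint-space PD servo that tracks the (modulated) reference pose.
    return KP * (q_ref - q) + KD * (dq_ref - dq)

# Toy usage: a 4-joint chain, with the swing hip at joint index 0.
q      = np.zeros(4)
dq     = np.zeros(4)
q_ref  = np.array([0.1, -0.2, 0.3, 0.0])   # reference pose from a mocap clip
dq_ref = np.zeros(4)

q_ref_mod = modulate_reference(q_ref, com_offset=0.05, com_vel=-0.1,
                               swing_hip_idx=0)
tau = tracking_torques(q, dq, q_ref_mod, dq_ref)
print(tau)

Because the modulation is applied to the reference pose rather than added directly to the output torques, the same simple tracking servo can follow any streamed reference motion while the balance adjustment varies smoothly from frame to frame.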
Paper
Yoonsang Lee, Sungeun Kim, Jehee Lee, Data-Driven Biped Control, ACM Transactions on Graphics (SIGGRAPH 2010), Vol. 29, No. 4, Article 129, July 2010
Video
Full video: mov (60.2MB)
Spinning example:
Presentation
SIGGRAPH 2010 talk slides: pptx (2.2MB, without video) / zip (132MB, with video)
Data
Reference motion capture data: zip (0.7MB)