Intuitive Generation of Realistic Motions for Articulated Human Characters

Date

2013-01-15

Abstract

A long-standing goal in computer graphics is to create and control realistic motion for virtual human characters. Despite the progress made over the last decade, it remains challenging to design a system that allows an untrained user to intuitively create and control lifelike human motion. This dissertation explores theory, algorithms, and applications that enable novice users to quickly and easily create and control natural-looking motions, including both full-body movement and hand articulation, for human characters.

More specifically, the goals of this research are: (1) to investigate generative statistical models and physics-based dynamic models that precisely predict how humans move, and (2) to demonstrate the utility of these motion models in a wide range of applications, including motion analysis, synthesis, editing, and acquisition.

We have developed two novel generative statistical models from prerecorded motion data and demonstrated their applications in real-time motion editing, online motion control, offline animation design, and motion data processing. In addition, we have explored how to model subtle contact phenomena in dexterous hand grasping and manipulation using physics-based dynamic models. We show, for the first time, how to capture physically realistic hand manipulation data from ambiguous image data recorded by video cameras.
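As a rough illustration of what a generative statistical model learned from prerecorded motion data can look like (this is a generic linear-Gaussian/PCA sketch, not the specific models developed in this dissertation), the example below fits a low-dimensional pose model to a set of motion frames and samples a new pose from it. The data, skeleton dimensionality, and latent size are placeholder assumptions.

```python
import numpy as np

# Toy stand-in for prerecorded motion data: 500 frames of a 60-DOF skeleton,
# one flattened joint-angle vector per frame. Real data would come from mocap.
rng = np.random.default_rng(0)
frames = rng.normal(size=(500, 60))

# Fit a simple linear-Gaussian (PCA) model to the observed poses.
mean = frames.mean(axis=0)
centered = frames - mean
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
k = 8                                   # assumed latent dimension
basis = Vt[:k]                          # principal pose directions
latent_std = S[:k] / np.sqrt(len(frames))

# Generative use: draw a latent code from the Gaussian prior and decode it
# into a full-body pose that stays close to the space of observed poses.
z = rng.normal(size=k) * latent_std
new_pose = mean + z @ basis
print(new_pose.shape)                   # (60,) -- one synthesized pose vector
```

In practice, such a model would be conditioned on user constraints (e.g., end-effector targets or sparse keyframes) so that editing and online control amount to finding the most likely pose or motion under the learned prior.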
