Real-time Facial Animation from Live Video Tracking

Date
2011
Publisher
The Eurographics Association
Abstract
This paper describes the complete pipeline of a practical system for producing real-time facial expressions of a 3D virtual avatar controlled by an actor's live performance. The system handles the practical challenges of markerless expression capture from a single conventional video camera. For robust tracking, a localized algorithm constrained by belief propagation is applied to the upper face, and an appearance-matching technique using a parameterized generic face model is exploited for lower-face and head-pose tracking. The captured expression features are then transferred to high-dimensional 3D animation controls using our facial expression space, a structure-preserving map between two algebraic structures. The transferred animation controls drive the facial animation of a 3D avatar while optimizing the smoothness of the face mesh. An example-based face deformation technique produces non-linear local detail deformations on the avatar that are not captured by the movement of the animation controls.
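As a rough illustration only (not the paper's actual formulation), the expression-transfer step described above can be read as a linear, and therefore structure-preserving, map from tracked expression features to high-dimensional animation controls, fitted from example pairs. All names, dimensions, and the least-squares fitting choice below are hypothetical:

```python
import numpy as np

def fit_expression_map(features, controls):
    """Fit a linear map M with controls ~= features @ M by least squares.

    features: (n_examples, d_feat) tracked expression features
    controls: (n_examples, d_ctrl) corresponding rig animation controls
    """
    M, *_ = np.linalg.lstsq(features, controls, rcond=None)
    return M

def transfer(M, f):
    """Map a tracked feature vector to animation controls."""
    return f @ M

# Synthetic example pairs (purely illustrative dimensions).
rng = np.random.default_rng(0)
true_M = rng.normal(size=(6, 40))   # 6 tracked features -> 40 controls
F = rng.normal(size=(30, 6))        # 30 example performances
C = F @ true_M                      # corresponding control poses
M = fit_expression_map(F, C)

# Linearity is what makes the map structure-preserving:
# transfer of a sum equals the sum of transfers.
f1, f2 = rng.normal(size=6), rng.normal(size=6)
assert np.allclose(transfer(M, f1 + f2), transfer(M, f1) + transfer(M, f2))
```

The actual system maps features through a learned facial expression space rather than a single global linear map; this sketch only shows why a structure-preserving (homomorphic) map lets combined expressions transfer consistently.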
@inproceedings{10.2312:SCA/SCA11/215-224,
  booktitle = {Eurographics/ ACM SIGGRAPH Symposium on Computer Animation},
  editor    = {A. Bargteil and M. van de Panne},
  title     = {{Real-time Facial Animation from Live Video Tracking}},
  author    = {Rhee, Taehyun and Hwang, Youngkyoo and Kim, James Dokyoon and Kim, Changyeong},
  year      = {2011},
  publisher = {The Eurographics Association},
  ISSN      = {1727-5288},
  ISBN      = {978-1-4503-0923-3},
  DOI       = {10.2312/SCA/SCA11/215-224}
}