Perceptually Guided Expressive Facial Animation

Date: 2008
Publisher: The Eurographics Association
Abstract
Most current facial animation approaches focus on the accuracy or efficiency of their algorithms, or on how to optimally utilize pre-collected facial motion data. However, human perception, the ultimate measure of the visual fidelity of synthetic facial animations, has not been effectively exploited in these approaches. In this paper, we present a novel perceptually guided computational framework for expressive facial animation that bridges objective facial motion patterns with subjective perceptual outcomes. First, we construct a facial perceptual metric (FacePEM) using a hybrid of region-based facial motion analysis and statistical learning techniques. The constructed FacePEM model automatically measures the emotional expressiveness of a facial motion sequence. We then show how FacePEM can be effectively incorporated into various facial animation algorithms; for clear demonstration, we choose data-driven expressive speech animation generation and expressive facial motion editing as two concrete application examples. Through a comparative user study, we show that, compared with traditional facial animation algorithms, the introduced perceptually guided algorithms significantly increase the emotional expressiveness and perceptual believability of synthesized facial animations.
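To make the core idea concrete: a perceptual metric that scores the expressiveness of a motion sequence can guide synthesis by ranking candidate animations. The sketch below is purely illustrative and is not the authors' FacePEM model; the region names, the hand-set weights, and the linear scorer are hypothetical stand-ins for the paper's region-based motion analysis and statistically learned metric.

```python
# Illustrative sketch only. FacePEM in the paper combines region-based
# facial motion analysis with statistical learning; here the regions,
# weights, and linear scoring rule are hypothetical placeholders.
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical face regions for region-based motion analysis.
REGIONS = ["brow", "eyes", "mouth"]


@dataclass
class PerceptualMetricSketch:
    # Per-region weights. Fixed by hand here; the paper instead learns
    # its metric from subjective perceptual ratings.
    weights: Dict[str, float]

    def motion_energy(self, frames: List[Dict[str, float]], region: str) -> float:
        """Mean absolute frame-to-frame displacement for one region."""
        if len(frames) < 2:
            return 0.0
        diffs = [abs(b[region] - a[region]) for a, b in zip(frames, frames[1:])]
        return sum(diffs) / len(diffs)

    def expressiveness(self, frames: List[Dict[str, float]]) -> float:
        """Weighted sum of per-region motion energies -> scalar score."""
        return sum(self.weights[r] * self.motion_energy(frames, r) for r in REGIONS)


def pick_most_expressive(metric, candidates):
    """Perceptually guided selection: keep the highest-scoring candidate."""
    return max(candidates, key=metric.expressiveness)


metric = PerceptualMetricSketch(weights={"brow": 0.3, "eyes": 0.2, "mouth": 0.5})

# Two toy candidate sequences: a nearly static face vs. an animated one.
flat = [{"brow": 0.0, "eyes": 0.0, "mouth": 0.0}] * 3
lively = [
    {"brow": 0.0, "eyes": 0.0, "mouth": 0.0},
    {"brow": 0.2, "eyes": 0.1, "mouth": 0.5},
    {"brow": 0.0, "eyes": 0.0, "mouth": 0.0},
]

best = pick_most_expressive(metric, [flat, lively])
```

Under this toy scoring rule, the animated sequence scores higher and would be the one a metric-guided generator keeps, which is the selection pattern the paper applies with its learned FacePEM model.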
Citation
@inproceedings{10.2312/SCA/SCA08/067-076,
  booktitle = {Eurographics/SIGGRAPH Symposium on Computer Animation},
  editor    = {Markus Gross and Doug James},
  title     = {{Perceptually Guided Expressive Facial Animation}},
  author    = {Deng, Zhigang and Ma, Xiaohan},
  year      = {2008},
  publisher = {The Eurographics Association},
  ISSN      = {1727-5288},
  ISBN      = {978-3-905674-10-1},
  DOI       = {10.2312/SCA/SCA08/067-076}
}