Face/Off: Live Facial Puppetry

dc.contributor.author: Weise, Thibaut
dc.contributor.author: Li, Hao
dc.contributor.author: Gool, Luc Van
dc.contributor.author: Pauly, Mark
dc.contributor.editor: Eitan Grinspun and Jessica Hodgins
dc.date.accessioned: 2016-02-18T11:50:47Z
dc.date.available: 2016-02-18T11:50:47Z
dc.date.issued: 2009
dc.description.abstract: We present a complete integrated system for live facial puppetry that enables high-resolution real-time facial expression tracking with transfer to another person's face. The system utilizes a real-time structured light scanner that provides dense 3D data and texture. A generic template mesh, fitted to a rigid reconstruction of the actor's face, is tracked offline in a training stage through a set of expression sequences. These sequences are used to build a person-specific linear face model that is subsequently used for online face tracking and expression transfer. Even with just a single rigid pose of the target face, convincing real-time facial animations are achievable. The actor becomes a puppeteer with complete and accurate control over a digital face.
dc.description.sectionheaders: Leveraging Motion Capture Data
dc.description.seriesinformation: Eurographics / ACM SIGGRAPH Symposium on Computer Animation
dc.identifier.doi: 10.1145/1599470.1599472
dc.identifier.isbn: 978-1-60558-610-6
dc.identifier.issn: 1727-5288
dc.identifier.pages: 7-16
dc.identifier.uri: https://doi.org/10.1145/1599470.1599472
dc.publisher: ACM SIGGRAPH / Eurographics Association
dc.title: Face/Off: Live Facial Puppetry
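The abstract describes building a person-specific linear face model whose coefficients, fitted during online tracking of the actor, are applied to the target's model for expression transfer. A minimal illustrative sketch of that idea, assuming each face is represented as a mean mesh plus a linear combination of expression basis vectors (the function names, dimensions, and data here are hypothetical, not the paper's actual pipeline):

```python
import numpy as np

def fit_coefficients(basis, mean, observed):
    """Least-squares fit of expression coefficients to an observed mesh."""
    return np.linalg.lstsq(basis, observed - mean, rcond=None)[0]

def transfer(coeffs, target_basis, target_mean):
    """Apply the actor's coefficients to the target's person-specific model."""
    return target_mean + target_basis @ coeffs

# Toy example: 6 vertex coordinates (2 vertices in 3D), 2 expression modes.
rng = np.random.default_rng(0)
actor_mean = rng.normal(size=6)
actor_basis = rng.normal(size=(6, 2))
target_mean = rng.normal(size=6)
target_basis = rng.normal(size=(6, 2))

true_coeffs = np.array([0.7, -0.3])
observed = actor_mean + actor_basis @ true_coeffs  # simulated scan of the actor

coeffs = fit_coefficients(actor_basis, actor_mean, observed)
animated_target = transfer(coeffs, target_basis, target_mean)
```

Because both models share the same expression coefficients, an expression tracked on the actor can animate the target even when only a single rigid pose of the target face is available, as the abstract notes.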