Title: Face/Off: Live Facial Puppetry
Authors: Thibaut Weise, Hao Li, Luc Van Gool, Mark Pauly
Editors: Eitan Grinspun and Jessica Hodgins
Year: 2009
Date deposited: 2016-02-18
ISBN: 978-1-60558-610-6
ISSN: 1727-5288
Pages: 7-16
DOI: https://doi.org/10.1145/1599470.1599472

Abstract: We present a complete integrated system for live facial puppetry that enables high-resolution real-time facial expression tracking with transfer to another person's face. The system utilizes a real-time structured light scanner that provides dense 3D data and texture. A generic template mesh, fitted to a rigid reconstruction of the actor's face, is tracked offline in a training stage through a set of expression sequences. These sequences are used to build a person-specific linear face model that is subsequently used for online face tracking and expression transfer. Even with just a single rigid pose of the target face, convincing real-time facial animations are achievable. The actor becomes a puppeteer with complete and accurate control over a digital face.
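The pipeline the abstract describes (fit a person-specific linear face model offline, then recover expression weights online and apply them to a target face) can be illustrated with a minimal numerical sketch. All names and dimensions below are hypothetical toy choices, not the paper's actual implementation; real tracking would also solve for rigid pose and use dense scanner data.

```python
import numpy as np

# Hypothetical toy dimensions: V mesh vertices, K expression basis vectors.
V, K = 100, 4
rng = np.random.default_rng(0)

# Person-specific linear face model: face = neutral + B @ w,
# where each column of B holds per-vertex expression displacements
# learned from the actor's training sequences.
actor_neutral = rng.normal(size=3 * V)
B = rng.normal(size=(3 * V, K))

# Simulated observation of the actor's current expression.
w_true = np.array([0.7, 0.0, 0.3, 0.1])
observed = actor_neutral + B @ w_true

# Online "tracking": recover expression weights by linear least squares.
w_est, *_ = np.linalg.lstsq(B, observed - actor_neutral, rcond=None)

# Expression transfer: apply the weighted displacement basis to a
# target face known only in a single rigid (neutral) pose.
target_neutral = rng.normal(size=3 * V)
target_face = target_neutral + B @ w_est

print(np.allclose(w_est, w_true))
```

The key property this sketch shows is that once the model is linear in the expression weights, online tracking reduces to a small least-squares solve, which is what makes real-time performance feasible.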