Show simple item record

dc.contributor.author	Kettern, Markus	en_US
dc.contributor.author	Hilsmann, Anna	en_US
dc.contributor.author	Eisert, Peter	en_US
dc.contributor.editor	David Bommes and Tobias Ritschel and Thomas Schultz	en_US
dc.description.abstract	In this paper, we present a method for detailed, temporally consistent facial performance capture that supports any number of arbitrarily placed video cameras. Using a suitable 3D model as reference geometry, our method tracks facial movement and deformation as well as photometric changes due to illumination and shadows. In an analysis-by-synthesis framework, we warp a single reference image per camera to all frames of the sequence, thereby drastically reducing temporal drift, which is a serious problem for many state-of-the-art approaches. Temporal appearance variations are handled by a photometric estimation component that models local intensity changes between the reference image and each individual frame. All parameters of the problem are estimated jointly, so that we do not require separate estimation steps that might interfere with one another.	en_US
dc.publisher	The Eurographics Association	en_US
dc.title	Temporally Consistent Wide Baseline Facial Performance Capture via Image Warping	en_US
dc.description.seriesinformation	Vision, Modeling & Visualization	en_US
dc.description.sectionheaders	Images and Video	en_US
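The abstract's core idea, jointly estimating a geometric warp of a reference image and a photometric correction against each frame in an analysis-by-synthesis loop, can be illustrated with a deliberately tiny sketch. This is not the paper's method: it assumes a pure integer translation as the warp, a single global gain/bias as the photometric model, and a brute-force search; the function name `fit_frame` and all parameters are illustrative only.

```python
import numpy as np

def fit_frame(ref, frame, max_shift=4):
    """Toy analysis-by-synthesis: jointly pick the integer shift (warp) and
    the gain/bias (photometric model) that best explain `frame` from `ref`.
    NOTE: a drastic simplification of the paper's dense warp + local model."""
    best = None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # candidate warp of the reference image (wraps at borders)
            warped = np.roll(ref, (dy, dx), axis=(0, 1))
            w, t = warped.ravel(), frame.ravel()
            var = w.var()
            # closed-form least-squares gain/bias: frame ~= a * warped + b
            a = ((w - w.mean()) * (t - t.mean())).mean() / var if var > 0 else 1.0
            b = t.mean() - a * w.mean()
            err = float(((a * warped + b - frame) ** 2).sum())
            if best is None or err < best[0]:
                best = (err, dx, dy, a, b)
    return best[1:]  # (dx, dy, gain, bias)

# synthetic check: a Gaussian blob shifted by (2, 1) with gain 1.2 and bias 0.1
ys, xs = np.mgrid[0:32, 0:32]
ref = np.exp(-((xs - 16.0) ** 2 + (ys - 16.0) ** 2) / 40.0)
frame = 1.2 * np.roll(ref, (1, 2), axis=(0, 1)) + 0.1
dx, dy, gain, bias = fit_frame(ref, frame)
print(dx, dy, round(gain, 3), round(bias, 3))  # → 2 1 1.2 0.1
```

Because the warp and the photometric parameters are scored together inside one residual, the search cannot settle on a shift that only looks good under the wrong gain, which mirrors the abstract's point about joint estimation avoiding interference between separate steps.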

This item appears in the following Collection(s)

  • VMV15
    ISBN 978-3-905674-95-8
