Title: Reconstruction of Animatable Personalized 3D Faces by Adaptation-based Modeling
Authors: Zhang, Yu; Sim, Terence; Tan, Chew Lim
Year: 2003
Deposited: 2015-11-12
ISSN: 1017-4656
DOI: https://doi.org/10.2312/egs.20031051

Abstract: We present an efficient method for the construction of an animatable 3D facial model of a specific person with minimal user interaction. The method is based on adapting an anatomy-based prototype facial model, suitable for physically-based facial animation, to the geometry of a real person's face recovered from laser-scanned range data. Starting with the specification of a set of anthropometric landmarks on the 2D images, we automatically recover the 3D positions of the landmark points on the facial surface. A global shape adaptation is then carried out to align the prototype model to the target geometry, using transformation parameters estimated from measurements between the recovered 3D landmark points. A local shape adaptation follows, deforming the prototype model to fit all of its vertices to the scanned surface data. The reconstructed 3D face portrays the geometry and color of the individual face and can be animated immediately with the given muscle parameters.
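The global shape adaptation described in the abstract aligns the prototype to the target using transformation parameters estimated from corresponding 3D landmarks. The paper does not spell out the estimator here, but a standard choice for this kind of landmark-based alignment is a least-squares similarity transform (scale, rotation, translation) in the style of Umeyama/Kabsch; the sketch below is illustrative, not the authors' exact procedure, and all names in it are assumptions.

```python
import numpy as np

def estimate_similarity_transform(src, dst):
    """Illustrative sketch (not the paper's exact method): least-squares
    similarity transform mapping landmark set src onto dst, both (N, 3).
    Returns scale s, rotation R (3x3), translation t such that
    dst_i ~= s * R @ src_i + t."""
    mu_src = src.mean(axis=0)
    mu_dst = dst.mean(axis=0)
    src_c = src - mu_src          # centered source landmarks
    dst_c = dst - mu_dst          # centered target landmarks

    # Cross-covariance; its SVD yields the optimal rotation (Kabsch).
    H = src_c.T @ dst_c
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

    # Scale from singular values over the source variance (Umeyama).
    s = (S * np.array([1.0, 1.0, d])).sum() / (src_c ** 2).sum()
    t = mu_dst - s * (R @ mu_src)
    return s, R, t
```

With landmarks recovered on both the prototype and the scanned surface, `s`, `R`, `t` would roughly align the whole prototype before the local per-vertex fitting step refines the shape.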