Facial Expression Transferring with a Deformable Model

Authors: Xiang, Guofu; Ju, Xiangyang; Holt, Patrik O'B.; Shang, Lin
Editors: Wen Tang and John Collomosse
Year: 2009
ISBN: 978-3-905673-71-5
DOI: https://doi.org/10.2312/LocalChapterEvents/TPCG/TPCG09/117-124
Deposited: 2014-01-31

Abstract: This paper presents an automated approach to transferring facial expressions from a generic facial model onto various individual facial models, without requiring any prior correspondences or manual intervention during the transfer process. The approach automatically detects corresponding feature landmarks between models and establishes dense correspondences by means of an elastic energy-based deformable modelling approach. The deformed model obtained through this process maintains the same topology as the generic model and the same shape as the individual one. After establishing the dense correspondences, we first transfer the facial expressions onto the deformed model using a deformation transfer technique, and then obtain the final expression models for the individual models by interpolating the expression displacements on the deformed model. The results show that our approach produces convincing results for landmark detection, correspondence establishment and expression transfer.

Categories and Subject Descriptors (according to ACM CCS): I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling - Geometry Algorithms; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - Animation
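
The final step described in the abstract, applying expression displacements to a model that shares the generic topology, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes meshes are given as NumPy vertex arrays with identical vertex ordering, and the function name `apply_expression_displacements` is hypothetical.

```python
import numpy as np

def apply_expression_displacements(generic_neutral, generic_expression, deformed_model):
    """Sketch of displacement-based expression transfer.

    generic_neutral     : (V, 3) vertices of the generic model at rest
    generic_expression  : (V, 3) vertices of the generic model under an expression
    deformed_model      : (V, 3) vertices of the deformed model, which shares the
                          generic topology but has the individual's shape
    """
    # Per-vertex displacement field induced by the expression on the generic model.
    displacements = generic_expression - generic_neutral
    # Because the deformed model has the same topology (one-to-one vertex
    # correspondence), the displacements can be applied vertex-to-vertex.
    return deformed_model + displacements
```

In practice the paper interpolates these displacements onto the individual model rather than adding them directly, but the vertex-wise correspondence established by the deformation step is what makes either operation well defined.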