Title: Expression Synthesis and Transfer in Parameter Spaces
Authors: Shin, Hyun Joon; Lee, Yunjin
Date issued: 2009
Date accessioned: 2015-02-23
Date available: 2015-02-23
ISSN: 1467-8659
DOI: https://doi.org/10.1111/j.1467-8659.2009.01560.x
Pages: 1829-1835

Abstract: In this paper, we introduce a novel framework that allows users to synthesize the expression of a 3D character by providing an intuitive set of parametric controls. Assuming that human face movements are formed by a set of basis actuations, we analyze a set of real expressions to extract this basis together with the skin deformation caused by facial actuation. To do this, we first decompose the movement of each marker into a set of distinctive movements. An independent component analysis (ICA) technique is then adopted to find an independent set of actuations. Our simple and efficient skin deformation model is learned to reproduce the realistic movements of facial parts due to the actuations. In this framework, users can animate character faces by controlling the amount of actuation or by directly manipulating the face geometry. In addition, the proposed method can be applied to expression transfer, which reproduces one character's expression on another character's face. Experimental results demonstrate that our method can produce realistic expressions efficiently.
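The ICA step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it uses scikit-learn's FastICA as a stand-in for their independent component analysis, and the "marker" data here is synthetic, with two made-up basis actuations mixed into six marker coordinates.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic stand-in for motion-capture data:
# rows = frames, columns = stacked marker coordinates.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 8.0, 400)

# Two hypothetical basis actuations (e.g., a brow raise and a jaw open).
s1 = np.sin(2.0 * t)
s2 = np.sign(np.sin(3.0 * t))
S = np.c_[s1, s2]                      # (400 frames, 2 actuations)

# Mix the actuations into 6 observed marker coordinates.
A = rng.normal(size=(2, 6))
X = S @ A                              # (400 frames, 6 coordinates)

# Recover an independent set of actuations from the observed movements.
ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(X)         # per-frame actuation amounts
mixing = ica.mixing_                   # per-marker deformation directions

print(sources.shape)                   # (400, 2)
print(mixing.shape)                    # (6, 2)
```

In the paper's framework, the recovered per-frame source signals would play the role of the parametric controls the user edits, while the mixing matrix corresponds to how each actuation moves the markers; the learned skin deformation model then maps those actuations to full face geometry.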