Motion Retargeting to Preserve Spatial Relationship between Skinned Characters

Authors: Taeil Jin, Meekyung Kim, and Sung-Hee Lee
Editors: Bernhard Thomaszewski, KangKang Yin, and Rahul Narain
Date: 2017-12-31
ISBN: 978-1-4503-5091-4
DOI: https://doi.org/10.1145/3099564.3106647
URL: https://diglib.eg.org:443/handle/10.1145/3099564-3106647

Abstract: Applying motion capture data for multi-person interaction to virtual characters is challenging because one needs to preserve interaction semantics in addition to satisfying the general requirements for motion retargeting, such as preventing penetration and preserving naturalness. An efficient method for representing the scene semantics of interaction motions is to define the spatial relationships between the body parts of characters. However, existing methods of this kind consider only the character skeleton, and thus may require post-processing to refine the interaction motions and remove artifacts at the level of the skin mesh. This paper proposes a novel method for retargeting interaction motions with respect to character skins. To this end, we introduce the aura mesh, a mesh surrounding a character's skin, in order to represent skin-level spatial relationships between body parts. Using the aura mesh, we can retarget interaction motions while preserving skin-level spatial relationships and reducing skin inter-penetrations.

Keywords: Computing methodologies → Animation; motion retargeting; spatial relationship; close interaction
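The aura-mesh construction itself is specific to the paper, but the underlying idea of skin-level spatial relationships can be illustrated with a minimal sketch: for each vertex of one character's skin mesh, record the offset to the nearest vertex on the other character's mesh; these offsets are the quantities a retargeting would try to preserve, and their sign against the surface normal gives a crude inter-penetration test. All function names, the vertex-set representation, and the sign test below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def skin_level_relationships(verts_a, verts_b):
    """For each vertex of mesh A, find the nearest vertex of mesh B.

    Returns the nearest indices and the offset vectors A -> B.
    (Illustrative stand-in for skin-level spatial descriptors;
    the paper's aura mesh is a richer representation.)
    """
    # Pairwise squared distances via broadcasting: (Na, Nb).
    d2 = ((verts_a[:, None, :] - verts_b[None, :, :]) ** 2).sum(axis=2)
    nearest = d2.argmin(axis=1)
    offsets = verts_b[nearest] - verts_a
    return nearest, offsets

def interpenetration_count(offsets, normals_a):
    """Crude penetration test: count vertices of A whose nearest point
    on B lies behind A's outward surface normal (negative dot product)."""
    return int(((offsets * normals_a).sum(axis=1) < 0.0).sum())
```

A retargeting step in this spirit would penalize deviation of the post-retargeting offsets from the captured ones while driving `interpenetration_count` toward zero; in practice one would query the actual mesh surface (e.g. with a spatial acceleration structure) rather than vertices only.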