Motion Retargetting based on Dilated Convolutions and Skeleton-specific Loss Functions

Authors: Kim, SangBin; Park, Inbum; Kwon, Seongsu; Han, JungHyun
Editors: Panozzo, Daniele; Assarsson, Ulf
Journal: Computer Graphics Forum (ISSN 1467-8659)
Date: 2020 (available 2020-05-24)
DOI: 10.1111/cgf.13947 (https://doi.org/10.1111/cgf.13947)
URI: https://diglib.eg.org:443/handle/10.1111/cgf13947
Pages: 497-507
License: Attribution 4.0 International License
Subjects: Computing methodologies; Neural networks

Abstract: Motion retargetting refers to the process of adapting the motion of a source character to a target character. This paper presents a motion retargetting model based on temporal dilated convolutions. Trained in an unsupervised manner, the model generates realistic motions for various humanoid characters. The retargetted motions not only preserve the high-frequency detail of the input motions but also produce natural and stable trajectories despite skeleton size differences between the source and target. Extensive experiments are conducted on a 3D character motion dataset and a motion capture dataset. Both qualitative and quantitative comparisons against prior methods demonstrate the effectiveness and robustness of our method.
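The abstract's core building block, a temporal dilated convolution, can be sketched as follows. This is a minimal NumPy illustration of the general technique, not the authors' architecture; the function name, causal formulation, and tensor shapes are assumptions made for the example:

```python
import numpy as np

def dilated_conv1d(x, w, dilation=1):
    """Causal temporal dilated 1-D convolution (illustrative sketch).

    x: (T, C) motion feature sequence -- T frames, C channels per frame.
    w: (K, C) kernel taps for a single output channel.
    Returns y of shape (T,): each output frame mixes K past frames spaced
    `dilation` apart, so the receptive field spans (K - 1) * dilation + 1
    frames without adding parameters.
    """
    T, C = x.shape
    K = w.shape[0]
    y = np.zeros(T)
    for t in range(T):
        for k in range(K):
            tk = t - k * dilation  # look back k * dilation frames
            if tk >= 0:            # zero-pad before the sequence start
                y[t] += np.dot(x[tk], w[k])
    return y
```

Stacking such layers with dilations 1, 2, 4, ... makes the temporal receptive field grow exponentially with depth, which is what lets dilated-convolution models capture both high-frequency pose detail and long-range trajectory structure in a motion sequence.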