Learning Detail Transfer based on Geometric Features
Authors: Berkiten, Sema; Halber, Maciej; Solomon, Justin; Ma, Chongyang; Li, Hao; Rusinkiewicz, Szymon
Editors: Loic Barthe and Bedrich Benes
Date issued: 2017-04-22
Year: 2017
ISSN: 1467-8659
DOI: 10.1111/cgf.13132 (https://doi.org/10.1111/cgf.13132)
Handle: https://diglib.eg.org:443/handle/10.1111/cgf13132
Pages: 361-373

Abstract: The visual richness of computer graphics applications is frequently limited by the difficulty of obtaining high-quality, detailed 3D models. This paper proposes a method for realistically transferring details (specifically, displacement maps) from existing high-quality 3D models to simple shapes that may be created with easy-to-learn modeling tools. Our key insight is to use metric learning to find a combination of geometric features that successfully predicts detail-map similarities on the source mesh; we use the learned feature combination to drive the detail transfer. The latter uses a variant of multi-resolution non-parametric texture synthesis, augmented by a high-frequency detail transfer step in texture space. We demonstrate that our technique can successfully transfer details among a variety of shapes, including furniture and clothing.
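The abstract's key insight is to learn a combination of geometric features whose weighted differences predict detail-map dissimilarity. The toy sketch below illustrates that idea only in spirit, not the paper's actual method: it fits nonnegative per-feature weights by least squares on synthetic data, so that a weighted feature-difference metric approximates a given dissimilarity. All names and the fitting procedure here are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch (not the paper's implementation): learn per-feature
# weights so a weighted combination of geometric-feature differences
# predicts detail-map dissimilarity between sampled surface-point pairs.

rng = np.random.default_rng(0)

n_pairs, n_features = 200, 4                    # point pairs sampled on a source mesh
feat_diff = rng.random((n_pairs, n_features))   # |f_i - f_j| for each geometric feature

true_w = np.array([2.0, 0.5, 0.0, 1.0])         # ground-truth combination (toy data)
detail_dissim = feat_diff @ true_w + 0.01 * rng.standard_normal(n_pairs)

# Least-squares fit of the feature combination; clamp to keep weights nonnegative.
w, *_ = np.linalg.lstsq(feat_diff, detail_dissim, rcond=None)
w = np.clip(w, 0.0, None)

# The learned weights define a metric that could then guide which source
# regions to copy from during non-parametric detail synthesis on the target.
pred = feat_diff @ w
```

On this well-conditioned toy data the recovered weights closely match `true_w`; in the paper's setting the learned metric drives the multi-resolution texture-synthesis step rather than a direct regression.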