Kang, Hongyuan; Dong, Xiao; Guo, Xufei; Cao, Juan; Chen, Zhonggui
Editors: Chaine, Raphaëlle; Deng, Zhigang; Kim, Min H.
Published: 2023-10-09
ISBN: 978-3-03868-234-9
DOI: https://doi.org/10.2312/pg.20231272
URI: https://diglib.eg.org:443/handle/10.2312/pg20231272

A Style Transfer Network of Local Geometry for 3D Mesh Stylization

Abstract: Style transfer for images has developed rapidly; however, only a few studies focus on geometric style transfer for 3D models. In this paper, we propose a style learning network that synthesizes local geometric textures on a source mesh, driven by the features of a specific mesh or image. Our network modifies the source mesh by predicting a displacement for each vertex along its normal direction to generate geometric details. To constrain the style of the source mesh to be consistent with a given style mesh, we define a style loss on 2D projected images of the two meshes obtained with a differentiable renderer. We extract a set of global and local features from multiple views of the 3D models via a pre-trained VGG network, and these features drive the deformation of the source mesh through the style loss. The network is flexible in style learning, as it can extract features from both meshes and images to guide the geometric deformation. Experiments verify the robustness of the proposed network and demonstrate its strong performance in transferring multiple styles to the source mesh. We also conduct experiments to analyze the effectiveness of the network design.

License: Attribution 4.0 International License (CC BY 4.0)
CCS Concepts: Computing methodologies -> Shape analysis
Pages: 65-72 (8 pages)
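The abstract describes two core operations: displacing each vertex along its normal by a network-predicted scalar, and a style loss computed on VGG features of multi-view renders produced by a differentiable renderer. Below is a minimal PyTorch sketch of those two pieces. The function names, the choice of VGG-16 layers, and the Gram-matrix formulation of the style loss are illustrative assumptions, not details taken from the paper, and the differentiable renderer itself is omitted.

```python
# Hypothetical sketch, not the authors' implementation. Shows (1) vertex
# displacement along normals and (2) a Gram-matrix style loss on VGG
# features of rendered views; layer choices are assumptions.
import torch
import torch.nn.functional as F
from torchvision.models import vgg16, VGG16_Weights

def displace_along_normals(verts, normals, offsets):
    """Move each vertex along its unit normal by a predicted scalar.

    verts:   (V, 3) vertex positions
    normals: (V, 3) per-vertex normals
    offsets: (V, 1) scalar displacements predicted by the network
    """
    n = F.normalize(normals, dim=-1)
    return verts + offsets * n

# Frozen pre-trained VGG-16 feature extractor; the indexed layers
# correspond to relu1_2, relu2_2, relu3_3, relu4_3 (an assumed choice).
_vgg = vgg16(weights=VGG16_Weights.IMAGENET1K_V1).features.eval()
for p in _vgg.parameters():
    p.requires_grad_(False)
_STYLE_LAYERS = {3, 8, 15, 22}

def _gram(feat):
    # Normalized Gram matrix of a (B, C, H, W) feature map.
    b, c, h, w = feat.shape
    f = feat.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def gram_style_loss(rendered, style_rendered):
    """Style loss between multi-view renders of source and style meshes.

    Both inputs are (B, 3, H, W) images from a differentiable renderer
    (not shown), so gradients flow back to the source-mesh vertices.
    """
    loss = 0.0
    x, y = rendered, style_rendered
    for i, layer in enumerate(_vgg):
        x, y = layer(x), layer(y)
        if i in _STYLE_LAYERS:
            loss = loss + F.mse_loss(_gram(x), _gram(y))
    return loss
```

Because the loss is defined on rendered images rather than on the meshes directly, any renderer that is differentiable with respect to vertex positions lets the VGG-based style gradients propagate back to the per-vertex offsets.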