Title: Learning-Based Animation of Clothing for Virtual Try-On
Authors: Santesteban, Igor; Otaduy, Miguel A.; Casas, Dan
Editors: Alliez, Pierre; Pellacini, Fabio
Date: 2019-05-05
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.13643
Handle: https://diglib.eg.org:443/handle/10.1111/cgf13643
Pages: 355-366
CCS Concepts: Computing methodologies → Physical simulation; Neural networks

Abstract: This paper presents a learning-based clothing animation method for highly efficient virtual try-on simulation. Given a garment, we preprocess a rich database of physically based dressed-character simulations, for multiple body shapes and animations. Using this database, we then train a learning-based model of cloth drape and wrinkles as a function of body shape and dynamics. We propose a model that separates global garment fit, due to body shape, from local garment wrinkles, due to both pose dynamics and body shape. We use a recurrent neural network to regress garment wrinkles, and we achieve highly plausible nonlinear effects, in contrast to the blending artifacts suffered by previous methods. At runtime, dynamic virtual try-on animations are produced in just a few milliseconds for garments with thousands of triangles. We present a qualitative and quantitative analysis of the results.
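
The abstract describes a two-level architecture: a static regressor for global garment fit driven by body shape, plus a recurrent regressor for dynamic wrinkles driven by both shape and pose. The sketch below illustrates that decomposition under stated assumptions; it is not the paper's actual network. All dimensions, layer sizes, and names (NUM_VERTS, SHAPE_DIM, POSE_DIM, GlobalFit, WrinkleRNN) are illustrative placeholders, and the skinning of the deformed garment to the posed body, which the full method would include, is omitted.

```python
# Minimal sketch (PyTorch assumed) of the global-fit + wrinkle decomposition
# described in the abstract. Hyperparameters and I/O dimensions are
# assumptions, not values from the paper.
import torch
import torch.nn as nn

NUM_VERTS = 4000   # garment vertex count (assumption)
SHAPE_DIM = 10     # body shape parameters, e.g. blend-shape coefficients (assumption)
POSE_DIM = 72      # pose parameters per frame (assumption)

class GlobalFit(nn.Module):
    """Regress per-vertex displacements for static garment fit from body shape."""
    def __init__(self):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(SHAPE_DIM, 256), nn.ReLU(),
            nn.Linear(256, NUM_VERTS * 3),
        )

    def forward(self, beta):                 # beta: (B, SHAPE_DIM)
        return self.mlp(beta).view(-1, NUM_VERTS, 3)

class WrinkleRNN(nn.Module):
    """Regress per-vertex wrinkle displacements from shape + pose sequence."""
    def __init__(self, hidden=512):
        super().__init__()
        self.gru = nn.GRU(SHAPE_DIM + POSE_DIM, hidden, batch_first=True)
        self.out = nn.Linear(hidden, NUM_VERTS * 3)

    def forward(self, beta, poses):          # poses: (B, T, POSE_DIM)
        T = poses.shape[1]
        # Condition every time step on the (constant) body shape.
        x = torch.cat([beta.unsqueeze(1).expand(-1, T, -1), poses], dim=-1)
        h, _ = self.gru(x)                   # recurrence captures dynamics
        return self.out(h).view(-1, T, NUM_VERTS, 3)

# Deformed garment (before skinning) = template + global fit + wrinkles.
template = torch.zeros(NUM_VERTS, 3)         # rest-pose garment (placeholder)
beta = torch.randn(1, SHAPE_DIM)
poses = torch.randn(1, 30, POSE_DIM)         # a 30-frame motion clip
fit = GlobalFit()(beta)                      # (1, NUM_VERTS, 3), shape-dependent
wrinkles = WrinkleRNN()(beta, poses)         # (1, 30, NUM_VERTS, 3), dynamic
garment = template + fit.unsqueeze(1) + wrinkles
print(garment.shape)                         # torch.Size([1, 30, 4000, 3])
```

Because the fit branch has no time dimension and the wrinkle branch is a single recurrent pass, per-frame inference reduces to one GRU step plus two linear maps, which is consistent with the abstract's claim of millisecond-scale runtime for garments with thousands of triangles.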