Learning-Based Animation of Clothing for Virtual Try-On

dc.contributor.author: Santesteban, Igor
dc.contributor.author: Otaduy, Miguel A.
dc.contributor.author: Casas, Dan
dc.contributor.editor: Alliez, Pierre and Pellacini, Fabio
dc.date.accessioned: 2019-05-05T17:41:25Z
dc.date.available: 2019-05-05T17:41:25Z
dc.date.issued: 2019
dc.description.abstract: This paper presents a learning-based clothing animation method for highly efficient virtual try-on simulation. Given a garment, we preprocess a rich database of physically-based dressed character simulations, for multiple body shapes and animations. Then, using this database, we train a learning-based model of cloth drape and wrinkles, as a function of body shape and dynamics. We propose a model that separates global garment fit, due to body shape, from local garment wrinkles, due to both pose dynamics and body shape. We use a recurrent neural network to regress garment wrinkles, and we achieve highly plausible nonlinear effects, in contrast to the blending artifacts suffered by previous methods. At runtime, dynamic virtual try-on animations are produced in just a few milliseconds for garments with thousands of triangles. We show qualitative and quantitative analysis of results.
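The abstract describes a two-branch design: a static global-fit term driven by body shape, plus a recurrent term that regresses dynamic wrinkles from pose and shape, with the two summed into the final garment deformation. The following is a minimal NumPy sketch of that idea only; every name, layer choice, and dimension here (`animate_garment`, `GRUCell`, the linear global-fit map, feature sizes) is an illustrative assumption, not the paper's actual architecture or trained model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell in NumPy (illustrative stand-in for the paper's RNN)."""
    def __init__(self, in_dim, hid_dim, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        self.Wz = rng.normal(0, s, (hid_dim, in_dim + hid_dim))  # update gate
        self.Wr = rng.normal(0, s, (hid_dim, in_dim + hid_dim))  # reset gate
        self.Wh = rng.normal(0, s, (hid_dim, in_dim + hid_dim))  # candidate state

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)
        r = sigmoid(self.Wr @ xh)
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))
        return (1.0 - z) * h + z * h_tilde

def animate_garment(shape_beta, pose_seq, n_verts=4, hid=8, seed=0):
    """Sketch of the two-branch split described in the abstract:
    global fit from body shape + recurrent wrinkles from pose dynamics
    (and shape), summed into per-frame, per-vertex offsets.
    Weights are random here; in the real method they would be learned
    from the simulated database."""
    rng = np.random.default_rng(seed)

    # Global-fit branch: linear map from shape parameters (stand-in for a
    # learned regressor); produces one static offset per garment vertex.
    W_fit = rng.normal(0, 0.1, (n_verts * 3, shape_beta.size))
    fit = (W_fit @ shape_beta).reshape(n_verts, 3)

    # Wrinkle branch: GRU over concatenated pose + shape features,
    # decoded to per-vertex wrinkle displacements each frame.
    gru = GRUCell(pose_seq.shape[1] + shape_beta.size, hid, seed)
    W_out = rng.normal(0, 0.1, (n_verts * 3, hid))
    h = np.zeros(hid)
    frames = []
    for pose in pose_seq:
        h = gru.step(np.concatenate([pose, shape_beta]), h)
        wrinkles = (W_out @ h).reshape(n_verts, 3)
        frames.append(fit + wrinkles)
    return np.stack(frames)  # (n_frames, n_verts, 3)
```

Because the recurrent state `h` carries information across frames, the wrinkle term can respond to motion history rather than blending static poses, which is the nonlinearity the abstract contrasts with earlier blending-based approaches.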
dc.description.number: 2
dc.description.sectionheaders: Learning to Animate
dc.description.seriesinformation: Computer Graphics Forum
dc.description.volume: 38
dc.identifier.doi: 10.1111/cgf.13643
dc.identifier.issn: 1467-8659
dc.identifier.pages: 355-366
dc.identifier.uri: https://doi.org/10.1111/cgf.13643
dc.identifier.uri: https://diglib.eg.org:443/handle/10.1111/cgf13643
dc.publisher: The Eurographics Association and John Wiley & Sons Ltd.
dc.subject: Computing methodologies
dc.subject: Physical simulation
dc.subject: Neural networks
dc.title: Learning-Based Animation of Clothing for Virtual Try-On