A Pixel-Based Framework for Data-Driven Clothing

dc.contributor.author: Jin, Ning
dc.contributor.author: Zhu, Yilin
dc.contributor.author: Geng, Zhenglin
dc.contributor.author: Fedkiw, Ron
dc.contributor.editor: Bender, Jan and Popa, Tiberiu
dc.date.accessioned: 2020-10-16T06:25:34Z
dc.date.available: 2020-10-16T06:25:34Z
dc.date.issued: 2020
dc.description.abstract: We propose a novel approach to learning cloth deformation as a function of body pose, recasting the graph-like triangle mesh data structure into image-based data in order to leverage popular and well-developed convolutional neural networks (CNNs) in a two-dimensional Euclidean domain. Then, a three-dimensional animation of clothing is equivalent to a sequence of two-dimensional RGB images driven/choreographed by time-dependent joint angles. In order to reduce nonlinearity demands on the neural network, we utilize procedural skinning of the body surface to capture much of the rotation/deformation so that the RGB images only contain textures of displacement offsets from skin to clothing. Notably, we illustrate that our approach does not require accurate unclothed body shapes or robust skinning techniques. Additionally, we discuss how standard image-based techniques such as image partitioning for higher resolution can readily be incorporated into our framework.
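The abstract's core representation, cloth stored as an RGB image of per-pixel displacement offsets from the skinned body surface, can be sketched as follows. This is a minimal illustration only, not the paper's method: the image size, joint count, and the linear map standing in for the CNN are all hypothetical placeholders.

```python
import numpy as np

# Hypothetical sizes (not from the paper): a coarse 8x8 "cloth image"
# and a 20-angle pose vector.
H, W = 8, 8
n_joints = 20

rng = np.random.default_rng(0)

# Toy stand-in for the learned network: joint angles -> HxWx3 image of
# skin-to-cloth displacement offsets. The paper trains a CNN for this
# mapping; a random linear layer is used here purely for illustration.
def predict_offset_image(joint_angles, weights):
    return (joint_angles @ weights).reshape(H, W, 3)

# Procedurally skinned body-surface positions sampled at each pixel
# (random placeholders; skinning captures most rotation/deformation).
skin_positions = rng.normal(size=(H, W, 3))

joint_angles = rng.normal(size=n_joints)
weights = rng.normal(size=(n_joints, H * W * 3)) * 0.01

# Cloth = skinned surface + predicted per-pixel offset, so the network
# only has to model the smaller residual deformation.
offsets = predict_offset_image(joint_angles, weights)
cloth_positions = skin_positions + offsets
print(cloth_positions.shape)  # (8, 8, 3)
```

Because the offsets live on a regular pixel grid, standard image techniques (e.g. partitioning the image into tiles for higher resolution, as the abstract notes) apply directly.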
dc.description.number: 8
dc.description.sectionheaders: Data-Driven Cloth
dc.description.seriesinformation: Computer Graphics Forum
dc.description.volume: 39
dc.identifier.doi: 10.1111/cgf.14108
dc.identifier.issn: 1467-8659
dc.identifier.pages: 135-144
dc.identifier.uri: https://doi.org/10.1111/cgf.14108
dc.identifier.uri: https://diglib.eg.org:443/handle/10.1111/cgf14108
dc.publisher: The Eurographics Association and John Wiley & Sons Ltd.
dc.subject: Computing methodologies
dc.subject: Animation
dc.subject: Neural networks
dc.subject: Computer vision representations
dc.title: A Pixel-Based Framework for Data-Driven Clothing