Hair Interpolation for Portrait Morphing

dc.contributor.author: Weng, Yanlin
dc.contributor.author: Wang, Lvdi
dc.contributor.author: Li, Xiao
dc.contributor.author: Chai, Menglei
dc.contributor.author: Zhou, Kun
dc.contributor.editor: B. Levy, X. Tong, and K. Yin
dc.date.accessioned: 2015-02-28T16:10:12Z
dc.date.available: 2015-02-28T16:10:12Z
dc.date.issued: 2013
dc.description.abstract: In this paper we study the problem of hair interpolation: given two 3D hair models, we want to generate a sequence of intermediate hair models that transforms one input into the other both smoothly and in an aesthetically pleasing way. We propose an automatic method that efficiently computes a many-to-many strand correspondence between two or more given hair models, taking into account the multi-scale clustering structure of hair. Experiments demonstrate that hair interpolation can be used to produce more vivid portrait morphing effects and to enable a novel example-based hair styling methodology, in which a user can interactively create new hairstyles by continuously exploring a ''style space'' spanning multiple input hair models.
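To illustrate the kind of operation the abstract describes, the sketch below shows a generic correspondence-driven strand blend: given two hair models as sets of polyline strands and a precomputed list of matched strand pairs, each pair is resampled to a common vertex count and linearly blended. This is a minimal assumption-laden illustration, not the paper's algorithm; the multi-scale clustering and the method for computing the many-to-many correspondence are omitted, and names such as resample_strand and interpolate_hair are hypothetical.

    # Minimal sketch (not the paper's method): blend matched strands.
    import numpy as np

    def resample_strand(strand: np.ndarray, n: int) -> np.ndarray:
        """Resample an (m, 3) polyline to n vertices by arc length."""
        seg = np.linalg.norm(np.diff(strand, axis=0), axis=1)
        t = np.concatenate(([0.0], np.cumsum(seg)))
        t /= max(t[-1], 1e-8)  # normalize arc length to [0, 1]
        ts = np.linspace(0.0, 1.0, n)
        return np.stack([np.interp(ts, t, strand[:, k]) for k in range(3)], axis=1)

    def interpolate_hair(strands_a, strands_b, matches, alpha, n=32):
        """Blend matched strands; `matches` lists (index_a, index_b) pairs,
        alpha in [0, 1] selects the intermediate hair model."""
        out = []
        for ia, ib in matches:
            sa = resample_strand(np.asarray(strands_a[ia], dtype=float), n)
            sb = resample_strand(np.asarray(strands_b[ib], dtype=float), n)
            out.append((1.0 - alpha) * sa + alpha * sb)
        return out

Sweeping alpha from 0 to 1 (or blending more than two models with weights) yields the intermediate hair models used for morphing or style-space exploration; in practice the quality of the result hinges on the strand correspondence, which is the paper's contribution.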
dc.description.seriesinformation: Computer Graphics Forum
dc.identifier.issn: 1467-8659
dc.identifier.uri: https://doi.org/10.1111/cgf.12214
dc.publisher: The Eurographics Association and Blackwell Publishing Ltd.
dc.title: Hair Interpolation for Portrait Morphing