Show simple item record

dc.contributor.author        Weng, Yanlin        en_US
dc.contributor.author        Wang, Lvdi          en_US
dc.contributor.author        Li, Xiao            en_US
dc.contributor.author        Chai, Menglei       en_US
dc.contributor.author        Zhou, Kun           en_US
dc.contributor.editor        B. Levy, X. Tong, and K. Yin        en_US
dc.description.abstract      In this paper we study the problem of hair interpolation: given two 3D hair models, we want to generate a sequence of intermediate hair models that transform from one input to the other smoothly and in an aesthetically pleasing way. We propose an automatic method that efficiently calculates a many-to-many strand correspondence between two or more given hair models, taking into account the multi-scale clustering structure of hair. Experiments demonstrate that hair interpolation can be used to produce more vivid portrait morphing effects and to enable a novel example-based hair styling methodology, where a user can interactively create new hairstyles by continuously exploring a "style space" spanning multiple input hair models.        en_US
dc.publisher                 The Eurographics Association and Blackwell Publishing Ltd.        en_US
dc.title                     Hair Interpolation for Portrait Morphing        en_US
dc.description.seriesinformation        Computer Graphics Forum        en_US
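The abstract's core idea, interpolating between corresponding hair strands, can be illustrated with a minimal toy sketch. The code below assumes each strand is a polyline of 3D points and that a one-to-one strand correspondence is already given; it simply blends corresponding vertex positions linearly. This is NOT the paper's method (which computes a many-to-many correspondence guided by hair's multi-scale clustering structure); it only sketches the final blending step.

```python
# Toy sketch of the strand-blending step of hair interpolation.
# Assumptions (not from the paper): a hair model is a list of strands,
# each strand a list of (x, y, z) tuples, and corresponding strands
# have equal vertex counts.

def lerp_strand(strand_a, strand_b, t):
    """Linearly blend two corresponding strands at time t in [0, 1]."""
    return [tuple((1.0 - t) * pa + t * pb for pa, pb in zip(p, q))
            for p, q in zip(strand_a, strand_b)]

def interpolate_hair(hair_a, hair_b, t):
    """Blend two hair models given as lists of corresponding strands."""
    return [lerp_strand(sa, sb, t) for sa, sb in zip(hair_a, hair_b)]

# Usage: halfway between a strand rooted at x=0 and one rooted at x=2.
hair_a = [[(0.0, 0.0, 0.0), (0.0, 1.0, 0.0)]]
hair_b = [[(2.0, 0.0, 0.0), (2.0, 1.0, 0.0)]]
mid = interpolate_hair(hair_a, hair_b, 0.5)
# mid == [[(1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]]
```

A real implementation would first resample strands to a common vertex count and solve the correspondence problem; exploring the "style space" mentioned in the abstract amounts to varying `t` (or a vector of weights over several input models) interactively.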

This item appears in the following Collection(s)

  • 32-Issue 7
    Pacific Graphics 2013 - Special Issue
