Texture Synthesis From Photographs

dc.contributor.author: Eisenacher, C.
dc.contributor.author: Lefebvre, S.
dc.contributor.author: Stamminger, M.
dc.date.accessioned: 2015-02-21T16:19:00Z
dc.date.available: 2015-02-21T16:19:00Z
dc.date.issued: 2008
dc.description.abstract: The goal of texture synthesis is to generate an arbitrarily large high-quality texture from a small input sample. Generally, it is assumed that the input image is given as a flat, square piece of texture, so it has to be carefully prepared from a picture taken under ideal conditions. Instead, we would like to extract the input texture from any surface within an arbitrary photograph. This introduces several challenges: only parts of the photograph are covered with the texture of interest, perspective and scene geometry introduce distortions, and the texture is non-uniformly sampled during the capture process. This breaks many of the assumptions used for synthesis. In this paper we combine a simple, novel user interface with a generic per-pixel synthesis algorithm to achieve high-quality synthesis from a photograph. Our interface lets the user locally describe the geometry supporting the textures by combining rational Bézier patches, which are particularly well suited to describe curved surfaces under projection. Further, we extend per-pixel synthesis to account for arbitrary texture sparsity and distortion, both in the input image and in the synthesis output. Applications range from synthesizing textures directly from photographs to high-quality texture completion.
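The abstract's key modeling primitive is the rational Bézier patch, which the user combines to describe curved, perspectively distorted surfaces in the photograph. The sketch below is only a rough illustration of that primitive (not the authors' implementation): it evaluates a rational Bézier patch in image space using the standard weighted Bernstein formulation; the control points and weights are made-up example values.

```python
# Minimal sketch, assuming a standard rational Bezier patch formulation;
# not the paper's code. Control points P and weights w are hypothetical.
from math import comb
import numpy as np

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1.0 - t)**(n - i)

def rational_bezier_patch(P, w, u, v):
    """Evaluate a rational Bezier patch at (u, v) in [0,1]^2.

    P : (n+1, m+1, 2) array of 2D control points (image-space positions)
    w : (n+1, m+1)    array of positive weights (encode perspective)
    """
    n, m = P.shape[0] - 1, P.shape[1] - 1
    num = np.zeros(2)
    den = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            b = bernstein(n, i, u) * bernstein(m, j, v) * w[i, j]
            num += b * P[i, j]
            den += b
    return num / den  # divide by the weighted basis sum (perspective divide)

# Hypothetical bi-quadratic patch: 3x3 control points in image space.
P = np.array([[[0, 0], [50, -5], [100, 0]],
              [[-5, 50], [50, 50], [105, 50]],
              [[0, 100], [50, 105], [100, 100]]], dtype=float)
w = np.array([[1.0, 0.9, 1.0],
              [0.9, 0.8, 0.9],
              [1.0, 0.9, 1.0]])
print(rational_bezier_patch(P, w, 0.5, 0.5))
```

Because the weights enter both the numerator and the denominator, such patches can exactly represent projectively distorted quadrics, which is why they suit surfaces seen under perspective.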
dc.description.number: 2
dc.description.seriesinformation: Computer Graphics Forum
dc.description.volume: 27
dc.identifier.doi: 10.1111/j.1467-8659.2008.01139.x
dc.identifier.issn: 1467-8659
dc.identifier.pages: 419-428
dc.identifier.uri: https://doi.org/10.1111/j.1467-8659.2008.01139.x
dc.publisher: The Eurographics Association and Blackwell Publishing Ltd
dc.title: Texture Synthesis From Photographs