Repetition Maximization based Texture Rectification

dc.contributor.author: Aiger, Dror
dc.contributor.author: Cohen-Or, Daniel
dc.contributor.author: Mitra, Niloy J.
dc.contributor.editor: P. Cignoni and T. Ertl
dc.date.accessioned: 2015-02-28T06:54:04Z
dc.date.available: 2015-02-28T06:54:04Z
dc.date.issued: 2012
dc.description.abstract: Many photographs are taken in perspective. Techniques for rectifying the resulting perspective distortions typically rely on the existence of parallel lines in the scene. In scenarios where such parallel lines are hard to extract automatically or annotate manually, the unwarping process remains a challenge. In this paper, we introduce an automatic algorithm for rectifying images containing textures of repeated elements lying on an unknown plane. We unwarp the input by maximizing image self-similarity over the space of homography transformations. We map a set of detected regional descriptors to surfaces in a transformation space, compute the intersection points among triplets of such surfaces, and then use consensus among the projected intersection points to extract the correcting transform. Our algorithm is global, robust, and does not require explicit or accurate detection of similar elements. We evaluate our method on a variety of challenging textures and images. The rectified outputs are directly useful for various tasks including texture synthesis and image completion.
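The core idea in the abstract, that the correct rectifying homography is the one under which a repeated texture becomes maximally self-similar, can be illustrated with a minimal numpy sketch. Everything here is an illustrative assumption, not the authors' method: `warp_image`, `repetition_score`, the synthetic stripe texture, and the sample homography `H` are made up for the demo, and a simple translation-based similarity score stands in for the paper's regional descriptors and transform-space voting.

```python
import numpy as np

def warp_image(img, H, out_shape):
    """Inverse-warp img by the 3x3 homography H (nearest-neighbor sampling).
    Hypothetical helper for this sketch, not the paper's implementation."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = np.linalg.inv(H) @ pts          # map each output pixel back to its source
    src = src[:2] / src[2]                # perspective divide
    sx = np.clip(np.round(src[0]).astype(int), 0, img.shape[1] - 1)
    sy = np.clip(np.round(src[1]).astype(int), 0, img.shape[0] - 1)
    return img[sy, sx].reshape(h, w)

def repetition_score(img, period):
    """Self-similarity under a horizontal shift of one period (0 = perfectly repetitive)."""
    a, b = img[:, :-period], img[:, period:]
    return -np.mean((a - b) ** 2)

# Synthetic repeated texture: vertical stripes with period 8.
base = np.tile(np.arange(64) % 8, (64, 1)).astype(float)

# An assumed projective distortion, standing in for the unknown camera foreshortening.
H = np.array([[1.0,   0.0, 0.0],
              [0.0,   1.0, 0.0],
              [0.002, 0.0, 1.0]])
distorted = warp_image(base, H, base.shape)

# Applying the correct inverse homography restores the repetition, so it scores
# higher than leaving the image distorted; a search over homography parameters
# would pick it out as the maximizer.
rectified = warp_image(distorted, np.linalg.inv(H), base.shape)
print(repetition_score(distorted, 8), repetition_score(rectified, 8))
```

In the paper's setting the inverse homography is of course unknown; the sketch only shows why self-similarity is a usable objective for recovering it.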
dc.description.seriesinformation: Computer Graphics Forum
dc.description.volume: 31
dc.identifier.doi: 10.1111/j.1467-8659.2012.03023.x
dc.identifier.issn: 1467-8659
dc.identifier.uri: https://doi.org/10.1111/j.1467-8659.2012.03023.x
dc.publisher: The Eurographics Association and John Wiley and Sons Ltd.
dc.title: Repetition Maximization based Texture Rectification