Show simple item record

dc.contributor.author: Petit, Antoine
dc.contributor.author: Haouchine, Nazim
dc.contributor.author: Roy, Frederick
dc.contributor.author: Goldman, Daniel B.
dc.contributor.author: Cotin, Stephane
dc.contributor.editor: Vidal, Franck P. and Tam, Gary K. L. and Roberts, Jonathan C.
dc.date.accessioned: 2019-09-11T05:08:58Z
dc.date.available: 2019-09-11T05:08:58Z
dc.date.issued: 2019
dc.identifier.isbn: 978-3-03868-096-3
dc.identifier.uri: https://doi.org/10.2312/cgvc.20191255
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/cgvc20191255
dc.description.abstract: We present Deformed Reality, a new way of interacting with an augmented reality environment by manipulating 3D objects in an intuitive and physically consistent manner. Using the core principle of augmented reality to estimate rigid pose over time, our method makes it possible for the user to deform the targeted object while it is being rendered with its natural texture, giving the sense of interactive scene editing. Our framework follows a computationally efficient pipeline that uses a proxy CAD model for pose computation, physically-based manipulation, and scene appearance estimation. The final composition is built upon a continuous image completion and re-texturing process to preserve visual consistency. The presented results show that our method can open new ways of using augmented reality by not only augmenting the environment but also interacting with objects intuitively.
dc.publisher: The Eurographics Association
dc.title: Deformed Reality
dc.description.seriesinformation: Computer Graphics and Visual Computing (CGVC)
dc.description.sectionheaders: Virtual Reality
dc.identifier.doi: 10.2312/cgvc.20191255
dc.identifier.pages: 27-34

