Title: Planar Abstraction and Inverse Rendering of 3D Indoor Environment
Authors: Kim, Young Min; Ryu, Sangwoo; Kim, Ig-Jae
Editors: Cignoni, Paolo; Miguel, Eder
Date: 2019-05-05
ISSN: 1017-4656
DOI: 10.2312/egs.20191020 (https://doi.org/10.2312/egs.20191020)
URL: https://diglib.eg.org:443/handle/10.2312/egs20191020
Pages: 81-84

Abstract: A large-scale scanned 3D environment suffers from complex occlusions and misalignment errors. The reconstruction contains holes in the geometry and ghosting in the texture; these artifacts are easily noticed, and the result cannot be used as visually compelling VR content without further processing. On the other hand, the well-known Manhattan World priors successfully recreate relatively simple or clean structures. In this paper, we push the limits of planar representation in indoor environments. We use planes not only to represent the environment geometrically but also to solve an inverse rendering problem that accounts for texture and light. The complex process of shape inference and intrinsic imaging is greatly simplified with the help of detected planes, yet it produces a realistic 3D indoor environment. The resulting content can effectively represent spatial arrangements for various AR/VR applications and can readily be combined with virtual objects possessing plausible lighting and texture.

Keywords: Computing methodologies; Texturing; Mixed / augmented reality; Reflectance modeling