Title: Global Texture Mapping for Dynamic Objects
Authors: Kim, Jungeon; Kim, Hyomin; Park, Jaesik; Lee, Seungyong
Editors: Lee, Jehee; Theobalt, Christian; Wetzstein, Gordon
Date: 2019-10-14
Year: 2019
ISSN: 1467-8659
DOI: 10.1111/cgf.13872 (https://doi.org/10.1111/cgf.13872)
URI: https://diglib.eg.org:443/handle/10.1111/cgf13872
Pages: 697-705
Subjects: Computing methodologies; Shape modeling

Abstract:
We propose a novel framework for generating a global texture atlas for a deforming geometry. Our approach differs from prior art in two aspects. First, instead of generating a texture map for each timestamp to color a dynamic scene, our framework reconstructs a global texture atlas that can be consistently mapped to a deforming object. Second, our approach is based on a single RGB-D camera, without the need for a multi-camera setup surrounding the scene. In our framework, the input is a 3D template model with an RGB-D image sequence, and geometric warping fields are found using a state-of-the-art non-rigid registration method [GXW*15] to align the template mesh to the noisy and incomplete input depth images. With these warping fields, our multi-scale approach to texture coordinate optimization generates a sharp and clear texture atlas that is consistent with multiple color observations over time. Our approach is accelerated by graphics hardware and provides a handy configuration for capturing dynamic geometry along with a clean texture atlas. We demonstrate our approach in practical scenarios, particularly human performance capture. We also show that our approach is resilient to misalignment caused by imperfect estimation of the warping fields and inaccurate camera parameters.
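As a rough illustration of the texture-fusion step described in the abstract, the Python/NumPy sketch below projects the warped template vertices into each color frame and averages the sampled colors into a shared atlas. This is a minimal sketch under assumed inputs (a per-frame warp callable, fixed per-vertex texture coordinates, known intrinsics K); the names (accumulate_atlas, warps, uvs) are hypothetical, and the simple per-texel averaging stands in for, but does not reproduce, the paper's multi-scale texture coordinate optimization.

    # Minimal sketch: fusing color observations from several RGB-D frames
    # into one global texture atlas. All names and the plain weighted-average
    # scheme are illustrative assumptions, not the paper's actual method.
    import numpy as np

    def accumulate_atlas(frames, warps, verts, uvs, K, atlas_res=1024):
        """frames: list of HxWx3 color images; warps[t]: callable deforming
        the template vertices (N,3) into the camera space of frame t;
        uvs: per-vertex texture coordinates in [0,1]^2; K: 3x3 intrinsics."""
        atlas = np.zeros((atlas_res, atlas_res, 3))
        weight = np.zeros((atlas_res, atlas_res, 1))
        for color, warp in zip(frames, warps):
            pts = warp(verts)                       # deform template into frame t
            z = pts[:, 2]
            uvh = pts @ K.T                         # pinhole projection
            pix = uvh[:, :2] / np.maximum(z[:, None], 1e-6)
            h, w = color.shape[:2]
            # keep vertices in front of the camera that project inside the image
            ok = (z > 1e-6) & (pix[:, 0] >= 0) & (pix[:, 0] < w) \
                            & (pix[:, 1] >= 0) & (pix[:, 1] < h)
            px = pix[ok].astype(int)
            tex = (uvs[ok] * (atlas_res - 1)).astype(int)
            # np.add.at accumulates correctly when several vertices hit one texel
            np.add.at(atlas, (tex[:, 1], tex[:, 0]), color[px[:, 1], px[:, 0]])
            np.add.at(weight, (tex[:, 1], tex[:, 0]), 1.0)
        return atlas / np.maximum(weight, 1e-8)     # per-texel average color

Per-vertex sampling with naive averaging blurs the atlas wherever the warping fields or camera parameters are slightly off; the paper's contribution is precisely to counter such misalignment by optimizing the texture coordinates across observations instead.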