Interactive Appearance Editing in RGB-D Images

Authors: Bergmann, Stephan; Ritschel, Tobias; Dachsbacher, Carsten
Editors: Jan Bender, Arjan Kuijper, Tatiana von Landesberger, Holger Theisel, Philipp Urban
Date: 2014-12-16
Year: 2014
ISBN: 978-3-905674-74-3
DOI: https://doi.org/10.2312/vmv.20141269

Abstract: The availability of increasingly powerful and affordable image and depth sensors, together with the necessary processing power, creates novel possibilities for more sophisticated and powerful image editing tools. Along these lines, we present a method to alter the appearance of objects in RGB-D images by re-shading their surfaces with arbitrary BRDF models and with subsurface scattering using the dipole diffusion approximation. To evaluate the incident light for re-shading, we combine ray marching, using the depth buffer as approximate geometry, with environment lighting. The environment map is built solely from information contained in the RGB-D input image, exploiting both the reflections on glossy surfaces and the geometric information. Our CPU/GPU implementation provides interactive feedback to facilitate intuitive editing. We compare and demonstrate our method on rendered images and digital photographs.

Keywords: I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism: Raytracing / Color, shading, shadowing, and texture
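
The abstract names the dipole diffusion approximation for subsurface scattering. As a point of reference, the sketch below evaluates the classical dipole diffuse reflectance R_d(r) of Jensen et al. 2001 for a single color channel; the function name `dipoleRd` and its parameterization are illustrative assumptions and do not reproduce the paper's actual implementation.

```cpp
// Minimal sketch of the classical dipole diffusion approximation
// (Jensen et al. 2001). Names and structure are assumptions for
// illustration, not the authors' CPU/GPU implementation.
#include <cmath>

// Diffuse reflectance R_d(r) for one color channel.
//   r             : distance on the surface between incidence and exitance points
//   sigma_s_prime : reduced scattering coefficient
//   sigma_a       : absorption coefficient
//   eta           : relative index of refraction
double dipoleRd(double r, double sigma_s_prime, double sigma_a, double eta)
{
    const double kPi = 3.14159265358979323846;

    const double sigma_t_prime = sigma_s_prime + sigma_a;          // reduced extinction
    const double alpha_prime   = sigma_s_prime / sigma_t_prime;    // reduced albedo
    const double sigma_tr      = std::sqrt(3.0 * sigma_a * sigma_t_prime); // effective transport coefficient

    // Diffuse Fresnel reflectance (polynomial fit) and internal reflection term A.
    const double F_dr = -1.440 / (eta * eta) + 0.710 / eta + 0.668 + 0.0636 * eta;
    const double A    = (1.0 + F_dr) / (1.0 - F_dr);

    // Depths of the real and virtual (mirrored) point sources.
    const double z_r = 1.0 / sigma_t_prime;
    const double z_v = z_r * (1.0 + 4.0 / 3.0 * A);

    // Distances from the exitance point to the two sources.
    const double d_r = std::sqrt(r * r + z_r * z_r);
    const double d_v = std::sqrt(r * r + z_v * z_v);

    return alpha_prime / (4.0 * kPi) *
           (z_r * (1.0 + sigma_tr * d_r) * std::exp(-sigma_tr * d_r) / (d_r * d_r * d_r) +
            z_v * (1.0 + sigma_tr * d_v) * std::exp(-sigma_tr * d_v) / (d_v * d_v * d_v));
}
```

In a re-shading setting, such a reflectance profile would be integrated over nearby surface samples (weighted by the incident irradiance) to obtain the outgoing radiance at each pixel; how the paper discretizes and accelerates this step is not specified in the abstract.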