Advanced Methods for Relightable Scene Representations in Image Space
The realistic reproduction of the visual appearance of real-world objects requires accurate computer graphics models that describe the optical interaction of a scene with its surroundings. Data-driven approaches that model the scene globally as a reflectance field, a function of eight parameters, deliver high quality and work for most material combinations, but are costly to acquire and store. Image-space relighting, which constrains the application to creating photographs with a virtual, fixed camera under freely chosen illumination, requires only a 4D data structure to provide full fidelity.

This thesis contributes to image-space relighting in four ways: (1) We investigate the acquisition of 4D reflectance fields in the context of sampling, propose a practical setup for pre-filtering reflectance data during recording, and apply it in an adaptive sampling scheme. (2) We introduce a feature-driven image synthesis algorithm for the interpolation of coarsely sampled reflectance data in software to achieve highly realistic images. (3) We propose an implicit reflectance data representation that uses a Bayesian approach to relight complex scenes from the example of much simpler reference objects. (4) Finally, we construct novel, passive devices out of optical components that render reflectance field data in real time, shaping the incident illumination into the desired image.
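After discretization, the 4D reflectance field described above can be viewed as a light-transport matrix that maps incident illumination to the image seen by the fixed camera; relighting is then a linear combination of basis photographs, one per incident light direction. The following is a minimal sketch of this idea, with made-up dimensions and random data standing in for measured reflectance values; it is not the thesis' implementation.

```python
import numpy as np

# Toy discretized reflectance field: for each of P pixels, the response to
# each of L incident lighting directions, flattened into a P x L transport
# matrix T. Sizes and values here are illustrative placeholders.
rng = np.random.default_rng(0)
P, L = 6, 4                    # hypothetical: 6 pixels, 4 light directions
T = rng.random((P, L))         # one "basis photograph" per column

# Relighting under new illumination is a matrix-vector product:
# each pixel is a weighted sum of its responses to the basis lights.
light = np.array([0.5, 0.0, 1.0, 0.25])  # chosen illumination weights
image = T @ light                        # relit image, one value per pixel

# Light transport is linear: relighting the sum of two lighting
# conditions equals the sum of the two relit images.
l1 = np.array([1.0, 0.0, 0.0, 0.0])
l2 = np.array([0.0, 1.0, 0.0, 0.0])
assert np.allclose(T @ (l1 + l2), T @ l1 + T @ l2)
```

The linearity checked at the end is what makes both coarse sampling plus interpolation and the passive optical relighting devices possible: any target illumination is expressible in the recorded basis.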