Title: A Diffusion Approach to Radiance Field Relighting using Multi-Illumination Synthesis
Authors: Poirier-Ginter, Yohan; Gauthier, Alban; Philip, Julien; Lalonde, Jean-François; Drettakis, George
Editors: Garces, Elena; Haines, Eric
Date: 2024-06-25
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.15147
Handle: https://diglib.eg.org/handle/10.1111/cgf15147
Pages: 14

Abstract: Relighting radiance fields is severely underconstrained for multi-view data, which is most often captured under a single illumination condition; the problem is especially hard for full scenes containing multiple objects. We introduce a method to create relightable radiance fields using such single-illumination data by exploiting priors extracted from 2D image diffusion models. We first fine-tune a 2D diffusion model on a multi-illumination dataset conditioned by light direction, allowing us to augment a single-illumination capture into a realistic, but possibly inconsistent, multi-illumination dataset from directly defined light directions. We use this augmented data to create a relightable radiance field represented by 3D Gaussian splats. To allow direct control of light direction for low-frequency lighting, we represent appearance with a multi-layer perceptron parameterized on light direction. To enforce multi-view consistency and overcome inaccuracies, we optimize a per-image auxiliary feature vector. We show results on synthetic and real multi-view data under single illumination, demonstrating that our method successfully exploits 2D diffusion model priors to allow realistic 3D relighting for complete scenes.

Keywords: NeRF; Radiance Field; Relighting
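The abstract describes an appearance model that is a multi-layer perceptron conditioned on light direction, with an optimized per-image auxiliary feature vector absorbing multi-view inconsistencies in the diffusion-augmented data. The following is a minimal NumPy sketch of that conditioning structure only; all dimensions, the two-layer architecture, and the function names are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): per-point appearance
# feature, 3D light direction, per-image auxiliary feature vector.
FEAT_DIM, LIGHT_DIM, AUX_DIM, HIDDEN = 16, 3, 8, 32

# Randomly initialized two-layer MLP weights; in the method these would
# be optimized jointly with the 3D Gaussian splat representation.
W1 = rng.normal(0.0, 0.1, (FEAT_DIM + LIGHT_DIM + AUX_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, 3))
b2 = np.zeros(3)

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def appearance(feat, light_dir, aux):
    """Predict an RGB value in [0, 1] for one point under a given light.

    `aux` stands in for the per-image auxiliary feature vector: during
    training one such vector per input image would be optimized to soak
    up inconsistencies across the augmented multi-illumination views.
    """
    x = np.concatenate([feat, light_dir, aux])
    return sigmoid(relu(x @ W1 + b1) @ W2 + b2)

feat = rng.normal(size=FEAT_DIM)
aux = np.zeros(AUX_DIM)            # neutral auxiliary code at render time
light = np.array([0.0, 0.0, 1.0])  # user-specified light direction
rgb = appearance(feat, light, aux)
```

Because the light direction is an explicit input, relighting at render time amounts to re-evaluating the MLP with a new `light` vector while the learned scene features stay fixed, which matches the direct low-frequency lighting control the abstract claims.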