Title: High Quality Neural Relighting using Practical Zonal Illumination
Authors: Lin, Arvin; Lin, Yiming; Li, Xiaohui; Ghosh, Abhijeet
Editors: Haines, Eric; Garces, Elena
Date: 2024-06-25
ISBN: 978-3-03868-262-2
ISSN: 1727-3463
DOI: https://doi.org/10.2312/sr.20241150
Handle: https://diglib.eg.org/handle/10.2312/sr20241150
Pages: 12
License: Attribution 4.0 International License
CCS Concepts: Computing methodologies -> Reflectance modeling; Image-based rendering; Computational photography

Abstract: We present a method for high-quality image-based relighting using a practical limited zonal illumination field. Our setup can be implemented with commodity components and requires no dedicated hardware. We employ a set of desktop monitors to illuminate a subject from a near-hemispherical zone and record One-Light-At-A-Time (OLAT) images from multiple viewpoints. We further extend the sampling of incident illumination directions beyond the frontal coverage of the monitors by repeating the OLAT captures with the subject rotated relative to the capture setup. Finally, we train our proposed skip-assisted autoencoder and latent-diffusion-based generative method to learn a high-quality continuous representation of the reflectance function, without requiring explicit alignment of the data captured from the various viewpoints. This method enables smooth lighting animation for high-frequency reflectance functions and effectively extends incident lighting beyond the illumination zone of the practical capture setup. Compared to state-of-the-art methods, our approach achieves superior image-based relighting results, capturing finer skin-pore details and extending to passive performance video relighting.
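The OLAT captures described in the abstract rely on the classical linearity of light transport: an image under any lighting environment composed of the captured directions is a weighted sum of the OLAT images. The following is a minimal sketch of that classical principle only (not the paper's neural method); the function and variable names (`relight`, `olat_images`, `light_weights`) are illustrative assumptions, not from the source.

```python
import numpy as np

def relight(olat_images, light_weights):
    """Relight a subject as a weighted sum of OLAT images.

    olat_images:   array of shape (N, H, W, 3), one image per light direction
    light_weights: array of shape (N, 3), RGB intensity of the target
                   lighting environment sampled at each OLAT direction
    """
    # Light transport is linear in the illumination, so the relit image is
    # simply the per-channel weighted sum over the N captured light directions.
    return np.einsum('nhwc,nc->hwc', olat_images, light_weights)

# Toy example: two 1x1-pixel OLAT "images", one lit red and one lit green.
olats = np.array([[[[1.0, 0.0, 0.0]]],
                  [[[0.0, 1.0, 0.0]]]])        # shape (2, 1, 1, 3)
weights = np.array([[0.5, 0.5, 0.5],
                    [0.25, 0.25, 0.25]])       # shape (2, 3)
relit = relight(olats, weights)                # shape (1, 1, 3)
```

Methods like the one above break down for lighting directions outside the captured zone, which is exactly the gap the paper's rotation-augmented capture and generative model aim to fill.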