Title: SkyGAN: Towards Realistic Cloud Imagery for Image Based Lighting
Authors: Mirbauer, Martin; Rittig, Tobias; Iser, Tomáš; Křivánek, Jaroslav; Šikudová, Elena
Editors: Ghosh, Abhijeet; Wei, Li-Yi
Date: 2022-07-01
ISBN: 978-3-03868-187-8
ISSN: 1727-3463
DOI: https://doi.org/10.2312/sr.20221151
Handle: https://diglib.eg.org:443/handle/10.2312/sr20221151
Pages: 13-22 (10 pages)
License: Attribution 4.0 International License
CCS Concepts: Computing methodologies --> Rendering; Supervised learning; Applied computing --> Earth and atmospheric sciences

Abstract: Achieving photorealism when rendering virtual scenes in movies or architecture visualizations often depends on providing realistic illumination and a realistic background. Typically, spherical environment maps serve both as a natural light source from the Sun and the sky, and as a background with clouds and a horizon. In practice, the input is either a static high-resolution HDR photograph manually captured on location in real conditions, or an analytical clear-sky model that is dynamic but cannot model clouds. Our approach bridges these two limited paradigms: a user can control the sun position and cloud coverage ratio, and generate a realistic-looking environment map for these conditions. It is a hybrid data-driven analytical model based on a modified state-of-the-art GAN architecture, which is trained on matching pairs of physically accurate clear-sky radiance and HDR fisheye photographs of clouds. We demonstrate our results on renders of outdoor scenes under varying times, dates, and cloud covers.