Authors: Mirbauer, Martin; Rittig, Tobias; Iser, Tomáš; Křivánek, Jaroslav; Šikudová, Elena
Editors: Alliez, Pierre; Wimmer, Michael
Date issued: 2024-03-23
Year: 2024
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.14990
Handle: https://diglib.eg.org/handle/10.1111/cgf14990

Abstract: Achieving photorealism when rendering virtual scenes in movies or architectural visualizations often depends on providing realistic illumination and a realistic background. Typically, spherical environment maps serve both as a natural light source from the Sun and the sky, and as a background with clouds and a horizon. In practice, the input is either a static high-resolution HDR photograph manually captured on location in real conditions, or an analytical clear-sky model that is dynamic but cannot model clouds. Our approach bridges these two limited paradigms: a user can control the sun position and cloud coverage ratio, and generate a realistic-looking environment map for these conditions. It is a hybrid data-driven analytical model based on a modified state-of-the-art GAN architecture, which is trained on matching pairs of physically accurate clear-sky radiance and HDR fisheye photographs of clouds. We demonstrate our results on renders of outdoor scenes under varying time, date, and cloud cover. Our source code and a dataset of 39 000 HDR sky images are publicly available.

License: Attribution-NonCommercial 4.0 International
Keywords: modelling; natural phenomena — rendering; image-based rendering — rendering; atmospheric effects
Title: SkyGAN: Realistic Cloud Imagery for Image-based Lighting
DOI: 10.1111/cgf.14990
Pages: 12 pages