Capturing and Rendering With Incident Light Fields

Authors: Unger, J.; Wenger, A.; Hawkins, T.; Gardner, A.; Debevec, P.
Editors: Philip Dutre; Frank Suykens; Per H. Christensen; Daniel Cohen-Or
Published: 2003
ISBN: 3-905673-03-7
ISSN: 1727-3463
DOI: https://doi.org/10.2312/EGWR/EGWR03/141-149

Abstract: This paper presents a process for capturing spatially and directionally varying illumination from a real-world scene and using this lighting to illuminate computer-generated objects. We use two devices for capturing such illumination. In the first, we photograph an array of mirrored spheres in high dynamic range to capture the spatially varying illumination. In the second, we obtain higher-resolution data by capturing images with a high dynamic range omnidirectional camera as it traverses a plane. For both methods we apply the light field technique to extrapolate the incident illumination to a volume. We render computer-generated objects as illuminated by this captured illumination using a custom shader within an existing global illumination rendering system. To demonstrate our technique we capture several spatially varying lighting environments with spotlights, shadows, and dappled lighting and use them to illuminate synthetic scenes. We also show comparisons to real objects under the same illumination.
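To illustrate the light-field extrapolation step the abstract describes, here is a minimal sketch (not the authors' code) of querying an incident light field stored as a 2D grid of HDR environment maps captured on a plane: a ray from the shading point is traced back to the capture plane and the four nearest captured maps are blended bilinearly. The names (grid, spacing, lookup_direction, incident_radiance), the z = 0 capture plane, and the latitude-longitude map layout are all illustrative assumptions.

    import numpy as np

    def lookup_direction(env_map, direction):
        """Sample an omnidirectional HDR map (H x W x 3, latitude-longitude
        layout assumed) in a given unit direction (nearest-neighbor)."""
        h, w, _ = env_map.shape
        theta = np.arccos(np.clip(direction[2], -1.0, 1.0))   # polar angle
        phi = np.arctan2(direction[1], direction[0])          # azimuth
        row = min(int(theta / np.pi * h), h - 1)
        col = min(int((phi + np.pi) / (2.0 * np.pi) * w), w - 1)
        return env_map[row, col]

    def incident_radiance(grid, spacing, point, direction):
        """Estimate the radiance arriving at `point` from unit `direction`
        by intersecting the ray with the capture plane z = 0 and bilinearly
        blending the four nearest captured environment maps. This assumes,
        as an incident light field does, that radiance is constant along
        rays within the (occluder-free) capture volume."""
        if abs(direction[2]) < 1e-8:
            return np.zeros(3)              # ray parallel to capture plane
        t = -point[2] / direction[2]        # intersect ray with z = 0
        hit = point + t * direction
        u, v = hit[0] / spacing, hit[1] / spacing
        i0, j0 = int(np.floor(u)), int(np.floor(v))
        fu, fv = u - i0, v - j0
        rows, cols = len(grid), len(grid[0])
        acc = np.zeros(3)
        for di, dj, wgt in [(0, 0, (1 - fu) * (1 - fv)),
                            (1, 0, fu * (1 - fv)),
                            (0, 1, (1 - fu) * fv),
                            (1, 1, fu * fv)]:
            i = min(max(i0 + di, 0), rows - 1)   # clamp to grid bounds
            j = min(max(j0 + dj, 0), cols - 1)
            acc += wgt * lookup_direction(grid[i][j], direction)
        return acc

For example, with grid as a nested list of H x W x 3 numpy arrays captured at spacing-metre intervals, a renderer's shader could call incident_radiance(grid, 0.1, np.array([0.2, 0.3, 0.5]), dir_vec) for each sampled incident direction dir_vec at a shading point; the paper's actual capture and rendering pipeline may differ in parameterization and interpolation details.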