Title: Screen Space Approximate Gaussian Hulls
Authors: Meder, Julian; Brüderlin, Beat
Editors: Jakob, Wenzel; Hachisuka, Toshiya
Date: 2018-07-01
ISBN: 978-3-03868-068-0
ISSN: 1727-3463
DOI: https://doi.org/10.2312/sre.20181177
URI: https://diglib.eg.org:443/handle/10.2312/sre20181177
Pages: 107-115
Keywords: Computing methodologies; Rasterization; Reflectance modeling; Virtual reality; Image processing

Abstract: The Screen Space Approximate Gaussian Hull method presented in this paper is based on an output-sensitive, adaptive approach that addresses the challenge of high-quality rendering even for high-resolution displays and large numbers of light sources or indirect lighting. Our approach dynamically and sparsely samples the light information on a low-resolution mesh approximated from screen space, and applies these samples to the full-resolution image in a deferred shading stage. Unlike common approaches that combine lower-resolution rendering with upsampling strategies, this preserves geometric detail. The light samples are expressed as spherical Gaussian distribution functions, for which we found a more precise closed-form integration than existing approaches. Our method therefore does not exhibit the quality degradation shown by previously proposed approaches, and we show that the implementation is very efficient. Moreover, being an output-sensitive approach, it can be used for massive scene rendering without additional cost.
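
Note: the abstract mentions representing light samples as spherical Gaussian distribution functions with a closed-form integration. The paper's own, more precise integration formula is not given in this record; as background only, the sketch below shows the standard spherical Gaussian lobe G(v) = a·exp(λ(μ·v − 1)) and its well-known exact integral over the unit sphere, 2πa/λ·(1 − e^(−2λ)). The function names are illustrative, not from the paper.

```python
import math

def sg_eval(dot_mu_v, lam, amplitude):
    """Evaluate a spherical Gaussian lobe G(v) = a * exp(lambda * (mu . v - 1)).

    dot_mu_v  -- cosine between lobe axis mu and direction v (in [-1, 1])
    lam       -- sharpness lambda > 0
    amplitude -- peak value a, attained at v = mu
    """
    return amplitude * math.exp(lam * (dot_mu_v - 1.0))

def sg_integral(lam, amplitude):
    """Exact integral of the lobe over the whole unit sphere.

    Integrating exp(lambda*(cos(theta) - 1)) * sin(theta) over theta in [0, pi]
    and multiplying by 2*pi for the azimuth gives the standard closed form:
        2 * pi * a / lambda * (1 - exp(-2 * lambda))
    """
    return 2.0 * math.pi * amplitude / lam * (1.0 - math.exp(-2.0 * lam))
```

For a sharp lobe (large λ) the integral approaches 2πa/λ, which is why sharper spherical Gaussians contribute less total energy at equal amplitude.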