Title: Foveated Real-Time Path Tracing in Visual-Polar Space
Authors: Koskela, Matias; Lotvonen, Atro; Mäkitalo, Markku; Kivi, Petrus; Viitanen, Timo; Jääskeläinen, Pekka
Editors: Boubekeur, Tamy; Sen, Pradeep
Date: 2019-07-14
ISBN: 978-3-03868-095-6
ISSN: 1727-3463
DOI: https://doi.org/10.2312/sr.20191219
URI: https://diglib.eg.org:443/handle/10.2312/sr20191219
Pages: 39-50
Keywords: Computing methodologies; Perception; Ray tracing; Virtual reality

Abstract: Computing power is still the limiting factor in photorealistic real-time rendering. Foveated rendering improves perceived quality by focusing the rendering effort where the user is looking. Applying foveated rendering to real-time path tracing, where we must work with a very small number of samples per pixel, introduces additional challenges: the rendering result is thoroughly noisy and sparse in the periphery. In this paper we demonstrate a foveated real-time path tracing system and propose a novel Visual-Polar space in which both real-time path tracing and denoising are done before mapping to screen space. When path tracing a regular grid of samples in Visual-Polar space, the screen-space sample distribution follows the human visual acuity model, making both the rendering and denoising 2.5x faster with similar perceived quality. In addition, when using Visual-Polar space, primary rays stay more coherent, leading to improved utilization of GPU resources and therefore making ray traversal 1.3-1.5x faster. Moreover, Visual-Polar space improves 1 sample per pixel denoising quality in the fovea. We show that Visual-Polar based path tracing enables real-time rendering for contemporary virtual reality devices even without dedicated ray tracing hardware acceleration.
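
The abstract describes path tracing a regular grid of samples in Visual-Polar space so that the resulting screen-space distribution follows a human visual acuity model. The sketch below illustrates that general idea with a simple polar-to-screen mapping; the gaze point, grid resolution, and the quadratic acuityRadius falloff are placeholder assumptions for illustration, not the acuity model or mapping derived in the paper.

```cpp
// Minimal sketch: mapping a regular grid of "Visual-Polar"-style sample
// coordinates back to screen space around the gaze point. The radial falloff
// below is a hypothetical stand-in for the paper's visual acuity model.
#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };

constexpr float kPi = 3.14159265358979f;

// Hypothetical radial distribution: u uniform in [0,1] is pushed outward
// super-linearly, so sample density is highest near the gaze point and
// decreases toward the periphery.
float acuityRadius(float u, float maxRadius) {
    const float falloffExponent = 2.0f;   // assumed placeholder, not from the paper
    return maxRadius * std::pow(u, falloffExponent);
}

// Map a regular grid coordinate (u = normalized radius axis, v = normalized
// angle axis) to a screen-space position centered on the gaze point.
Vec2 visualPolarToScreen(float u, float v, Vec2 gaze, float maxRadius) {
    float r     = acuityRadius(u, maxRadius);
    float theta = v * 2.0f * kPi;
    return { gaze.x + r * std::cos(theta), gaze.y + r * std::sin(theta) };
}

int main() {
    Vec2 gaze = { 960.0f, 540.0f };       // gaze point in a 1920x1080 frame
    // Tracing one path per cell of this regular grid yields a screen-space
    // distribution that is dense in the fovea and sparse in the periphery.
    for (int i = 0; i < 8; ++i) {
        for (int j = 0; j < 8; ++j) {
            Vec2 p = visualPolarToScreen((i + 0.5f) / 8.0f,
                                         (j + 0.5f) / 8.0f,
                                         gaze, 1100.0f);
            std::printf("(%.1f, %.1f) ", p.x, p.y);
        }
        std::printf("\n");
    }
    return 0;
}
```

Because neighboring grid cells map to nearby screen positions, primary rays generated this way tend to stay spatially coherent, which is consistent with the improved GPU utilization reported in the abstract.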