Show simple item record

dc.contributor.author: Dittebrandt, Addis [en_US]
dc.contributor.author: Hanika, Johannes [en_US]
dc.contributor.author: Dachsbacher, Carsten [en_US]
dc.contributor.editor: Dachsbacher, Carsten and Pharr, Matt [en_US]
dc.description.abstract: Good importance sampling is crucial for real-time path tracing, where only low sample budgets are possible. We present two efficient sampling techniques tailored for massively-parallel GPU path tracing which improve next event estimation (NEE) for rendering with many light sources and sampling of indirect illumination. As sampling densities need to vary spatially, we use an octree structure in world space and introduce algorithms to continuously adapt the partitioning and distribution of the sampling budget. Both sampling techniques exploit temporal coherence by reusing samples from the previous frame: for NEE we collect sampled, unoccluded light sources and show how to deduplicate this information, but also how to diffuse it, in order to efficiently sample light sources in the subsequent frame. For sampling indirect illumination, we present a compressed directional quadtree structure which is iteratively adapted towards high-energy directions using samples from the previous frame. The updates and rebuilding of all data structures take about 1 ms in our test scenes, and add about 6 ms at 1080p to the path tracing time compared to using state-of-the-art light hierarchies and BRDF sampling. We show that this additional effort reduces noise in terms of mean squared error by at least one order of magnitude in many situations. [en_US]
dc.publisher: The Eurographics Association [en_US]
dc.rights: Attribution 4.0 International License
dc.subject: Computing methodologies
dc.subject: Ray tracing
dc.title: Temporal Sample Reuse for Next Event Estimation and Path Guiding for Real-Time Path Tracing [en_US]
dc.description.seriesinformation: Eurographics Symposium on Rendering - DL-only Track
dc.description.sectionheaders: Path Guiding
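The abstract describes a directional quadtree that is iteratively adapted towards high-energy directions using samples from the previous frame. The following is a minimal illustrative sketch of one such adaptation loop, assuming a pointer-based (uncompressed) quadtree over a 2D direction domain and an invented subdivision threshold; the class, method names, and constant are hypothetical and are not the paper's actual compressed data layout.

```python
# Illustrative sketch only: a quadtree over [0,1)^2 (a mapped direction
# domain) that accumulates per-frame sample energy, then refines leaves
# holding a large energy fraction and collapses subtrees that went cold.
# SUBDIVIDE_FRACTION is an assumed threshold, not a value from the paper.
SUBDIVIDE_FRACTION = 0.25

class QuadNode:
    def __init__(self, x, y, size):
        self.x, self.y, self.size = x, y, size   # square cell in [0,1)^2
        self.energy = 0.0                        # energy deposited this frame
        self.children = None                     # None while this is a leaf

    def deposit(self, u, v, e):
        """Accumulate sample energy at direction (u, v) down to the leaf."""
        self.energy += e
        if self.children is not None:
            half = self.size / 2
            ix = 1 if u >= self.x + half else 0
            iy = 1 if v >= self.y + half else 0
            self.children[iy * 2 + ix].deposit(u, v, e)

    def adapt(self, total, max_depth=4):
        """Refine hot leaves, collapse cold subtrees, reset for next frame."""
        if total <= 0:
            return
        frac = self.energy / total
        if self.children is None:
            if frac > SUBDIVIDE_FRACTION and max_depth > 0:
                half = self.size / 2
                self.children = [QuadNode(self.x + ix * half,
                                          self.y + iy * half, half)
                                 for iy in (0, 1) for ix in (0, 1)]
        elif frac <= SUBDIVIDE_FRACTION:
            self.children = None                 # region went cold: collapse
        else:
            for c in self.children:
                c.adapt(total, max_depth - 1)
        self.energy = 0.0                        # clear for the next frame
```

After each frame, calling `adapt` with the frame's total energy refines the tree where previous-frame samples concentrated, so subsequent guided sampling can allocate more probability mass to those directions.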
