Real-Time Translucent Rendering Using GPU-based Texture Space Importance Sampling

dc.contributor.author: Chang, Chih-Wen (en_US)
dc.contributor.author: Lin, Wen-Chieh (en_US)
dc.contributor.author: Ho, Tan-Chi (en_US)
dc.contributor.author: Huang, Tsung-Shian (en_US)
dc.contributor.author: Chuang, Jung-Hong (en_US)
dc.date.accessioned: 2015-02-21T16:19:16Z
dc.date.available: 2015-02-21T16:19:16Z
dc.date.issued: 2008 (en_US)
dc.description.abstract: We present a novel approach for real-time rendering of translucent surfaces. The computation of subsurface scattering is performed by first converting the integration over the 3D model surface into an integration over a 2D texture space, and then applying importance sampling based on the irradiance stored in the texture. This conversion leads to a feasible GPU implementation and makes real-time frame rates possible. Our implementation shows that plausible images can be rendered in real time for complex translucent models with dynamic light and material properties. For objects with a more apparent local effect, our approach generally requires more samples, which may degrade the frame rate. To handle this case, we decompose the integration into two parts, one for the local effect and one for the global effect, evaluated respectively by a combination of available methods [DS03, MKB*03a] and by our texture space importance sampling. This hybrid scheme steadily renders the translucent effect in real time with a fixed number of samples. (en_US)
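The abstract's core idea — drawing sample positions in texture space proportionally to stored irradiance — can be illustrated with a minimal CPU-side sketch. This is not the paper's GPU implementation; the texture contents, function names, and sample counts below are all illustrative assumptions. A discrete CDF is built over the texels of a 2D irradiance map, and uniform random numbers are inverted through it so that bright texels are sampled more often.

```python
import numpy as np

def build_cdf(irradiance):
    """Flatten the 2D irradiance texture and build a normalized discrete CDF."""
    flat = irradiance.ravel().astype(np.float64)
    cdf = np.cumsum(flat)
    return cdf / cdf[-1]

def sample_texels(cdf, shape, n, rng):
    """Draw n texel coordinates distributed proportionally to irradiance
    by inverting the discrete CDF with uniform random numbers."""
    u = rng.random(n)
    idx = np.searchsorted(cdf, u)      # CDF inversion: maps u -> texel index
    return np.unravel_index(idx, shape)

rng = np.random.default_rng(0)
tex = rng.random((64, 64))             # stand-in for a rendered irradiance map
cdf = build_cdf(tex)
ys, xs = sample_texels(cdf, tex.shape, 10000, rng)

# Tally how often each texel was chosen; brighter texels should be hit more.
counts = np.zeros(tex.size)
np.add.at(counts, np.ravel_multi_index((ys, xs), tex.shape), 1)
```

In the paper's setting the per-sample contributions would then be weighted by the reciprocal of this sampling density, so the importance-sampled sum remains an unbiased estimate of the subsurface-scattering integral over the texture domain.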
dc.description.number: 2 (en_US)
dc.description.seriesinformation: Computer Graphics Forum (en_US)
dc.description.volume: 27 (en_US)
dc.identifier.doi: 10.1111/j.1467-8659.2008.01149.x (en_US)
dc.identifier.issn: 1467-8659 (en_US)
dc.identifier.pages: 517-526 (en_US)
dc.identifier.uri: https://doi.org/10.1111/j.1467-8659.2008.01149.x (en_US)
dc.publisher: The Eurographics Association and Blackwell Publishing Ltd (en_US)
dc.title: Real-Time Translucent Rendering Using GPU-based Texture Space Importance Sampling (en_US)