
dc.contributor.author: Andersson, Magnus
dc.contributor.author: Hasselgren, Jon
dc.contributor.author: Toth, Robert
dc.contributor.author: Akenine-Möller, Tomas
dc.contributor.editor: B. Levy and J. Kautz
dc.date.accessioned: 2015-03-03T12:29:11Z
dc.date.available: 2015-03-03T12:29:11Z
dc.date.issued: 2014
dc.identifier.issn: 1467-8659
dc.identifier.uri: http://dx.doi.org/10.1111/cgf.12303
dc.description.abstract: When rendering effects such as motion blur and defocus blur, shading can become very expensive if done in a naïve way, i.e. shading each visibility sample. To improve performance, previous work often decouples shading from visibility sampling using shader caching algorithms. We present a novel technique for reusing shading in a stochastic rasterizer. Shading is computed hierarchically and sparsely in an object-space texture, and by selecting an appropriate mipmap level for each triangle, we ensure that the shading rate is sufficiently high so that no noticeable blurring is introduced in the rendered image. Furthermore, with a two-pass algorithm, we separate shading from reuse and thus avoid GPU thread synchronization. Our method runs at real-time frame rates and is up to 3x faster than previous methods. This is an important step forward for stochastic rasterization in real time.
dc.publisher: The Eurographics Association and John Wiley and Sons Ltd.
dc.title: Adaptive Texture Space Shading for Stochastic Rendering
dc.description.seriesinformation: Computer Graphics Forum
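
The per-triangle mipmap-level selection mentioned in the abstract can be illustrated with a standard footprint-based level-of-detail computation. The sketch below is only an assumption about how such a level might be chosen from the ratio of texture-space to screen-space area; the function name, parameters, and clamping are illustrative and not taken from the paper.

```cpp
#include <algorithm>
#include <cmath>

// Minimal sketch (assumed, not the paper's implementation): pick a mip level
// of the object-space shading texture for one triangle so that roughly one
// texel maps to one screen-space sample.
// texelArea  - triangle area in texels at the finest mip level (level 0)
// screenArea - projected triangle area in screen-space samples (e.g. the
//              largest footprint over the shutter interval for motion blur)
// numLevels  - number of mip levels in the shading texture
int selectMipLevel(float texelArea, float screenArea, int numLevels)
{
    // Each mip step halves resolution in both dimensions, i.e. quarters the
    // texel count, so the area ratio maps to a level via 0.5 * log2.
    float ratio = texelArea / std::max(screenArea, 1e-6f);
    float level = 0.5f * std::log2(std::max(ratio, 1.0f));
    return std::min(static_cast<int>(level), numLevels - 1);
}
```

Rounding the level down keeps at least one shading texel per screen-space sample, which is consistent with the abstract's requirement that the shading rate stay high enough to avoid visible blurring.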

