Show simple item record

dc.contributor.author	Zirr, Tobias	en_US
dc.contributor.author	Hanika, Johannes	en_US
dc.contributor.author	Dachsbacher, Carsten	en_US
dc.contributor.editor	Chen, Min and Benes, Bedrich	en_US
dc.date.accessioned	2018-09-19T15:32:53Z
dc.date.available	2018-09-19T15:32:53Z
dc.date.issued	2018
dc.identifier.issn	1467-8659
dc.identifier.uri	https://diglib.eg.org:443/handle/10.1111/cgf13335
dc.identifier.uri	https://doi.org/10.1111/cgf.13335
dc.description.abstract	Samples with high contribution but low probability density, often called fireflies, occur in all practical Monte Carlo estimators and are part of computing unbiased estimates. For finite‐sample estimates, however, they can lead to excessive variance. Rejecting all samples classified as outliers, as suggested in previous work, leads to estimates that are too low and can cause undesirable artefacts. In this paper, we show how samples can be re‐weighted depending on their contribution and sampling frequency such that the finite‐sample estimate gets closer to the correct expected value and the variance can be controlled. For this, we first derive a theory for how samples should ideally be re‐weighted and that this would require the probability density function of the optimal sampling strategy. As this probability density function is generally unknown, we show how the discrepancy between the optimal and the actual sampling strategy can be estimated and used for re‐weighting in practice. We describe an efficient algorithm that allows for the necessary analysis of per‐pixel sample distributions in the context of Monte Carlo rendering without storing any individual samples, with only minimal changes to the rendering algorithm. It causes negligible runtime overhead, works in constant memory and is well suited for parallel and progressive rendering. The re‐weighting runs as a fast post‐process, can be controlled interactively and our approach is non‐destructive in that the unbiased result can be reconstructed at any time.	en_US
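The core idea in the abstract, re-weighting fireflies rather than rejecting them outright, can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's algorithm: the paper estimates the discrepancy between the actual and the optimal sampling density per pixel, whereas the sketch below merely attenuates a sample's excess contribution above a threshold derived from running statistics, so that outliers still contribute and the estimate is not biased as low as with hard rejection.

```python
def reweighted_mean(samples, k=3.0):
    """Illustrative firefly re-weighting (not the authors' method):
    samples whose contribution exceeds the mean by more than k standard
    deviations keep the threshold part at full weight, and only the
    excess above the threshold is attenuated."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / max(n - 1, 1)
    threshold = mean + k * var ** 0.5
    total = 0.0
    for s in samples:
        if s > threshold:
            # Attenuate only the excess contribution instead of
            # rejecting the whole sample (rejection biases low).
            total += threshold + 0.5 * (s - threshold)
        else:
            total += s
    return total / n
```

With 99 samples of value 1.0 and one firefly of 1000.0, the plain mean is 10.99, while the re-weighted mean lies between the inlier mean and the plain mean, reducing variance without discarding the outlier's information.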
dc.publisher	© 2018 The Eurographics Association and John Wiley & Sons Ltd.	en_US
dc.subject	Monte Carlo techniques
dc.subject	methods and applications
dc.subject	global illumination
dc.subject	rendering
dc.subject	progressive rendering
dc.subject	Computing methodologies → Ray tracing
dc.title	Re‐Weighting Firefly Samples for Improved Finite‐Sample Monte Carlo Estimates	en_US
dc.description.seriesinformation	Computer Graphics Forum
dc.description.sectionheaders	Articles
dc.description.volume	37
dc.description.number	6
dc.identifier.doi	10.1111/cgf.13335
dc.identifier.pages	410-421

