Title: High-Quality Parallel Depth-of-Field Using Line Samples
Authors: Tzeng, Stanley; Patney, Anjul; Davidson, Andrew; Ebeida, Mohamed S.; Mitchell, Scott A.; Owens, John D.
Editors: Carsten Dachsbacher, Jacob Munkberg, Jacopo Pantaleoni
Date issued: 2012
Date available: 2013-10-28
ISBN: 978-3-905674-41-5
ISSN: 2079-8679
DOI: http://dx.doi.org/10.2312/EGGH/HPG12/023-031

Abstract: We present a parallel method for rendering high-quality depth-of-field effects using continuous-domain line samples, and demonstrate its high performance on commodity GPUs. Our method runs at interactive rates and has very low noise. Our exploration of the problem carefully considers implementation alternatives, and transforms an originally unbounded storage requirement into a small fixed requirement, using heuristics to maintain quality. We also propose a novel blur-dependent level-of-detail scheme that helps accelerate rendering without undesirable artifacts. Our method consistently runs 4 to 5x faster than an equivalent point sampler while delivering better image quality. Our method draws parallels to related work in rendering multi-fragment effects.
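
For context, the amount of defocus blur that any depth-of-field renderer (point- or line-sampled) must resolve at each pixel is governed by the thin-lens circle of confusion. The sketch below is a minimal, self-contained illustration of that standard optics formula only; it is not the paper's line-sampling algorithm, and the function and parameter names are our own.

    #include <cmath>
    #include <cstdio>

    // Thin-lens circle-of-confusion (CoC) diameter on the image plane, in the
    // same units as the aperture diameter. This standard formula determines how
    // much blur a depth-of-field renderer must integrate per pixel; it is
    // illustrative only and unrelated to the paper's line-sampling scheme.
    //   aperture    : lens aperture diameter
    //   focalLength : lens focal length
    //   focusDist   : distance at which the lens is focused (> focalLength)
    //   depth       : distance of the shaded point from the lens
    double circleOfConfusion(double aperture, double focalLength,
                             double focusDist, double depth) {
        return aperture * focalLength * std::fabs(depth - focusDist)
             / (depth * (focusDist - focalLength));
    }

    int main() {
        // Example: 50 mm lens at f/2 (25 mm aperture), focused at 2 m,
        // evaluating blur for points at several depths (all lengths in mm).
        const double aperture = 25.0, focalLength = 50.0, focusDist = 2000.0;
        for (double depth : {1000.0, 2000.0, 4000.0, 8000.0}) {
            std::printf("depth = %6.0f mm -> CoC = %.3f mm\n", depth,
                        circleOfConfusion(aperture, focalLength, focusDist, depth));
        }
        return 0;
    }

Points far from the focus distance yield a large CoC, which is exactly where the paper's blur-dependent level-of-detail scheme reduces geometric detail to accelerate rendering.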