Title: Minimal Warping: Planning Incremental Novel-view Synthesis
Authors: Leimkühler, Thomas; Seidel, Hans-Peter; Ritschel, Tobias
Editors: Zwicker, Matthias; Sander, Pedro
Year: 2017
Date available: 2017-06-19
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.13219
Handle: https://diglib.eg.org:443/handle/10.1111/cgf13219
Pages: 001-014

Abstract: Observing that many visual effects (depth-of-field, motion blur, soft shadows, spectral effects) and several sampling modalities (time, stereo, or light fields) can be expressed as a sum of many pinhole camera images, we propose a novel, efficient image-synthesis framework that exploits the coherency among those images. We introduce the notion of "distribution flow", which represents the 2D image deformation in response to changes in the high-dimensional time, lens, area-light, spectral, etc. coordinates. Our approach plans the optimal traversal of the distribution space of all required pinhole images such that, starting from one representative root image that is incrementally changed (warped) in a minimal fashion, pixels move by at most one pixel, if at all. The incremental warping allows extremely simple warping code, typically requiring half a millisecond per pinhole image on an Nvidia GeForce GTX 980 Ti GPU. We show that the bounded sampling introduces very little error in comparison to re-rendering or a common warping-based solution. Our approach allows efficient previews for arbitrary combinations of distribution effects and imaging modalities with little noise and high visual fidelity.
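The one-pixel bound in the abstract can be illustrated with a minimal sketch. This is not the authors' GPU implementation: the function name `incremental_warp`, the dense per-pixel `flow` input, and the choice of a backward (gather) warp are all assumptions made here for a self-contained NumPy example. It only shows the core idea that each incremental step quantizes the distribution flow so no pixel moves further than one pixel.

```python
import numpy as np

def incremental_warp(image, flow):
    """One incremental warping step (illustrative sketch only).

    `flow` holds a per-pixel 2D displacement (dy, dx) for a small step in
    the high-dimensional distribution coordinates. It is quantized to
    {-1, 0, +1}, so every pixel moves by at most one pixel, if at all.
    A backward warp (gather) is used here for simplicity.
    """
    h, w = image.shape[:2]
    # Quantize and clamp the flow: at most one pixel of motion per step.
    step = np.clip(np.rint(flow), -1, 1).astype(int)      # shape (h, w, 2)
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Gather each output pixel from its (clamped) source location.
    sy = np.clip(ys - step[..., 0], 0, h - 1)
    sx = np.clip(xs - step[..., 1], 0, w - 1)
    return image[sy, sx]
```

In the spirit of the paper, such a step would be applied repeatedly along the planned traversal of the distribution space, accumulating the warped pinhole images into the final distribution-effect sum.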