Authors: Hladky, Jozef; Seidel, Hans-Peter; Steinberger, Markus
Editors: Boubekeur, Tamy; Sen, Pradeep
Date: 2019-07-14
Year: 2019
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.13780
Handle: https://diglib.eg.org:443/handle/10.1111/cgf13780

Title: Tessellated Shading Streaming

Abstract: Presenting high-fidelity 3D content on compact portable devices with low computational power is challenging. Smartphones, tablets and head-mounted displays (HMDs) suffer from thermal and battery-life constraints and thus cannot match the render quality of desktop PCs and laptops. Streaming rendering can deliver high-quality content, but it may suffer from high latency. We propose an approach that efficiently captures shading samples in object space and packs them into a texture. By streaming this texture to the client, we support temporal frame up-sampling with high fidelity, low latency and high mobility. We introduce two novel sample distribution strategies and a novel triangle representation in the shading atlas space. Since such a system requires dynamic parallelism, we propose an implementation that exploits the power of hardware-accelerated tessellation stages. Our approach allows fast decoding and rendering of extrapolated views on a client device by using hardware-accelerated interpolation between shading samples and a set of potentially visible geometry. A comparison to existing shading methods shows that our sample distributions achieve better client shading quality than previous atlas streaming approaches and outperform image-based methods in all relevant aspects.

Keywords: Computing methodologies; Rendering; Texturing; Virtual reality; Image-based rendering

Pages: 171-182