Search Results

  • Real-time Level-of-detail Strand-based Rendering
    (The Eurographics Association and John Wiley & Sons Ltd., 2025) Huang, Tao; Zhou, Yang; Lin, Daqi; Zhu, Junqiu; Yan, Ling-Qi; Wu, Kui; Wang, Beibei; Wilkie, Alexander
    We present a real-time strand-based rendering framework that ensures seamless transitions between different levels of detail (LoD) while maintaining a consistent appearance. We first introduce an aggregated BCSDF model to accurately capture both single and multiple scattering within a cluster of hairs or fibers. Building upon this, we further introduce an LoD framework for hair rendering that dynamically, adaptively, and independently replaces clusters of individual hairs with thick strands based on their projected screen widths. Through tests on diverse hairstyles with various hair colors and animations, as well as knit patches, our framework closely replicates the appearance of multiple-scattered full geometries at various viewing distances, achieving up to a 13× speedup.
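The screen-width-driven LoD selection described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the pinhole projection model, the function names, and the `threshold_px` parameter are all assumptions.

```python
def projected_width_px(world_width, distance, focal_px):
    # Pinhole projection: on-screen width in pixels of a cluster of
    # world-space width `world_width` at `distance` from the camera.
    return world_width * focal_px / distance

def select_lod(cluster_width, distance, focal_px, threshold_px=1.0):
    """Pick a representation for one hair cluster, independently of its
    neighbors: individual strands when the cluster projects to more than
    `threshold_px` pixels, a single thick strand otherwise."""
    if projected_width_px(cluster_width, distance, focal_px) > threshold_px:
        return "individual_strands"
    return "thick_strand"
```

Because the decision is per cluster, nearby clusters can hold different representations, which is what makes the transition adaptive and independent.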
  • SPaGS: Fast and Accurate 3D Gaussian Splatting for Spherical Panoramas
    (The Eurographics Association and John Wiley & Sons Ltd., 2025) Li, Junbo; Hahlbohm, Florian; Scholz, Timon; Eisemann, Martin; Tauscher, Jan-Philipp; Magnor, Marcus; Wang, Beibei; Wilkie, Alexander
    In this paper, we propose SPaGS, a high-quality, real-time free-viewpoint rendering approach from 360-degree panoramic images. While existing methods building on Neural Radiance Fields or 3D Gaussian Splatting struggle to achieve real-time frame rates and high-quality results at the same time, SPaGS combines the advantages of an explicit 3D Gaussian-based scene representation and ray-casting-based rendering to attain fast and accurate results. Central to our new approach is the exact calculation of axis-aligned bounding boxes for spherical images, which significantly accelerates omnidirectional ray casting of 3D Gaussians. We also present a new dataset of ten real-world scenes recorded with a drone that incorporates both calibrated 360-degree panoramic images and perspective images captured simultaneously, i.e., with the same flight trajectory. Our evaluation on this new dataset, as well as on established benchmarks, demonstrates that SPaGS excels over state-of-the-art methods in terms of both rendering quality and speed.
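For intuition, the conventional axis-aligned bounding box of a 3D Gaussian in the perspective case is its ±kσ extent along each axis; the paper's contribution is the exact analogue for spherical panoramas, which is considerably more involved. A minimal sketch of the standard bound, under the assumption of a diagonal-readable covariance matrix:

```python
import numpy as np

def gaussian_aabb(mean, cov, k=3.0):
    """Axis-aligned bounding box of a 3D Gaussian, taken as the +/- k-sigma
    extent per axis; the marginal standard deviation along axis i is
    sqrt(cov[i, i]). Returns (lower corner, upper corner)."""
    half = k * np.sqrt(np.diag(cov))
    return mean - half, mean + half
```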
  • Real-time Image-based Lighting of Glints
    (The Eurographics Association and John Wiley & Sons Ltd., 2025) Kneiphof, Tom; Klein, Reinhard; Wang, Beibei; Wilkie, Alexander
    Image-based lighting is a widely used technique to reproduce shading under real-world lighting conditions, especially in real-time rendering applications. A particularly challenging scenario involves materials exhibiting a sparkling or glittering appearance, caused by discrete microfacets scattered across their surface. In this paper, we propose an efficient approximation for image-based lighting of glints, enabling fully dynamic material properties and environment maps. Our novel approach is grounded in real-time glint rendering under area-light illumination and employs standard environment map filtering techniques. Crucially, our environment map filtering process is sufficiently fast to be executed on a per-frame basis. Our method assumes that the environment map is partitioned into a few homogeneous regions of constant radiance. By filtering the corresponding indicator functions with the normal distribution function, we obtain the probabilities for individual microfacets to reflect light from each region. During shading, these probabilities are utilized to hierarchically sample a multinomial distribution, facilitated by our novel dual-gated Gaussian approximation of binomial distributions. We validate that our real-time approximation is close to ground-truth renderings for a range of material properties and lighting conditions, and demonstrate robust and stable performance with little overhead over rendering glints from a single directional light. Compared to rendering smooth materials without glints, our approach requires twice as much memory to store the prefiltered environment map.
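The hierarchical multinomial sampling step can be illustrated by recursively splitting the region set in two and drawing a binomial count for each split. The plain clamped normal approximation below stands in for the paper's dual-gated Gaussian approximation; all function names here are illustrative.

```python
import math
import random

def sample_binomial_gauss(n, p, rng):
    # Gaussian approximation to Binomial(n, p), rounded and clamped to
    # [0, n]. (The paper's "dual-gated" variant is more careful; this is
    # the textbook normal approximation, for illustration only.)
    if n == 0 or p <= 0.0:
        return 0
    if p >= 1.0:
        return n
    mu = n * p
    sigma = math.sqrt(n * p * (1.0 - p))
    return min(max(round(rng.gauss(mu, sigma)), 0), n)

def sample_multinomial(n, probs, rng):
    """Draw counts from Multinomial(n, probs) hierarchically: split the
    categories in half, sample how many of the n trials fall in the left
    half, and recurse on both halves."""
    if len(probs) == 1:
        return [n]
    mid = len(probs) // 2
    left = sample_binomial_gauss(n, sum(probs[:mid]) / sum(probs), rng)
    return (sample_multinomial(left, probs[:mid], rng)
            + sample_multinomial(n - left, probs[mid:], rng))
```

The binary splitting gives logarithmic recursion depth, so only O(log K) binomial draws are needed per shading point for K regions.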
  • Reshadable Impostors with Level-of-Detail for Real-Time Distant Objects Rendering
    (The Eurographics Association and John Wiley & Sons Ltd., 2025) Wu, Xiaoloong; Zeng, Zheng; Zhu, Junqiu; Wang, Lu; Wang, Beibei; Wilkie, Alexander
    We propose a new image-based representation for real-time rendering of distant objects: Reshadable Impostors with Level-of-Detail (RiLoD). By storing compact geometric and material information captured from a few reference views, RiLoD enables reliable forward mapping to generate target views under dynamic lighting and edited material attributes. In addition, it supports seamless transitions across different levels of detail. To support reshading and LoD simultaneously while maintaining a minimal memory footprint and bandwidth requirement, our key design is a compact yet efficient representation that encodes and compresses the necessary material and geometric information in each reference view. To further improve visual fidelity, we use a reliable forward-mapping technique combined with a hole-filling filtering strategy to ensure geometric completeness and shading consistency. We demonstrate the practicality of RiLoD by integrating it into a modern real-time renderer. RiLoD delivers fast performance across a variety of test scenes, supports smooth transitions between levels of detail as the camera moves closer or farther, and avoids the typical artifacts of impostor techniques that result from neglecting the underlying geometry.
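The forward-mapping step (reprojecting reference-view texels into a target view using stored per-texel depth) can be sketched as below. The pinhole camera model and the omission of hole filling are simplifying assumptions; the function name and signature are illustrative.

```python
import numpy as np

def forward_map_points(depth, K_src, T_src_to_dst, K_dst):
    """Forward-map every texel of a reference view into a target view:
    unproject with the stored per-texel depth, apply the relative rigid
    transform, and re-project with the target intrinsics.
    Returns (2, N) target pixel coordinates and (N,) target depths.
    Hole filling for disoccluded regions is omitted."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3)
    # Unproject: camera-space points in the reference view.
    pts = (np.linalg.inv(K_src) @ pix.T) * depth.reshape(1, -1)
    # Relative rigid transform into the target camera frame.
    pts = T_src_to_dst[:3, :3] @ pts + T_src_to_dst[:3, 3:4]
    # Project into the target image plane.
    proj = K_dst @ pts
    return proj[:2] / proj[2:3], proj[2]
```

With an identity transform and identical intrinsics, every texel maps back to its own pixel, which makes the mapping easy to sanity-check.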
  • Wavelet Representation and Sampling of Complex Luminaires
    (The Eurographics Association and John Wiley & Sons Ltd., 2025) Atanasov, Asen; Koylazov, Vladimir; Wang, Beibei; Wilkie, Alexander
    We contribute a technique for rendering the illumination of complex luminaires based on wavelet-compressed light fields, while the direct appearance of the luminaire is handled with previous techniques. During a brief photon-tracing phase, we precompute the radiance field of the luminaire. Then, we employ a compression scheme designed to facilitate fast per-ray run-time reconstruction of the field and importance sampling. To treat aliasing, we propose a two-component filtering solution: a 4D Gaussian filter during the pre-computation stage and a 4D stochastic Gaussian filter during rendering. Our importance sampling strategy provides an initial guess from low-resolution, low-memory viewpoint samplers, which is subsequently refined by a hierarchical process over the wavelet frequency bands. Our technique is straightforward to integrate into rendering systems and has all the features that make it practical for production renderers: MIS compatibility, brief pre-computation, low memory requirements, and efficient field evaluation and importance sampling.
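As a 1D stand-in for the 4D wavelet compression of the radiance field, the sketch below applies an orthonormal Haar transform and keeps only the largest-magnitude coefficients. The choice of Haar and the `keep` parameter are illustrative assumptions, not details from the paper.

```python
import math

def haar_forward(x):
    """Full 1D orthonormal Haar transform; input length must be a power
    of two. Averages move to the front, details fill the back, level by
    level, so coefficients are grouped by frequency band."""
    out = list(x)
    n = len(out)
    while n > 1:
        half = n // 2
        avg = [(out[2 * i] + out[2 * i + 1]) / math.sqrt(2) for i in range(half)]
        dif = [(out[2 * i] - out[2 * i + 1]) / math.sqrt(2) for i in range(half)]
        out[:n] = avg + dif
        n = half
    return out

def compress(coeffs, keep):
    # Zero all but the `keep` largest-magnitude coefficients.
    order = sorted(range(len(coeffs)), key=lambda i: -abs(coeffs[i]))
    kept = set(order[:keep])
    return [c if i in kept else 0.0 for i, c in enumerate(coeffs)]
```

Grouping coefficients by frequency band is what makes a coarse-to-fine refinement over the bands, as in the hierarchical importance sampling above, natural to express.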