Search Results

Now showing 1 - 4 of 4
  • Item
    A Texture‐Free Practical Model for Realistic Surface‐Based Rendering of Woven Fabrics
    (Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd., 2025) Khattar, Apoorv; Zhu, Junqiu; Yan, Ling‐Qi; Montazeri, Zahra
    Rendering woven fabrics is challenging due to their complex micro-geometry and anisotropic appearance. Conventional solutions either fully model every yarn/ply/fibre for high fidelity at a high computational cost, or ignore these details and produce unrealistic close-up renderings. In this paper, we introduce a model that shares the advantages of both. Our model requires only binary patterns as input yet offers all the necessary micro-level details by adding the yarn/ply/fibre structure implicitly. Moreover, we design a double-layer representation to handle light transmission accurately, and use a constant-time (O(1)) approach to depict parallax and shadowing-masking effects accurately and efficiently in tandem. We compare our model with curve-based and surface-based approaches on different patterns and under different lighting, and evaluate against photographs to confirm that the aforementioned realistic effects are captured.
    (An illustrative sketch of the double-layer, constant-time parallax idea follows the results list.)
  • Item
    Real-time Level-of-detail Strand-based Rendering
    (The Eurographics Association and John Wiley & Sons Ltd., 2025) Huang, Tao; Zhou, Yang; Lin, Daqi; Zhu, Junqiu; Yan, Ling-Qi; Wu, Kui; Wang, Beibei; Wilkie, Alexander
    We present a real-time strand-based rendering framework that ensures seamless transitions between different levels of detail (LoD) while maintaining a consistent appearance. We first introduce an aggregated BCSDF model to accurately capture both single and multiple scattering within a cluster of hairs and fibers. Building on this, we further introduce an LoD framework for hair rendering that dynamically, adaptively, and independently replaces clusters of individual hairs with thick strands based on their projected screen widths. Through tests on diverse hairstyles with various hair colors and animations, as well as knit patches, our framework closely replicates the appearance of multiple-scattered full geometries at various viewing distances, achieving up to a 13× speedup.
    (An illustrative sketch of a screen-width-driven LoD test follows the results list.)
  • Item
    Detail-Preserving Real-Time Hair Strand Linking and Filtering
    (The Eurographics Association and John Wiley & Sons Ltd., 2025) Huang, Tao; Yuan, JunPing; Hu, Ruike; Wang, Lu; Guo, Yanwen; Chen, Bin; Guo, Jie; Zhu, Junqiu; Wang, Beibei; Wilkie, Alexander
    Realistic hair rendering remains a significant challenge in computer graphics due to the intricate microstructure of hair fibers and their anisotropic scattering properties, which make them highly sensitive to noise. Although recent advancements in image-space and 3D-space denoising and antialiasing techniques have facilitated real-time rendering in simple scenes, existing methods still struggle with excessive blurring and artifacts, particularly in fine hair details such as flyaway strands. These issues arise because current techniques often fail to preserve sub-pixel continuity and lack directional sensitivity in the filtering process. To address these limitations, we introduce a novel real-time hair filtering technique that effectively reconstructs fine fiber details while suppressing noise. Our method improves visual quality by maintaining strand-level details and ensuring computational efficiency, making it well suited for real-time applications in video games and in virtual reality (VR) and augmented reality (AR) environments.
    (An illustrative sketch of a direction-aware filter weight follows the results list.)
  • Item
    Reshadable Impostors with Level-of-Detail for Real-Time Distant Objects Rendering
    (The Eurographics Association and John Wiley & Sons Ltd., 2025) Wu, Xiaoloong; Zeng, Zheng; Zhu, Junqiu; Wang, Lu; Wang, Beibei; Wilkie, Alexander
    We propose a new image-based representation for real-time rendering of distant objects: Reshadable Impostors with Level-of-Detail (RiLoD). By storing compact geometric and material information captured from a few reference views, RiLoD enables reliable forward mapping to generate target views under dynamic lighting and edited material attributes. In addition, it supports seamless transitions across different levels of detail. To support reshading and LoD simultaneously while maintaining a minimal memory footprint and bandwidth requirement, our key design is a compact yet efficient representation that encodes and compresses the necessary material and geometric information in each reference view. To further improve visual fidelity, we use a reliable forward-mapping technique combined with a hole-filling filtering strategy to ensure geometric completeness and shading consistency. We demonstrate the practicality of RiLoD by integrating it into a modern real-time renderer. RiLoD delivers fast performance across a variety of test scenes, supports smooth transitions between levels of detail as the camera moves closer or farther, and avoids the typical artifacts of impostor techniques that result from neglecting the underlying geometry.
    (An illustrative sketch of the forward-map-and-reshade step follows the results list.)
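
Illustrative sketches

For the woven-fabric paper, the double-layer transmission and constant-time parallax ideas can be pictured with a small shading sketch. This is an editorial illustration under simplified assumptions, not the authors' model: WeaveSample, sampleWeave, yarnBRDF, yarnBTDF and the coverage constant are hypothetical placeholders.

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    struct WeaveSample {
        bool  warpOnTop;   // binary weave-pattern value at the lookup point
        float height;      // implicit yarn height driving parallax and shadowing
    };

    // Placeholder pattern lookup and single-yarn lobes so the sketch compiles;
    // a real implementation would evaluate the implicit yarn/ply/fibre model.
    static WeaveSample sampleWeave(float u, float v) {
        int cell = (static_cast<int>(std::floor(u * 8.0f)) +
                    static_cast<int>(std::floor(v * 8.0f))) & 1;
        return { cell != 0, 0.05f };
    }
    static Vec3 yarnBRDF(const Vec3&, const Vec3&, bool warpOnTop) {
        return warpOnTop ? Vec3{0.30f, 0.25f, 0.20f} : Vec3{0.20f, 0.25f, 0.30f};
    }
    static Vec3 yarnBTDF(const Vec3&, const Vec3&, bool) {
        return {0.05f, 0.05f, 0.05f};
    }

    // Double-layer evaluation with a constant-time parallax offset: shift the
    // lookup UV along the tangent-plane view direction, scaled by the implicit
    // yarn height, instead of marching rays through explicit yarn geometry.
    Vec3 evalFabric(float u, float v, const Vec3& wi, const Vec3& wo)
    {
        float h  = sampleWeave(u, v).height;
        float du = -wo.x / std::max(wo.z, 1e-4f) * h;
        float dv = -wo.y / std::max(wo.z, 1e-4f) * h;
        WeaveSample s = sampleWeave(u + du, v + dv);

        // The front yarn layer reflects; the back layer is reached only by light
        // transmitted through the front, so it is weighted by (1 - coverage).
        Vec3  fr = yarnBRDF(wi, wo, s.warpOnTop);
        Vec3  bk = yarnBTDF(wi, wo, !s.warpOnTop);
        float k  = 1.0f - 0.7f;                    // placeholder front coverage
        return { fr.x + k * bk.x, fr.y + k * bk.y, fr.z + k * bk.z };
    }

The UV offset stands in for the constant-time parallax lookup: one extra pattern fetch rather than a march through explicit yarn geometry.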
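
For the level-of-detail strand rendering paper, the per-cluster LoD decision driven by projected screen width can be sketched as a simple threshold test. The pinhole projection and the two-pixel threshold below are assumptions for illustration, not the paper's actual criterion.

    struct Camera { float focalLengthPx; };   // pinhole focal length in pixels

    // Projected width in pixels of a cluster of radius 'clusterRadius'
    // (world units) seen at distance 'depth' from the camera.
    float projectedWidthPx(const Camera& cam, float clusterRadius, float depth)
    {
        return 2.0f * clusterRadius * cam.focalLengthPx / depth;
    }

    enum class StrandLoD { IndividualStrands, AggregatedThickStrand };

    // Each cluster decides independently: once its on-screen footprint shrinks
    // below a pixel-width threshold, its individual hairs are replaced by one
    // thick strand shaded with an aggregated scattering model.
    StrandLoD selectLoD(const Camera& cam, float clusterRadius, float depth,
                        float thresholdPx = 2.0f)
    {
        return projectedWidthPx(cam, clusterRadius, depth) > thresholdPx
                   ? StrandLoD::IndividualStrands
                   : StrandLoD::AggregatedThickStrand;
    }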
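
The hair filtering paper stresses directional sensitivity, which suggests a kernel whose weights depend on strand-tangent alignment as well as spatial and depth proximity. The sketch below is a hedged guess using simple Gaussian terms; the feature set and constants are not taken from the paper.

    #include <cmath>

    struct Sample {
        float px, py;     // pixel position
        float tx, ty;     // screen-space strand tangent (normalized)
        float depth;      // view-space depth
    };

    // A neighbour contributes more when it is spatially close, lies at a similar
    // depth, and its strand tangent is aligned with the centre sample's, which
    // keeps thin flyaway strands from being blurred across their neighbours.
    float filterWeight(const Sample& c, const Sample& n,
                       float sigmaPos = 1.5f, float sigmaDepth = 0.05f)
    {
        float dx = n.px - c.px, dy = n.py - c.py;
        float wPos   = std::exp(-(dx * dx + dy * dy) / (2.0f * sigmaPos * sigmaPos));
        float dz     = n.depth - c.depth;
        float wDepth = std::exp(-(dz * dz) / (2.0f * sigmaDepth * sigmaDepth));
        float align  = std::fabs(c.tx * n.tx + c.ty * n.ty);  // |cos| of tangent angle
        return wPos * wDepth * align * align;                 // square to sharpen
    }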
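
For RiLoD, forward mapping with reshading can be pictured as splatting each reference-view texel, with its stored position, normal, and material, into the target view and shading it under the current light; uncovered pixels would then be repaired by a hole-filling pass. The payload layout, orthographic projection, and Lambertian shading below are simplified placeholders, not the paper's encoding.

    #include <algorithm>
    #include <vector>

    struct Vec3 { float x, y, z; };

    struct ReferenceTexel {      // compact per-texel geometry + material payload
        Vec3 position;           // reconstructed world-space position
        Vec3 normal;             // world-space normal (unit length)
        Vec3 albedo;             // material attribute, editable at runtime
    };

    struct TargetPixel {
        Vec3  color   = {0.0f, 0.0f, 0.0f};
        float depth   = 1e30f;
        bool  covered = false;
    };

    static float dot3(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Splat one reference-view texel into the target view: reshade it under the
    // current (possibly dynamic) light, then write it with a depth test. Pixels
    // left uncovered are repaired later by a hole-filling pass.
    void forwardMapTexel(const ReferenceTexel& t, const Vec3& lightDir,
                         std::vector<TargetPixel>& target, int width, int height)
    {
        // Placeholder projection: orthographic onto a [0,1]^2 viewport.
        int x = static_cast<int>(t.position.x * width);
        int y = static_cast<int>(t.position.y * height);
        if (x < 0 || x >= width || y < 0 || y >= height) return;

        TargetPixel& p = target[y * width + x];
        if (t.position.z >= p.depth) return;       // keep the closest surface

        float ndotl = std::max(0.0f, dot3(t.normal, lightDir));
        p.color   = { t.albedo.x * ndotl, t.albedo.y * ndotl, t.albedo.z * ndotl };
        p.depth   = t.position.z;
        p.covered = true;
    }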