Title: Detail-Preserving Real-Time Hair Strand Linking and Filtering
Authors: Huang, Tao; Yuan, JunPing; Hu, Ruike; Wang, Lu; Guo, Yanwen; Chen, Bin; Guo, Jie; Zhu, Junqiu; Wang, Beibei; Wilkie, Alexander
Date: 2025-06-20 (2025)
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.70176
Handle: https://diglib.eg.org/handle/10.1111/cgf70176
Pages: 11
License: Attribution 4.0 International License
CCS Concepts: Computing methodologies → Collision detection; Hardware → Sensors and actuators; Hardware → PCB design and layout

Abstract: Realistic hair rendering remains a significant challenge in computer graphics due to the intricate microstructure of hair fibers and their anisotropic scattering properties, which make them highly sensitive to noise. Although recent advances in image-space and 3D-space denoising and antialiasing techniques have enabled real-time rendering in simple scenes, existing methods still struggle with excessive blurring and artifacts, particularly in fine hair details such as flyaway strands. These issues arise because current techniques often fail to preserve sub-pixel continuity and lack directional sensitivity in the filtering process. To address these limitations, we introduce a novel real-time hair filtering technique that effectively reconstructs fine fiber details while suppressing noise. Our method improves visual quality by maintaining strand-level details and ensuring computational efficiency, making it well suited for real-time applications such as video games and virtual reality (VR) and augmented reality (AR) environments.