Title: Stereo-consistent Screen Space Reflection
Authors: Wu, XiaoLoong; Xu, Yanning; Wang, Lu
Editors: Garces, Elena; Haines, Eric
Date available: 2024-06-25
Date issued: 2024
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.15159
URI: https://diglib.eg.org/handle/10.1111/cgf15159
Pages: 11

Abstract: Screen Space Reflection (SSR) can reliably achieve highly efficient reflective effects, significantly enhancing users' sense of realism in real-time applications. However, when applied directly to stereo rendering, popular SSR algorithms produce inconsistencies because the left and right eyes see different screen-space information. This inconsistency, which does not occur in natural vision, results in visual discomfort. This paper analyzes and demonstrates how screen-space geometries, fade boundaries, and reflection samples introduce inconsistent cues. Exploiting the complementary nature of the two views' screen information, we introduce a stereo-aware SSR method that alleviates the visual discomfort caused by screen-space disparities. By comparing our stereo-aware SSR with conventional SSR and ray-traced results, we show that our approach mitigates the inconsistencies stemming from screen-space differences while adding only an affordable performance overhead for real-time rendering.

License: Attribution 4.0 International License (CC BY 4.0)
CCS Concepts: Computing methodologies → Rendering; Rendering → Real-Time Rendering
Keywords: Screen space reflection; Stereo consistent