FoReCast: Real-time Foveated Rendering and Unicasting for Immersive Remote Telepresence

Authors: Tefera, Yonas T.; Mazzanti, Dario; Anastasi, Sara; Caldwell, Darwin G.; Fiorini, Paolo; Deshpande, Nikhil
Editors: Uchiyama, Hideaki; Normand, Jean-Marie
Date issued: 2022 (available 2022-11-29)
ISBN: 978-3-03868-179-3
ISSN: 1727-530X
DOI: https://doi.org/10.2312/egve.20221278
URI: https://diglib.eg.org:443/handle/10.2312/egve20221278
Pages: 75-84 (10 pages)
License: Attribution 4.0 International License
CCS Concepts: Human-centered computing -> Visualization design and evaluation methods; Human-centered computing -> Mixed / augmented reality
Keywords: Human-centered computing; Visualization design and evaluation methods; Mixed / augmented reality

Abstract: Modern virtual reality (VR) interfaces are increasingly used as visualization and interaction media in 3D telepresence systems. Remote environments scanned with RGB-D cameras and represented as dense point clouds are visualized in VR in real time to increase the user's immersion. Such interfaces therefore require high-quality, low-latency, and high-throughput transmission; in other words, the entire system pipeline, from data acquisition to visualization in VR, has to be optimized for high performance. Point-cloud data in particular suffers from network latency and throughput limitations that degrade the user experience in telepresence. The human visual system offers insight into addressing these challenges: the eye's visual acuity is sharpest at the center of its field of view and falls off towards the periphery. This acuity fall-off inspired the design of a novel immersive 3D data visualization framework that facilitates the processing, transmission, and visualization of dense point clouds in VR. The proposed FoReCast framework reduces both latency and throughput by more than 60%. A preliminary user study shows that the framework does not significantly affect the user's quality of experience in immersive remote telepresence.
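The abstract states only the guiding principle (acuity falls off with eccentricity, so peripheral point-cloud density can be reduced before transmission) without giving the method's details. As a rough illustration of that idea, here is a minimal NumPy sketch that subsamples a point cloud by angular distance from the gaze direction; the function name, the linear fall-off, and the threshold values are illustrative assumptions, not FoReCast's actual algorithm.

    import numpy as np

    def foveated_downsample(points, eye_pos, gaze_dir, inner_deg=10.0,
                            outer_deg=60.0, min_keep=0.05, rng=None):
        """Subsample a point cloud so density falls off with angular
        eccentricity from the gaze direction, mimicking the eye's
        acuity fall-off. Returns indices of the points to keep.

        points   : (N, 3) array of 3D points in the viewer's frame
        eye_pos  : (3,) viewer position
        gaze_dir : (3,) gaze direction (need not be unit length)
        """
        rng = rng or np.random.default_rng()
        gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)

        # Angular eccentricity (degrees) of each point from the gaze ray.
        vecs = points - eye_pos
        vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
        ecc = np.degrees(np.arccos(np.clip(vecs @ gaze_dir, -1.0, 1.0)))

        # Keep everything inside the foveal cone, then ramp the keep
        # probability down linearly to min_keep at outer_deg and beyond.
        t = np.clip((ecc - inner_deg) / (outer_deg - inner_deg), 0.0, 1.0)
        keep_prob = 1.0 - (1.0 - min_keep) * t

        return np.nonzero(rng.random(len(points)) < keep_prob)[0]

    # Example: dense cloud in front of the viewer, gaze along +Z.
    pts = np.random.uniform(-2, 2, size=(100_000, 3)) + np.array([0.0, 0.0, 3.0])
    kept = foveated_downsample(pts, eye_pos=np.zeros(3),
                               gaze_dir=np.array([0.0, 0.0, 1.0]))
    print(f"kept {len(kept)} of {len(pts)} points")

A real foveated telepresence pipeline would drive gaze_dir from the headset's eye tracker and tune the fall-off to a perceptual acuity model; this sketch shows only the basic eccentricity-based selection that such filtering rests on.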