Title: 3D Reconstruction for Tele-Immersion in 360° Live Stream
Authors: Dluzniewski, Clément; Chekirou, Hakim; Garrec, Jérémie Le; Andriot, Claude; Noël, Frédéric
Editors: Jean-Marie Normand; Maki Sugimoto; Veronica Sundstedt
Date issued: 2023-12-04
Year: 2023
ISBN: 978-3-03868-218-9
ISSN: 1727-530X
DOI: 10.2312/egve.20231325 (https://doi.org/10.2312/egve.20231325)
Handle: https://diglib.eg.org:443/handle/10.2312/egve20231325
Pages: 167-175 (9 pages)
License: Attribution 4.0 International License
CCS Concepts: Computing methodologies → Virtual reality; Computing methodologies → Reconstruction

Abstract: Nowadays, most volumetric tele-immersion systems rely on a multi-camera setup to capture a dynamic 3D scene. With such acquisition devices, developing a mobile tele-immersion system seems compromised, as a lot of equipment would have to be moved. One promising way to achieve a mobile system is to use a single 360° camera and develop ways of reconstructing a dynamic scene in 3D, in real time, from a single point of view. We therefore propose an approach to navigate freely within a 360° video captured with a static camera. The approach considers three types of elements in the scene: the environment, the object of interest, and the people, and relies on a different 3D representation for each type of element. Distinguishing the scene elements enables a real-time method, by reconstructing static elements once and using fast-computable 3D representations for dynamic elements. As the method runs in real time, we develop a streaming pipeline that enables XR users to move live within the camera stream.
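
The abstract's core idea, reconstructing static elements once while refreshing dynamic elements with cheap per-frame representations, can be pictured with the following minimal Python sketch. It is not the authors' implementation; all class and function names (ElementType, process_frame, etc.) and the placeholder representations are illustrative assumptions about how such a per-element-type dispatch might be organized.

```python
# Illustrative sketch (not the authors' code): per-element-type 3D representations.
# Static elements (environment) are reconstructed once; dynamic elements
# (object of interest, people) get a fast-computable proxy every frame.
from dataclasses import dataclass
from enum import Enum, auto


class ElementType(Enum):
    ENVIRONMENT = auto()   # static background, reconstructed once
    OBJECT = auto()        # object of interest, updated every frame
    PERSON = auto()        # people, updated every frame


@dataclass
class SceneElement:
    label: str
    kind: ElementType
    representation: object = None  # mesh, point cloud, billboard, ... (placeholder)


def reconstruct_static(element, frame):
    """Placeholder for a one-off reconstruction of static geometry."""
    return {"type": "static_mesh", "label": element.label}


def update_dynamic(element, frame):
    """Placeholder for a cheap, per-frame representation of dynamic content."""
    return {"type": "fast_proxy", "label": element.label}


def process_frame(elements, frame):
    """Update the 3D scene for one incoming 360° video frame."""
    for element in elements:
        if element.kind is ElementType.ENVIRONMENT:
            if element.representation is None:   # reconstruct only once
                element.representation = reconstruct_static(element, frame)
        else:                                    # objects and people: every frame
            element.representation = update_dynamic(element, frame)
    return elements
```

Under this assumed structure, a streaming pipeline would call process_frame for each decoded 360° frame and forward the updated representations to the XR clients.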