Authors: Sereno, Mickael; Besançon, Lonni; Isenberg, Tobias; Madeiras Pereira, João; Raidou, Renata Georgia
Title: Supporting Volumetric Data Visualization and Analysis by Combining Augmented Reality Visuals with Multi-Touch Input
Date issued: 2019 (accessioned/available 2019-06-02)
ISBN: 978-3-03868-088-8
DOI: https://doi.org/10.2312/eurp.20191136
Handle: https://diglib.eg.org:443/handle/10.2312/eurp20191136
Pages: 21-23
Keywords: Human-centered computing; Scientific visualization; Interaction techniques; Synchronous editors

Abstract: We present our vision and steps toward implementing a collaborative 3D data analysis tool based on wearable Augmented Reality Head-Mounted Displays (AR-HMDs). We envision a hybrid environment that combines such AR-HMD devices with multi-touch devices, allowing multiple collaborators to visualize and jointly discuss volumetric datasets. The multi-touch devices let users manipulate the datasets' states, either publicly or privately, and also provide 2D input for, e.g., drawing annotations. The headsets allow each user to visualize the dataset in physically correct perspective stereoscopy, either in the public space or in their private space. The public space is visible to all, with modifications shared in real time. The private space allows each user to investigate the same dataset with their own preferences, for instance with a different clipping range. A user can later decide to merge their private space into the public one or discard the changes.