Real-time Ambient Fusion of Commodity Tracking Systems for Virtual Reality

Authors: Fountain, Jake; Smith, Shamus P.
Editors: Robert W. Lindeman, Gerd Bruder, Daisuke Iwai
Date: 2017-11-21 (2017)
ISBN: 978-3-03868-038-3
ISSN: 1727-530X
DOI: https://doi.org/10.2312/egve.20171331
URI: https://diglib.eg.org:443/handle/10.2312/egve20171331
Pages: 1-8

Abstract:
Cross-compatibility of virtual reality devices is limited by the difficulty of aligning and fusing data between systems. In this paper, a plugin for ambiently aligning the reference frames of virtual reality tracking systems is presented. The core contribution is a procedure for ambient calibration, which describes ambient behaviors for data gathering, system calibration, and fault detection. Data is collected ambiently from in-application, self-directed movements, and calibration is performed automatically between dependent sensor systems. Sensor fusion is then performed by taking the most accurate data for a given body part among all systems. The procedure was applied to aligning a Kinect v2 with an HTC Vive and an Oculus Rift in a variety of common virtual reality scenarios, and the results were compared to alignment performed with a gold-standard OptiTrack motion capture system. Typical errors were 20 cm and 4° relative to the ground truth, which compares favorably with the accepted accuracy of the Kinect v2. Data collection for full calibration took on average 13 seconds of in-application, self-directed movement. This work represents an essential development towards plug-and-play sensor fusion for virtual reality technology.

CCS Concepts: Computing methodologies → Tracking; Camera calibration; Computer systems organization → Real-time system architecture; Software and its engineering → Software libraries and repositories
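The calibration step described in the abstract, aligning the reference frames of two tracking systems from ambiently collected movement data, typically reduces to estimating a rigid transform between paired position samples of the same body part seen by both systems. The paper's exact procedure is not given in this record; the following is a minimal sketch of one standard approach (the Kabsch/SVD method), with the function name `align_frames` chosen here for illustration:

```python
import numpy as np

def align_frames(src, dst):
    """Estimate the rigid transform (R, t) mapping points expressed in the
    `src` tracking system's frame onto paired points in the `dst` frame.

    Uses the Kabsch method: center both point sets, take the SVD of their
    cross-covariance, and correct for reflections so R is a proper rotation.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src - src.mean(axis=0)           # centered source samples
    dst_c = dst - dst.mean(axis=0)           # centered destination samples
    H = src_c.T @ dst_c                      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # proper rotation only
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Example: recover a known rotation about z and a translation from
# synthetic paired samples (standing in for, e.g., Kinect vs. Vive data).
rng = np.random.default_rng(0)
src = rng.standard_normal((20, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -1.0, 2.0])
dst = src @ R_true.T + t_true
R, t = align_frames(src, dst)
```

With noise-free correspondences the estimate matches the true transform to machine precision; with real tracker data the residual after alignment gives a fault-detection signal of the kind the abstract mentions.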