Interactive Fusion and Tracking For Multi-Modal Spatial Data Visualization

dc.contributor.author: Elshehaly, Mai
dc.contributor.author: Gracanin, Denis
dc.contributor.author: Gad, Mohamed
dc.contributor.author: Elmongui, Hicham G.
dc.contributor.author: Matkovic, Kresimir
dc.contributor.editor: H. Carr, K.-L. Ma, and G. Santucci
dc.date.accessioned: 2015-05-22T12:51:30Z
dc.date.available: 2015-05-22T12:51:30Z
dc.date.issued: 2015
dc.description.abstract: Scientific data acquired through sensors which monitor natural phenomena, as well as simulation data that imitate time-identified events, have fueled the need for interactive techniques to successfully analyze and understand trends and patterns across space and time. We present a novel interactive visualization technique that fuses ground truth measurements with simulation results in real-time to support the continuous tracking and analysis of spatiotemporal patterns. We start by constructing a reference model which densely represents the expected temporal behavior, and then use GPU parallelism to advect measurements on the model and track their location at any given point in time. Our results show that users can interactively fill the spatio-temporal gaps in real world observations, and generate animations that accurately describe physical phenomena.
dc.description.number: 3
dc.description.sectionheaders: Multi-modal and Multi-field
dc.description.seriesinformation: Computer Graphics Forum
dc.description.volume: 34
dc.identifier.doi: 10.1111/cgf.12637
dc.identifier.pages: 251-260
dc.identifier.uri: https://doi.org/10.1111/cgf.12637
dc.publisher: The Eurographics Association and John Wiley & Sons Ltd.
dc.subject: I.3.6 [Computer Graphics]
dc.subject: Methodology and Techniques
dc.subject: Interaction techniques
dc.title: Interactive Fusion and Tracking For Multi-Modal Spatial Data Visualization
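The abstract's advection-and-tracking step can be pictured with a minimal CUDA sketch, assuming the reference model is stored as a dense 2-D velocity grid, measurements are advected with explicit Euler steps, and one thread handles one measurement. All names, the grid layout, bilinear sampling, and the integration scheme are illustrative assumptions, not details taken from the paper.

    // Hypothetical sketch only: advect measurement points through a dense 2-D
    // reference velocity field with explicit Euler steps, one thread per point.
    // The grid representation and Euler integration are assumptions for this
    // illustration, not the paper's actual method.
    #include <cuda_runtime.h>

    struct Vec2 { float x, y; };

    // Bilinearly sample the reference velocity field at a fractional position,
    // clamping to the grid so the four neighboring cells are always valid.
    __device__ Vec2 sampleVelocity(const Vec2* field, int w, int h, Vec2 p)
    {
        float fx = fminf(fmaxf(p.x, 0.0f), w - 1.001f);
        float fy = fminf(fmaxf(p.y, 0.0f), h - 1.001f);
        int x0 = (int)fx, y0 = (int)fy;
        float tx = fx - x0, ty = fy - y0;
        Vec2 v00 = field[y0 * w + x0],       v10 = field[y0 * w + x0 + 1];
        Vec2 v01 = field[(y0 + 1) * w + x0], v11 = field[(y0 + 1) * w + x0 + 1];
        Vec2 v;
        v.x = (1 - ty) * ((1 - tx) * v00.x + tx * v10.x) + ty * ((1 - tx) * v01.x + tx * v11.x);
        v.y = (1 - ty) * ((1 - tx) * v00.y + tx * v10.y) + ty * ((1 - tx) * v01.y + tx * v11.y);
        return v;
    }

    // One thread per measurement: advance its position by nSteps Euler steps
    // of size dt through the reference field, leaving the final tracked
    // location in pos[i].
    __global__ void advectMeasurements(Vec2* pos, int nPoints,
                                       const Vec2* field, int w, int h,
                                       float dt, int nSteps)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= nPoints) return;
        Vec2 p = pos[i];
        for (int s = 0; s < nSteps; ++s) {
            Vec2 v = sampleVelocity(field, w, h, p);
            p.x += dt * v.x;
            p.y += dt * v.y;
        }
        pos[i] = p;
    }

A launch such as advectMeasurements<<<(nPoints + 255) / 256, 256>>>(d_pos, nPoints, d_field, w, h, dt, nSteps) would move every measurement forward to the requested time in parallel; a higher-order integrator or texture-based field sampling would be natural refinements of this sketch.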