Gaze-driven Object Tracking for Real Time Rendering

Authors: Mantiuk, Radoslaw; Bazyluk, Bartosz; Mantiuk, Rafal K.
Editors: I. Navazo, P. Poulin
Date issued: 2013 (made available: 2015-02-28)
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.12036
Subject: Computer Graphics [I.3.6]: Methodology and Techniques

Abstract: To efficiently deploy eye-tracking within 3D graphics applications, we present a new probabilistic method that predicts the patterns of a user's eye fixations in animated 3D scenes from noisy eye-tracker data. The proposed method utilises both the eye-tracker data and the known information about the 3D scene to improve accuracy, robustness and stability. Eye-tracking can thus be used, for example, to induce focal cues via gaze-contingent depth-of-field rendering, add intuitive controls to a video game, and create a highly reliable scene-aware saliency model. The computed probabilities rely on the consistency of the gaze scan-paths with the position and velocity of a moving or stationary target. The temporal characteristic of eye fixations is imposed by a Hidden Markov model, which steers the solution towards the most probable fixation patterns. The derivation of the algorithm is driven by data from two eye-tracking experiments: the first experiment provides actual eye-tracker readings and the position of the target to be tracked; the second experiment is used to derive a JND-scaled (Just Noticeable Difference) quality metric that quantifies the perceived loss of quality due to errors of the tracking algorithm. Data from both experiments are used to justify design choices, and to calibrate and validate the tracking algorithms. This novel method outperforms commonly used fixation algorithms and is able to track objects smaller than the nominal error of an eye-tracker.
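The core idea described in the abstract can be sketched as an HMM-style forward recursion: maintain a probability for each candidate scene object, favour self-transitions (fixations persist over time), and weight each hypothesis by how consistent the noisy gaze sample is with the object's position *and* velocity. The sketch below is a minimal illustration of that scheme, not the paper's actual algorithm; the parameters `SIGMA_POS`, `SIGMA_VEL` and `P_STAY`, and all function names, are illustrative assumptions.

```python
import math

# Assumed, illustrative parameters -- not values from the paper.
SIGMA_POS = 2.0   # tolerance on gaze-target position error (e.g. degrees)
SIGMA_VEL = 4.0   # tolerance on gaze-target velocity mismatch (e.g. deg/s)
P_STAY    = 0.95  # HMM self-transition probability: fixations tend to persist

def emission(gaze_pos, gaze_vel, target_pos, target_vel):
    """Likelihood that a gaze sample was produced while tracking a target,
    based on consistency of both position and velocity (smooth pursuit)."""
    dp = math.dist(gaze_pos, target_pos)
    dv = math.dist(gaze_vel, target_vel)
    return (math.exp(-0.5 * (dp / SIGMA_POS) ** 2) *
            math.exp(-0.5 * (dv / SIGMA_VEL) ** 2))

def update(prior, gaze_pos, gaze_vel, targets):
    """One HMM forward step: apply the transition model, then reweight
    each target hypothesis by its emission likelihood and normalise."""
    n = len(targets)
    p_switch = (1.0 - P_STAY) / max(n - 1, 1)
    # Transition: mostly stay on the current target, rarely switch.
    pred = [P_STAY * prior[i] + p_switch * (sum(prior) - prior[i])
            for i in range(n)]
    post = [pred[i] * emission(gaze_pos, gaze_vel, *targets[i])
            for i in range(n)]
    z = sum(post) or 1.0
    return [p / z for p in post]

# Usage: a stationary target near the gaze point quickly dominates a
# distant moving one, even though individual gaze samples are noisy.
targets = [((0.0, 0.0), (0.0, 0.0)),    # stationary object at origin
           ((10.0, 0.0), (5.0, 0.0))]   # object moving away to the right
belief = [0.5, 0.5]
for _ in range(5):
    belief = update(belief, (0.5, 0.0), (0.0, 0.0), targets)
```

The self-transition term is what gives the tracker its temporal stability: a single noisy sample near another object is not enough to flip the decision, which mirrors the abstract's point that the HMM steers the solution towards the most probable fixation patterns.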