Title: An Efficient Interpolation Approach for Low Cost Unrestrained Gaze Tracking in 3D Space
Authors: Scheel, Christian; Islam, A. B. M. Tariqul; Staadt, Oliver
Editors: Dirk Reiners, Daisuke Iwai, Frank Steinicke
Date issued: 2016-12-07
Year: 2016
ISBN: 978-3-03868-012-3
ISSN: 1727-530X
DOI: 10.2312/egve.20161427
URL: https://doi.org/10.2312/egve.20161427
Handle: https://diglib.eg.org:443/handle/10.2312/egve20161427
Pages: 1-8
Keywords: I.4.8 [Image Processing and Computer Vision]: Scene Analysis - Tracking, Sensor fusion; I.3.6 [Computer Graphics]: Methodology and Techniques - Interaction techniques

Abstract: We present a first attempt at an interpolation-based approach that combines a mobile eye tracker with an external tracking system to obtain a 3D gaze vector for a freely moving user. Our method captures calibration points at varying distances, together with pupil positions and head positions/orientations, while the user moves freely within the range of the external tracking system. For this approach, it is not necessary to know the position of the eye or the orientation of the eye coordinate system. Beyond the calibration of the external tracking system, the head-tracked eye tracker is calibrated in a one-step process that only requires the user to look at the calibration points. No separate calibration of the eye tracker is needed, because the raw pupil positions it reports can be used directly. Moreover, we use low-cost tracking hardware that is affordable for a wide range of application setups. Our experiments and evaluation show that the average visual-angle accuracy is better than 0.85 degrees under unrestrained head movement with a relatively low-cost system.
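To illustrate the kind of interpolation-based calibration the abstract describes, the following minimal sketch maps raw pupil positions and tracked head poses directly to 3D gaze targets collected during calibration, without modeling the eye position or eye coordinate system. It is not the authors' implementation: the feature layout, the choice of scipy's RBFInterpolator, and the placeholder calibration data are assumptions made for this example.

```python
# Illustrative sketch only (assumed feature layout and interpolator choice),
# not the paper's actual method.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
n_calib = 40

# Hypothetical calibration recording: each row holds pupil position (2D),
# head position (3D), and head orientation quaternion (4D) captured while
# the user fixated a known calibration target. Placeholder random data here.
calib_features = rng.uniform(-1.0, 1.0, size=(n_calib, 9))

# Known 3D positions of the calibration targets, as reported by the external
# tracking system (placeholder data).
calib_targets = rng.uniform(-1.0, 1.0, size=(n_calib, 3)) * np.array([2.0, 1.5, 3.0])

# Fit a smooth interpolant from raw eye/head measurements to 3D points of
# regard; the raw pupil position is used directly, with no separate
# eye-tracker calibration step.
gaze_model = RBFInterpolator(calib_features, calib_targets,
                             kernel="thin_plate_spline", smoothing=1e-3)

def estimate_gaze_point(pupil_xy, head_position, head_orientation):
    """Interpolate a 3D point of regard for a new pupil/head sample."""
    sample = np.concatenate([pupil_xy, head_position, head_orientation])[None, :]
    return gaze_model(sample)[0]

def estimate_gaze_vector(pupil_xy, head_position, head_orientation):
    """Derive a unit gaze direction from the tracked head position toward
    the interpolated point of regard."""
    point = estimate_gaze_point(pupil_xy, head_position, head_orientation)
    direction = point - np.asarray(head_position, dtype=float)
    return direction / np.linalg.norm(direction)

# Example query with an arbitrary pupil sample and head pose.
print(estimate_gaze_vector([0.1, -0.2], [0.0, 1.6, 0.0], [0.0, 0.0, 0.0, 1.0]))
```

In this sketch the interpolator absorbs the unknown geometry between the eye camera and the tracked head marker, which is why no explicit eye position or eye coordinate system appears; how the actual paper parameterizes the interpolation should be taken from the publication itself.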