Authors: Kanno, Mao; Isogawa, Mariko; Hasegawa, Shoichi; Sakata, Nobuchika
Editor: Sundstedt, Veronica
Date issued: 2024-11-29
ISBN: 978-3-03868-245-5
ISSN: 1727-530X
DOI: https://doi.org/10.2312/egve.20241367
URI: https://diglib.eg.org/handle/10.2312/egve20241367
Title: Learning-based Event-based Human Gaze Tracking with Blink Detection
Pages: 5 pages
License: Attribution 4.0 International License

Abstract: This paper proposes an eye-tracking system using a CNN-LSTM network that relies only on event data. The method holds potential for future applications in a wide range of fields, including AR/VR headsets, healthcare, and sports. Compared to traditional frame-based camera methods, our proposed approach achieves high FPS and low power consumption by utilizing event cameras. To improve estimation accuracy, our gaze estimation system incorporates blink detection, which was absent in existing systems. Our results show that our method achieves better performance than existing studies.
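The abstract describes a CNN-LSTM network that estimates gaze from event data with an auxiliary blink-detection output, but the record does not detail the architecture. The sketch below is only an illustration of how such a model could be wired up, assuming the event stream is binned into per-timestep two-channel event-count frames; the class name `EventGazeNet`, the layer sizes, and the two-head design (gaze coordinates plus blink logit) are assumptions, not the authors' implementation.

```python
# Illustrative CNN-LSTM gaze estimator over event frames (not the paper's code).
# Assumes events are accumulated into 2-channel frames (positive/negative polarity counts).
import torch
import torch.nn as nn

class EventGazeNet(nn.Module):
    def __init__(self, hidden_size=128):
        super().__init__()
        # Per-frame CNN encoder over event-count images.
        self.cnn = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # LSTM aggregates per-frame features over time.
        self.lstm = nn.LSTM(input_size=64, hidden_size=hidden_size, batch_first=True)
        # Two heads: 2D gaze coordinates and a blink logit per timestep.
        self.gaze_head = nn.Linear(hidden_size, 2)
        self.blink_head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, time, 2, H, W) stacked event-count frames
        b, t, c, h, w = x.shape
        feats = self.cnn(x.view(b * t, c, h, w)).view(b, t, -1)
        out, _ = self.lstm(feats)
        gaze = self.gaze_head(out)    # (b, t, 2) gaze estimate per timestep
        blink = self.blink_head(out)  # (b, t, 1) blink logit per timestep
        return gaze, blink

if __name__ == "__main__":
    model = EventGazeNet()
    dummy = torch.randn(4, 10, 2, 64, 64)  # 4 sequences of 10 timesteps
    gaze, blink = model(dummy)
    print(gaze.shape, blink.shape)  # torch.Size([4, 10, 2]) torch.Size([4, 10, 1])
```

In a setup like this, the blink logit could be used to mask or down-weight gaze predictions during detected blinks, which is one plausible way the reported accuracy improvement from blink detection might be realized.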