Authors: Asish, Sarker Monojit; Hossain, Ekram; Kulshreshth, Arun K.; Borst, Christoph W.
Editors: Orlosky, Jason; Reiners, Dirk; Weyers, Benjamin
Dates: 2021-09-07; 2021-09-07; 2021
ISBN: 978-3-03868-142-7
ISSN: 1727-530X
DOI: https://doi.org/10.2312/egve.20211326
URI: https://diglib.eg.org:443/handle/10.2312/egve20211326
Abstract: Educational VR may increase engagement and retention compared to traditional learning, for some topics or students. However, a student could still become distracted and disengaged due to stress, mind-wandering, unwanted noise, external alerts, etc. Student eye gaze can be useful for detecting distraction. For example, we previously considered gaze visualizations to help teachers understand student attention and better identify or guide distracted students. However, it is not practical for a teacher to monitor a large number of student indicators while teaching. To help filter students by distraction level, we consider a deep learning approach to detect distraction from gaze data. The key aspects are: (1) we created a labeled eye gaze dataset (3.4M data points) from an educational VR environment, (2) we propose an automatic system to gauge a student's distraction level from gaze data, and (3) we apply and compare three deep neural classifiers for this purpose. A proposed CNN-LSTM classifier achieved an accuracy of 89.8% for classifying distraction, per educational activity section, into one of three levels.
Subjects: Computing methodologies; Deep learning; Virtual reality; Applied computing; Education
Title: Deep Learning on Eye Gaze Data to Classify Student Distraction Level in an Educational VR Environment -- Honorable Mention for Best Paper Award
DOI: 10.2312/egve.20211326
Pages: 37-46
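
For readers unfamiliar with the model family named in the abstract, the following is a minimal sketch (in PyTorch) of what a CNN-LSTM sequence classifier over windowed gaze data with three output classes could look like. The architecture, input features (e.g., gaze x/y and pupil measurements per frame), window length, and layer sizes here are illustrative assumptions, not details taken from the paper or this record.

import torch
import torch.nn as nn

class CNNLSTMDistractionClassifier(nn.Module):
    """Illustrative CNN-LSTM: per-frame gaze features -> one of three distraction levels."""

    def __init__(self, n_features: int = 4, n_classes: int = 3,
                 conv_channels: int = 32, lstm_hidden: int = 64):
        super().__init__()
        # 1D convolution over time extracts local gaze-motion patterns.
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(kernel_size=2),
        )
        # LSTM summarizes the pooled feature sequence.
        self.lstm = nn.LSTM(conv_channels, lstm_hidden, batch_first=True)
        self.head = nn.Linear(lstm_hidden, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features), e.g. one window of gaze samples per activity section.
        x = self.conv(x.transpose(1, 2))            # -> (batch, channels, time/2)
        _, (h_n, _) = self.lstm(x.transpose(1, 2))  # final hidden state of the LSTM
        return self.head(h_n[-1])                   # logits over the 3 distraction levels

if __name__ == "__main__":
    model = CNNLSTMDistractionClassifier()
    dummy = torch.randn(8, 120, 4)   # 8 hypothetical windows, 120 time steps, 4 gaze features
    print(model(dummy).shape)        # torch.Size([8, 3])

In this sketch the convolutional front end captures short-term gaze dynamics while the LSTM aggregates them over a whole window; the actual feature set, labeling scheme, and training details used to reach the reported 89.8% accuracy are described in the paper itself.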