
    Deep Learning on Eye Gaze Data to Classify Student Distraction Level in an Educational VR Environment -- Honorable Mention for Best Paper Award

    View/Open
    037-046.pdf (4.743 MB)
    Date
    2021
    Author
    Asish, Sarker Monojit
    Hossain, Ekram
    Kulshreshth, Arun K.
    Borst, Christoph W.
    Abstract
    Educational VR may increase engagement and retention compared to traditional learning, for some topics or students. However, a student could still become distracted and disengaged due to stress, mind-wandering, unwanted noise, external alerts, etc. Student eye gaze can be useful for detecting distraction. For example, we previously considered gaze visualizations to help teachers understand student attention and better identify or guide distracted students. However, it is not practical for a teacher to monitor a large number of student indicators while teaching. To help filter students by distraction level, we consider a deep learning approach to detect distraction from gaze data. The key aspects are: (1) we created a labeled eye gaze dataset (3.4M data points) from an educational VR environment, (2) we propose an automatic system to gauge a student's distraction level from gaze data, and (3) we apply and compare three deep neural classifiers for this purpose. A proposed CNN-LSTM classifier achieved an accuracy of 89.8% for classifying distraction, per educational activity section, into one of three levels.
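
    The sketch below illustrates, in broad strokes, the kind of CNN-LSTM sequence classifier the abstract describes, written with tf.keras. It is not the architecture reported in the paper: the window length, feature count, layer sizes, and training setup are assumptions chosen only to show how 1D convolutions and an LSTM can be combined to map fixed-length gaze windows to one of three distraction levels.

    # Minimal illustrative sketch of a CNN-LSTM classifier for 3-level distraction
    # classification from gaze feature windows. All sizes below are assumptions,
    # not values taken from the paper.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models

    SEQ_LEN = 120      # assumed: gaze samples per window
    N_FEATURES = 6     # assumed: e.g. gaze direction components, pupil size, etc.
    N_CLASSES = 3      # distraction levels: low / medium / high

    def build_cnn_lstm(seq_len=SEQ_LEN, n_features=N_FEATURES, n_classes=N_CLASSES):
        model = models.Sequential([
            layers.Input(shape=(seq_len, n_features)),
            # 1D convolutions extract local temporal patterns from the gaze signal.
            layers.Conv1D(64, kernel_size=5, activation="relu", padding="same"),
            layers.MaxPooling1D(pool_size=2),
            layers.Conv1D(128, kernel_size=5, activation="relu", padding="same"),
            layers.MaxPooling1D(pool_size=2),
            # The LSTM aggregates longer-range temporal dependencies across the window.
            layers.LSTM(64),
            layers.Dropout(0.3),
            layers.Dense(n_classes, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    if __name__ == "__main__":
        # Synthetic stand-in data; the paper instead uses a labeled VR eye-gaze
        # dataset of roughly 3.4M data points.
        X = np.random.rand(256, SEQ_LEN, N_FEATURES).astype("float32")
        y = np.random.randint(0, N_CLASSES, size=256)
        model = build_cnn_lstm()
        model.summary()
        model.fit(X, y, epochs=1, batch_size=32, validation_split=0.2)

    The general design mirrors the CNN-LSTM idea: convolutions summarize short-term gaze dynamics, the LSTM aggregates them over the activity window, and a softmax outputs one of the three distraction levels.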
    BibTeX
    @inproceedings {10.2312:egve.20211326,
    booktitle = {ICAT-EGVE 2021 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
    editor = {Orlosky, Jason and Reiners, Dirk and Weyers, Benjamin},
    title = {{Deep Learning on Eye Gaze Data to Classify Student Distraction Level in an Educational VR Environment -- Honorable Mention for Best Paper Award}},
    author = {Asish, Sarker Monojit and Hossain, Ekram and Kulshreshth, Arun K. and Borst, Christoph W.},
    year = {2021},
    publisher = {The Eurographics Association},
    ISSN = {1727-530X},
    ISBN = {978-3-03868-142-7},
    DOI = {10.2312/egve.20211326}
    }
    URI
    https://doi.org/10.2312/egve.20211326
    https://diglib.eg.org:443/handle/10.2312/egve20211326
    Collections
    • ICAT-EGVE2021
