
    Eye-Tracking-Based Prediction of User Experience in VR Locomotion Using Machine Learning

    View/Open
    v41i7pp589-599.pdf (4.405 MB)
    ProjektDeal version (4.493 MB)
    Date
    2022
    Author
    Gao, Hong
    Kasneci, Enkelejda
    Pay-Per-View via TIB Hannover:
    Check whether this item/paper is available.

    Abstract
    VR locomotion is one of the most important design features of VR applications and is widely studied. When evaluating locomotion techniques, user experience is usually the first consideration, as it provides direct insights into the usability of the locomotion technique and users' thoughts about it. In the literature, user experience is typically measured with post-hoc questionnaires or surveys, while users' behavioral (i.e., eye-tracking) data during locomotion, which can reveal deeper subconscious thoughts of users, has rarely been considered and thus remains to be explored. To this end, we investigate the feasibility of classifying users experiencing VR locomotion into L-UE and H-UE (i.e., low- and high-user-experience groups) based on eye-tracking data alone. To collect data, a user study was conducted in which participants navigated a virtual environment using five locomotion techniques and their eye-tracking data was recorded. A standard questionnaire assessing the usability and participants' perception of the locomotion technique was used to establish the ground truth of the user experience. We trained our machine learning models on the eye-tracking features extracted from the time-series data using a sliding window approach. The best random forest model achieved an average accuracy of over 0.7 in 50 runs. Moreover, the SHapley Additive exPlanations (SHAP) approach uncovered the underlying relationships between eye-tracking features and user experience, and these findings were further supported by the statistical results. Our research provides a viable tool for assessing user experience with VR locomotion, which can further drive the improvement of locomotion techniques. Moreover, our research benefits not only VR locomotion, but also VR systems whose design needs to be improved to provide a good user experience.
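    The abstract describes a three-step pipeline: sliding-window feature extraction from eye-tracking time series, a random forest classifier separating L-UE from H-UE, and SHAP-based feature attribution. The sketch below is not the authors' code; it only illustrates that kind of pipeline on synthetic data. The signal names (pupil diameter, gaze velocity), window length, summary statistics, and hyperparameters are illustrative assumptions, not details taken from the paper.

    Example (Python):

    import numpy as np
    import pandas as pd
    import shap
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    def window_features(signal, win, step, name):
        """Slide a fixed-length window over one eye-tracking signal and
        summarize each window with simple statistics (mean / std / range)."""
        rows = []
        for start in range(0, len(signal) - win + 1, step):
            w = signal[start:start + win]
            rows.append({f"{name}_mean": w.mean(),
                         f"{name}_std": w.std(),
                         f"{name}_range": w.max() - w.min()})
        return pd.DataFrame(rows)

    # Synthetic stand-in for recorded sessions (pupil diameter, gaze velocity);
    # sizes, window length, and overlap are hypothetical.
    n_sessions, samples_per_session = 40, 600
    win, step = 120, 60

    X_parts, y_parts = [], []
    for s in range(n_sessions):
        label = s % 2  # 0 = L-UE, 1 = H-UE (ground truth would come from the questionnaire)
        pupil = rng.normal(3.0 + 0.2 * label, 0.3, samples_per_session)
        gaze_vel = rng.gamma(2.0 + label, 15.0, samples_per_session)
        feats = pd.concat([window_features(pupil, win, step, "pupil"),
                           window_features(gaze_vel, win, step, "gaze_vel")], axis=1)
        X_parts.append(feats)
        y_parts.append(np.full(len(feats), label))

    X = pd.concat(X_parts, ignore_index=True)
    y = np.concatenate(y_parts)

    # Train and evaluate a random forest on the window-level features.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))

    # SHAP attribution: which eye-tracking features drive the H-UE prediction.
    explainer = shap.TreeExplainer(clf)
    shap_values = explainer.shap_values(X_test)
    # Depending on the shap version this is a list (one array per class) or a 3D array;
    # take the values for the positive (H-UE) class either way.
    vals = shap_values[1] if isinstance(shap_values, list) else shap_values[..., 1]
    importance = pd.Series(np.abs(vals).mean(axis=0), index=X.columns).sort_values(ascending=False)
    print(importance)

    For brevity this sketch splits the data at the window level; the paper reports an average accuracy above 0.7 over 50 runs, and a faithful evaluation would split by participant or session so that windows from the same recording never appear in both the training and test sets.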
    BibTeX
    @article {10.1111:cgf.14703,
    journal = {Computer Graphics Forum},
    title = {{Eye-Tracking-Based Prediction of User Experience in VR Locomotion Using Machine Learning}},
    author = {Gao, Hong and Kasneci, Enkelejda},
    year = {2022},
    publisher = {The Eurographics Association and John Wiley \& Sons Ltd.},
    ISSN = {1467-8659},
    DOI = {10.1111/cgf.14703}
    }
    URI
    https://doi.org/10.1111/cgf.14703
    https://diglib.eg.org:443/handle/10.1111/cgf14703
    Collections
    • 41-Issue 7

    Eurographics Association copyright © 2013 - 2023 