
    Expanding the Freedom of Eye-gaze Input Interface using Round-Trip Eye Movement under HMD Environment

    View/Open
    021-022.pdf (362.6 KB)
    Date
    2019
    Author
    Matsuno, Shogo
    Sato, Hironobu
    Abe, Kiyohiko
    Ohyama, Minoru
    Abstract
    In this paper, we propose a gaze movement detection algorithm required to implement a gaze-movement input interface using an HMD's built-in eye tracking system. Most input devices used in current virtual reality and augmented reality are hand-held controllers, hand gestures, head tracking, or voice input, even though the HMD is worn on the head. To use eye behavior as a hands-free input modality, we therefore consider a gaze input interface that does not depend on the measurement accuracy of the eye tracking device. The proposed method assumes eye movement input, as distinct from the gaze-position input typically implemented with an eye tracking system. Specifically, by using reciprocating (round-trip) eye movement in an oblique direction as an input channel, it aims to realize an input method that neither blocks the view with a screen display nor hinders the acquisition of other gaze-based meta-information. The proposed algorithm is implemented on an HMD, and the detection accuracy of the round-trip eye movement is evaluated experimentally. Averaged over five subjects, the detection accuracy was 90%. The results show that the method is accurate enough to build an input interface based on eye movement.
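    The idea of detecting a round-trip eye movement along an oblique direction can be illustrated with a minimal sketch; this is not the authors' implementation, and the axis angle, thresholds, window length, and function names below are illustrative assumptions only.

    # Minimal sketch (assumptions, not the paper's algorithm): detect an
    # out-and-back gaze excursion along an oblique axis from a stream of
    # 2D gaze samples, e.g. normalized gaze coordinates from an HMD's
    # built-in eye tracker.
    import math

    OBLIQUE_AXIS = (math.cos(math.radians(45)), math.sin(math.radians(45)))  # 45-degree diagonal (assumed)
    AMPLITUDE_THRESHOLD = 0.15   # minimum excursion along the axis (assumed units)
    RETURN_TOLERANCE = 0.05      # how close the gaze must return to the start (assumed)
    MAX_SAMPLES = 90             # time window, e.g. about 1 s at 90 Hz (assumed)

    def detect_round_trip(samples):
        """Return True if the gaze moves out along the oblique axis and
        back within the window. `samples` is a list of (x, y) positions."""
        if len(samples) < 3:
            return False
        window = samples[-MAX_SAMPLES:]
        x0, y0 = window[0]
        # Project each sample's displacement from the start onto the oblique axis.
        proj = [(x - x0) * OBLIQUE_AXIS[0] + (y - y0) * OBLIQUE_AXIS[1]
                for x, y in window]
        peak = max(proj, key=abs)
        # Round trip: a large excursion followed by a return near the origin.
        return abs(peak) >= AMPLITUDE_THRESHOLD and abs(proj[-1]) <= RETURN_TOLERANCE

    One plausible benefit of projecting onto an oblique axis, consistent with the abstract, is that a deliberate diagonal out-and-back gesture is easy to separate from ordinary horizontal and vertical gaze shifts, so no on-screen target is needed to trigger the input.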
    BibTeX
    @inproceedings {10.2312:egve.20191298,
    booktitle = {ICAT-EGVE 2019 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments - Posters and Demos},
    editor = {Kakehi, Yasuaki and Hiyama, Atsushi},
    title = {{Expanding the Freedom of Eye-gaze Input Interface using Round-Trip Eye Movement under HMD Environment}},
    author = {Matsuno, Shogo and Sato, Hironobu and Abe, Kiyohiko and Ohyama, Minoru},
    year = {2019},
    publisher = {The Eurographics Association},
    ISSN = {1727-530X},
    ISBN = {978-3-03868-097-0},
    DOI = {10.2312/egve.20191298}
    }
    URI
    https://doi.org/10.2312/egve.20191298
    https://diglib.eg.org:443/handle/10.2312/egve20191298
    Collections
    • ICAT-EGVE2019 - Posters and Demos
