Towards Environment- and Task-Independent Locomotion Prediction for Haptic VR

dc.contributor.author: Varzandeh, Shokoofeh
dc.contributor.author: Vasylevska, Khrystyna
dc.contributor.author: Vonach, Emanuel
dc.contributor.author: Kaufmann, Hannes
dc.contributor.editor: Hasegawa, Shoichi
dc.contributor.editor: Sakata, Nobuchika
dc.contributor.editor: Sundstedt, Veronica
dc.date.accessioned: 2024-11-29T06:42:35Z
dc.date.available: 2024-11-29T06:42:35Z
dc.date.issued: 2024
dc.description.abstract: The use of robots presenting physical props has significantly enhanced the haptic experience in virtual reality. Autonomous mobile robots have made haptic interaction in large walkable virtual environments feasible, but they also bring new challenges. For effective operation, a mobile robot must not only track the user but also predict the user's position several seconds ahead, so that it can plan and navigate the shared space safely and in a timely manner. This paper presents a novel environment- and task-independent concept for locomotion-based prediction of the user's position within a chosen range. Our approach supports the dynamic placement of haptic content with minimal restrictions. We validate it on a real use case by making predictions within a range of 2 m to 4 m, or 2 s to 5 s. We also discuss adapting the approach to arbitrary space sizes and configurations with minimal collection of real data. Finally, we suggest optimal utilization strategies and discuss the limitations of our approach.
dc.description.sectionheaders: Haptics
dc.description.seriesinformation: ICAT-EGVE 2024 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments
dc.identifier.doi: 10.2312/egve.20241356
dc.identifier.isbn: 978-3-03868-245-5
dc.identifier.issn: 1727-530X
dc.identifier.pages: 10 pages
dc.identifier.uri: https://doi.org/10.2312/egve.20241356
dc.identifier.uri: https://diglib.eg.org/handle/10.2312/egve20241356
dc.publisher: The Eurographics Association
dc.rights: Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: CCS Concepts: Human-centered computing → Virtual reality; Interaction techniques
dc.title: Towards Environment- and Task-Independent Locomotion Prediction for Haptic VR
Files
egve20241356.pdf (15.57 MB, Adobe Portable Document Format)