ICAT-EGVE 2017 - Posters and Demos
ISBN 978-3-03868-052-9
https://diglib.eg.org:443/handle/10.2312/2631903

Estimation of 3D Finger Postures with wearable device measuring Skin Deformation on Back of Hand
https://diglib.eg.org:443/handle/10.2312/egve20171384
Published: 2017-01-01
Kuno, Wakaba; Sugiura, Yuta; Asano, Nao; Kawai, Wataru; Sugimoto, Maki
Editors: Tony Huang and Arindam Dey
We propose a method for reconstructing hand posture by measuring the deformation of the back of the hand with a wearable device. Our method constructs a regression model from hand-posture data captured by a depth camera and skin-deformation data captured by several photo-reflective sensors attached to the wearable device. Using this regression model, the hand posture is reconstructed from the photo-reflective sensor data in real time. Finger posture can be estimated without hindering natural finger movement, since the deformation of the back of the hand is measured rather than the finger positions directly. In our demonstration, users can see their own finger posture reflected in a virtual environment.
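The pipeline the abstract describes - calibrate a regression model on paired (sensor, posture) frames, then map live sensor readings to posture - can be sketched as follows. This is a minimal illustration, not the authors' actual model: the sensor count, joint parameterization, and regressor type are assumptions, and the calibration data here is synthetic.

```python
import numpy as np

# Hypothetical sketch: linear ridge regression mapping N photo-reflective
# sensor readings to K finger-joint angles. Dimensions (8 sensors, 15
# joint angles) and the linear model are illustrative assumptions only.

def fit_ridge(X, Y, lam=1e-6):
    """Fit W so that X @ W approximates Y (sensor frames -> joint angles)."""
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ Y)

# Synthetic stand-in for the depth-camera calibration session:
# 100 frames of 8 sensor channels paired with 15 joint angles.
rng = np.random.default_rng(0)
W_true = rng.normal(size=(8, 15))
X = rng.normal(size=(100, 8))
Y = X @ W_true
W = fit_ridge(X, Y)

# At run time, each incoming sensor frame is mapped to a posture estimate.
x_live = rng.normal(size=(1, 8))
pose = x_live @ W
print(pose.shape)  # (1, 15)
```

In practice a nonlinear regressor may fit skin deformation better than this linear stand-in; the sketch only shows the calibrate-then-predict structure.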

Comparative Evaluation of Sensor Devices for Micro-Gestures
https://diglib.eg.org:443/handle/10.2312/egve20171382
Published: 2017-01-01
Simmons, H.; Devi, R.; Ens, Barrett; Billinghurst, Mark
Editors: Tony Huang and Arindam Dey
This paper presents a comparative evaluation of two gesture recognition sensors and their ability to detect small movements known as micro-gestures. In this work we explore the capabilities of these devices by testing whether users can reliably use the sensors to select a target with a simple 1D user interface element. We implemented three distinct gestures: a large gesture of moving the whole hand up and down; a smaller gesture of moving a finger up and down; and a small movement of the thumb against the forefinger to drive a virtual slider. Demo participants will be able to experience these three gestures with two sensing devices, a Leap Motion and a Google Soli.

Holo Worlds Infinite: Procedural Spatial Aware AR Content
https://diglib.eg.org:443/handle/10.2312/egve20171383
Published: 2017-01-01
Lawrence, Louise M.; Hart, Jonathon Derek; Billinghurst, Mark
Editors: Tony Huang and Arindam Dey
We developed an Augmented Reality (AR) application that procedurally generates content and programmatically places it on the floor, using awareness of its spatial surroundings to generate and position virtual content. We created a prototype that can serve as the basis of a city-simulation game playable on the floor of any room, but the approach could also be used for many other applications.
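The core idea - fit procedurally generated content to a detected floor region - can be sketched in a few lines. This is an illustrative assumption, not the demo's actual generation rules: the cell size, height range, and function names are hypothetical, and a real AR app would take the floor bounds from spatial mapping rather than from parameters.

```python
import random

# Hypothetical sketch: tile a rectangular detected-floor region (metres)
# into cells and assign each cell a randomly sized virtual "building".
# All parameters here are illustrative, not from the paper.

def generate_city(floor_w, floor_d, cell=0.5, seed=42):
    """Return (x, z, height) tuples, one building per floor cell."""
    rng = random.Random(seed)
    buildings = []
    x = 0.0
    while x + cell <= floor_w:
        z = 0.0
        while z + cell <= floor_d:
            height = rng.uniform(0.1, 1.0)  # building height in metres
            buildings.append((x + cell / 2, z + cell / 2, height))
            z += cell
        x += cell
    return buildings

# A 2 m x 1 m floor patch with 0.5 m cells gives a 4 x 2 grid.
city = generate_city(2.0, 1.0)
print(len(city))  # 8
```

Because generation is seeded and driven by the measured floor extent, the same room reproducibly yields the same layout while different rooms yield different ones.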

An AR Network Cabling Tutoring System for Wiring a Rack
https://diglib.eg.org:443/handle/10.2312/egve20171381
Published: 2017-01-01
Herbert, B. M.; Weerasinghe, A.; Ens, Barrett; Billinghurst, Mark; Wigley, G.
Editors: Tony Huang and Arindam Dey
We present a network cabling tutoring system that guides learners through cabling a network topology by overlaying virtual icons and arrows on the ports. The system determines the network state by parsing switch output, so it does not depend on network protocols being functional. A server provides a web-based user interface and communicates with an external intelligent tutoring system, the Generalized Intelligent Framework for Tutoring. Learners view the AR annotations on a tablet, though support for the HoloLens HMD will be added soon.
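Deriving network state by parsing switch CLI output, rather than relying on live protocols, might look like the following. The sample output format and function name are assumptions for illustration; the paper's actual parser and switch model are not specified in the abstract.

```python
import re

# Illustrative stand-in for text captured from a switch console, in a
# Cisco-like "show interfaces status" style (hypothetical format).
SAMPLE_OUTPUT = """\
Gi0/1  connected    10
Gi0/2  notconnect   --
Gi0/3  connected    20
"""

def parse_port_states(text):
    """Map port name -> True if a cable link is up on that port."""
    states = {}
    for line in text.splitlines():
        m = re.match(r"(\S+)\s+(connected|notconnect)", line)
        if m:
            states[m.group(1)] = m.group(2) == "connected"
    return states

print(parse_port_states(SAMPLE_OUTPUT))
# {'Gi0/1': True, 'Gi0/2': False, 'Gi0/3': True}
```

A tutoring loop could diff this parsed state against the target topology to decide which ports still need AR guidance overlays.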