
dc.contributor.author    Matsumoto, Keigo    en_US
dc.contributor.author    Muta, Masahumi    en_US
dc.contributor.author    Cheng, Kelvin    en_US
dc.contributor.author    Masuko, Soh    en_US
dc.contributor.editor    Tony Huang and Arindam Dey    en_US
dc.date.accessioned    2017-11-21T15:42:14Z
dc.date.available    2017-11-21T15:42:14Z
dc.date.issued    2017
dc.identifier.isbn    978-3-03868-052-9
dc.identifier.uri    http://dx.doi.org/10.2312/egve.20171374
dc.identifier.uri    https://diglib.eg.org:443/handle/10.2312/egve20171374
dc.description.abstract    Along with the spread of augmented reality (AR) using head-mounted displays or smart glasses, attempts have been made to present information by superimposing it on people and objects. In general, people are constantly moving and rarely stay stationary, so the superimposed AR information is likely to move with them. However, it is often difficult to follow and select moving targets. We propose two novel techniques, TagToPlace and TagAlong, which help users select moving targets using head orientation. We conducted a user study comparing our proposed techniques to a conventional gaze-selection method, DwellTime. The results showed that our proposed techniques outperform the conventional one in terms of throughput when selecting moving targets.    en_US
dc.publisher    The Eurographics Association    en_US
dc.subject    H.5.1 [INFORMATION INTERFACES AND PRESENTATION (e.g., HCI)]
dc.subject    Multimedia Information Systems
dc.subject    Artificial, augmented, and virtual realities
dc.title    Selecting Moving Targets in AR using Head Orientation    en_US
dc.description.seriesinformation    ICAT-EGVE 2017 - Posters and Demos
dc.description.sectionheaders    Posters B
dc.identifier.doi    10.2312/egve.20171374
dc.identifier.pages    21-22
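
The abstract above compares head-orientation selection of moving AR targets against a conventional dwell-based baseline (DwellTime). The record does not describe any implementation, so the following is only a minimal sketch of a generic dwell-time selector driven by head orientation; the Vec3/DwellSelector names, the 3-degree angular threshold, the 1-second dwell duration, and the simulated target motion are all illustrative assumptions, not the authors' method for TagToPlace or TagAlong.

```python
"""Minimal sketch of dwell-time selection with head orientation.

Generic illustration of the conventional DwellTime baseline named in the
abstract; thresholds and target motion below are assumptions for the example.
"""

import math
from dataclasses import dataclass


@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def normalized(self) -> "Vec3":
        n = math.sqrt(self.x ** 2 + self.y ** 2 + self.z ** 2)
        return Vec3(self.x / n, self.y / n, self.z / n)


def angle_between(a: Vec3, b: Vec3) -> float:
    """Angle in degrees between two direction vectors."""
    a, b = a.normalized(), b.normalized()
    dot = max(-1.0, min(1.0, a.x * b.x + a.y * b.y + a.z * b.z))
    return math.degrees(math.acos(dot))


class DwellSelector:
    """Fires a selection once the head ray stays within an angular
    threshold of the target direction for a continuous dwell period."""

    def __init__(self, dwell_time_s: float = 1.0, threshold_deg: float = 3.0):
        self.dwell_time_s = dwell_time_s
        self.threshold_deg = threshold_deg
        self._accumulated = 0.0

    def update(self, head_forward: Vec3, to_target: Vec3, dt: float) -> bool:
        """Call once per frame; returns True on the frame the selection fires."""
        if angle_between(head_forward, to_target) <= self.threshold_deg:
            self._accumulated += dt
        else:
            # Head ray left the target: reset the dwell timer.
            self._accumulated = 0.0
        return self._accumulated >= self.dwell_time_s


if __name__ == "__main__":
    selector = DwellSelector()
    dt = 1.0 / 60.0  # assumed 60 Hz head-tracking updates
    for frame in range(300):
        t = frame * dt
        # Hypothetical target drifting horizontally in front of the user.
        target_dir = Vec3(0.05 * math.sin(t), 0.0, 1.0)
        # Hypothetical head orientation following the target with a small lag.
        head_dir = Vec3(0.05 * math.sin(t - 0.05), 0.0, 1.0)
        if selector.update(head_dir, target_dir, dt):
            print(f"Target selected at t = {t:.2f} s")
            break
```

The throughput metric mentioned in the abstract is conventionally computed, per ISO 9241-9 style Fitts'-law evaluation, as effective index of difficulty divided by movement time; this record does not state the exact formulation the authors used.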

