
dc.contributor.author: Nakamura, Atsuyuki
dc.contributor.author: Kiyokawa, Kiyoshi
dc.contributor.author: Ratsamee, Photchara
dc.contributor.author: Mashita, Tomohiro
dc.contributor.author: Uranishi, Yuki
dc.contributor.author: Takemura, Haruo
dc.contributor.editor: Robert W. Lindeman and Gerd Bruder and Daisuke Iwai
dc.date.accessioned: 2017-11-21T15:42:36Z
dc.date.available: 2017-11-21T15:42:36Z
dc.date.issued: 2017
dc.identifier.isbn: 978-3-03868-038-3
dc.identifier.issn: 1727-530X
dc.identifier.uri: http://dx.doi.org/10.2312/egve.20171332
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/egve20171332
dc.description.abstract: In recent years, motion capture technology for measuring body movement has been used in many fields. Moreover, motion capture of multiple people is becoming necessary in multi-user virtual reality (VR) and augmented reality (AR) environments. It is desirable for motion capture to require no wearable devices, so that natural motion can be captured easily. Some systems achieve this with an RGB-D camera fixed in the environment, but the user then has to stay in front of the fixed RGB-D camera. Therefore, this research proposes a motion capture technique for a multi-user VR/AR environment using head-mounted displays (HMDs) that neither limits the working range of the user nor requires any wearable devices. In the proposed technique, an RGB-D camera is attached to each HMD and the users capture each other's motion mutually. Motion capture accuracy is improved by correcting the depth image. A prototype system has been implemented to evaluate the effectiveness of the proposed method, and motion capture accuracy has been compared under two conditions, with and without depth image correction, while rotating the RGB-D camera. As a result, it was confirmed that the proposed method decreased the number of frames with erroneous motion capture by 49% to 100% compared with the case without depth image correction.
dc.publisher: The Eurographics Association
dc.subject: Human-centered computing
dc.subject: Mixed/augmented reality
dc.subject: Virtual reality
dc.subject: Collaborative interaction
dc.title: A Mutual Motion Capture System for Face-to-face Collaboration
dc.description.seriesinformation: ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments
dc.description.sectionheaders: Tracking
dc.identifier.doi: 10.2312/egve.20171332
dc.identifier.pages: 9-16
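
Note: the abstract above states that motion capture accuracy is improved by correcting the depth image while the HMD-mounted RGB-D camera rotates. The record does not include the correction algorithm itself, so the Python sketch below is only a rough illustration of the general idea of compensating a depth image for a known camera rotation; the function name, the intrinsic matrix K, and the rotation R are assumptions for illustration, not the authors' method. It back-projects each depth pixel to a 3D point, rotates the points into a reference camera pose, and re-projects them.

import numpy as np

def correct_depth_for_rotation(depth, K, R):
    """Warp a depth image to compensate for a known camera rotation R.

    depth : (H, W) float array of depth values in metres (0 = no measurement)
    K     : (3, 3) camera intrinsic matrix
    R     : (3, 3) rotation matrix from the capture pose to the reference pose
    """
    H, W = depth.shape
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]

    # Back-project every pixel to a 3D point in the camera frame.
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)

    # Rotate the point cloud into the reference camera frame.
    pts = pts @ R.T

    # Re-project and splat into the corrected depth image (nearest-pixel).
    corrected = np.zeros_like(depth)
    pts = pts[pts[:, 2] > 0]
    u2 = np.round(pts[:, 0] * fx / pts[:, 2] + cx).astype(int)
    v2 = np.round(pts[:, 1] * fy / pts[:, 2] + cy).astype(int)
    inside = (u2 >= 0) & (u2 < W) & (v2 >= 0) & (v2 < H)
    u2, v2, z2 = u2[inside], v2[inside], pts[inside, 2]
    # Write far points first so nearer surfaces overwrite them.
    order = np.argsort(-z2)
    corrected[v2[order], u2[order]] = z2[order]
    return corrected

In the mutual setup described in the abstract, R could plausibly be obtained from the HMD's head tracking, so that each user's depth frames are stabilized before the partner's motion is estimated; this pairing is likewise an assumption made only to ground the sketch.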

