dc.contributor.author: Ponton, Jose Luis
dc.contributor.author: Monclús, Eva
dc.contributor.author: Pelechano, Nuria
dc.contributor.editor: Pelechano, Nuria
dc.contributor.editor: Vanderhaeghe, David
dc.description.abstract: The use of self-avatars in a VR application can enhance presence and embodiment, which leads to a better user experience. In collaborative VR, it also facilitates non-verbal communication. Currently, it is possible to track a few body parts with inexpensive trackers and then apply IK methods to animate a character. However, the correspondence between trackers and avatar joints is typically fixed ad hoc, which is enough to animate the avatar but causes noticeable mismatches between the user's body pose and the avatar's. In this paper, we present a fast and easy-to-set-up system to compute exact offset values, unique to each user, which leads to improvements in avatar movement. Our user study shows that the Sense of Embodiment increased significantly when using exact offsets as opposed to fixed ones. We also allowed users to see a semitransparent avatar overlaid on their real body to objectively evaluate the quality of the avatar movement with our technique.
dc.publisher: The Eurographics Association
dc.rights: Attribution 4.0 International License
dc.subject: CCS Concepts: Human-centered computing --> User models; Virtual reality
dc.subject: Human centered computing
dc.subject: User models
dc.subject: Virtual reality
dc.title: AvatarGo: Plug and Play self-avatars for VR
dc.description.seriesinformation: Eurographics 2022 - Short Papers
dc.description.sectionheaders: Animation and Simulation
dc.identifier.pages: 4 pages
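The abstract describes computing per-user offsets between physical trackers and avatar joints, rather than using a fixed ad hoc correspondence. A minimal sketch of that general idea (not the paper's actual implementation; all function names, the calibration pose, and the coordinate values are illustrative assumptions):

```python
# Hypothetical sketch of per-user tracker-to-joint offset calibration,
# as described in the abstract. During a one-time calibration pose, the
# offset from each tracker to its corresponding avatar joint is stored;
# at runtime, live tracker positions are shifted by those offsets to
# produce per-user IK targets. Names and values are illustrative only.

def calibrate_offsets(tracker_positions, joint_positions):
    """For each tracked body part, store the vector from the tracker
    to the matching avatar joint, measured in a known calibration pose."""
    return {
        part: tuple(j - t for t, j in zip(tracker_positions[part],
                                          joint_positions[part]))
        for part in tracker_positions
    }

def apply_offsets(tracker_positions, offsets):
    """Shift live tracker positions by the calibrated offsets so the
    IK targets follow the user's actual joints, not the tracker shells."""
    return {
        part: tuple(t + o for t, o in zip(tracker_positions[part],
                                          offsets[part]))
        for part in tracker_positions
    }

# Calibration: a tracker strapped slightly above the left ankle joint.
calib_trackers = {"left_foot": (0.10, 0.05, 0.00)}
calib_joints   = {"left_foot": (0.10, 0.00, 0.00)}
offsets = calibrate_offsets(calib_trackers, calib_joints)

# At runtime, the corrected IK target tracks the user's joint position.
live = {"left_foot": (0.30, 0.06, 0.10)}
targets = apply_offsets(live, offsets)
```

With a fixed ad hoc offset, the same tracker placement on a different user would yield a mismatched joint target; recalibrating per user is what the per-user offsets correct.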
