ICAT-EGVE2017
Browsing ICAT-EGVE2017 by Issue Date
Now showing 1 - 20 of 34
Editors: Robert W. Lindeman, Gerd Bruder, and Daisuke Iwai

Item: Tour de Tune - Auditory-game-motor Synchronisation in Exergames (The Eurographics Association, 2017)
Finlayson, Jenna; Peterson, Jamie; Free, Joshua; Lo, Michael; Shaw, Lindsay A.; Lutteroth, Christof; Wünsche, Burkhard C.
Exergaming has been heralded as a promising approach to increasing physical activity in hard-to-reach populations such as sedentary young adults. By combining physical activity with entertainment, researchers and developers hope that the excitement and immersion provided by a computer game will increase motivation and dissociate players from the discomfort of physical exercise. A different approach to improving physical activity is the use of music. Music, particularly when synchronised with the rhythm of exercise, has been shown to increase performance and decrease perceived effort for the same performance. So far, little research has been done on the combined effect of music and gameplay in exergaming. In this paper we investigate the effect of game-music synchronisation in an immersive exergame. We present a simple yet effective music analysis algorithm, and a novel exergame that synchronises gameplay with the music's intensity. Our results indicate that our exergame significantly increases enjoyment and motivation compared to music alone. It slightly increases performance, but also increases perceived effort. We did not find any significant differences between gameplay that was synchronised with the music and gameplay that was not. Our results confirm the positive effects of music while exercising, but suggest that gameplay may have a bigger effect on exergame effectiveness, and that more research on the interaction between gameplay and music is needed.

Item: Viewpoint-Dependent Appearance-Manipulation with Multiple Projector-Camera Systems (The Eurographics Association, 2017)
Amano, Toshiyuki; Ushida, Shun; Miyabayashi, Yusuke
This paper proposes a novel projection display technique that realizes viewing-direction-dependent appearance manipulation. The proposed method employs a multiple projector-camera feedback system, in which each projector-camera pair simultaneously manipulates the apparent color or contrast seen from a different viewing direction. Since we assume that mirror reflection is the dominant component, we placed each camera on the side opposite its projector. Experimental results confirmed that our multiple projector-camera system enables viewpoint-dependent appearance manipulation on an anisotropic reflection surface. Interestingly, the application target is not limited to metallic surfaces: we have confirmed that the technique can also be applied to matte paper media exhibiting glossy ink reflection.

Item: Towards Precise, Fast and Comfortable Immersive Polygon Mesh Modelling: Capitalising the Results of Past Research and Analysing the Needs of Professionals (The Eurographics Association, 2017)
Ladwig, Philipp; Herder, Jens; Geiger, Christian
More than three decades of ongoing research in immersive modelling has revealed many advantages of creating objects in virtual environments. Even though there are many benefits, the potential of immersive modelling has only been partly exploited, due to unresolved issues such as ergonomic problems, numerous challenges with user interaction, and the inability to perform exact, fast and progressive refinements. This paper explores past research, shows alternative approaches and proposes novel interaction tools for pending problems.
An immersive modelling application for polygon meshes was created from scratch and tested by professional users of desktop modelling tools, such as Autodesk Maya, in order to assess the efficiency, comfort and speed of the proposed application in direct comparison to professional desktop modelling tools.

Item: Won by a Head: A Platform Comparison of Smart Object Linking in Virtual Environments (The Eurographics Association, 2017)
Ens, Barrett; Anderson, Fraser; Grossman, Tovi; Annett, Michelle; Irani, Pourang; Fitzmaurice, George
Mixed-reality platforms and toolkits are now more accessible than ever, bringing a renewed interest in interactive mixed-reality applications. However, more research is required to determine which available platforms are best suited for different situated tasks. This paper presents a user study that compares headworn and handheld platforms on a smart object linking task in interactive virtual environments. Both platforms have potential benefits for supporting spatial interaction for users situated in the spatial context of the objects being connected. Results show that the immersive, headworn platform has several benefits over the handheld tablet, including better performance and user experience. Findings also show that semantic knowledge about a spatial environment can provide advantages over abstract object identifiers.

Item: Real-time Ambient Fusion of Commodity Tracking Systems for Virtual Reality (The Eurographics Association, 2017)
Fountain, Jake; Smith, Shamus P.
Cross-compatibility of virtual reality devices is limited by the difficulty of aligning and fusing data between systems. In this paper, a plugin for ambiently aligning the reference frames of virtual reality tracking systems is presented. The core contribution is a procedure for ambient calibration.
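A calibration of this kind must recover the rigid transform relating two tracking systems' reference frames from paired position samples. As a purely illustrative sketch (not the plugin's actual algorithm), a least-squares fit of a ground-plane rotation and translation from corresponding 2D samples can be written as:

```python
import math

def align_frames_2d(src, dst):
    """Least-squares rigid fit (rotation + translation) mapping src onto dst.

    src, dst: equal-length lists of (x, z) ground-plane positions sampled at
    the same instants from two tracking systems. Returns (theta, tx, tz).
    """
    n = len(src)
    sx = sum(p[0] for p in src) / n
    sz = sum(p[1] for p in src) / n
    dx = sum(q[0] for q in dst) / n
    dz = sum(q[1] for q in dst) / n
    # Accumulate cross-covariance terms of the centred point sets.
    a = b = 0.0
    for (px, pz), (qx, qz) in zip(src, dst):
        px, pz, qx, qz = px - sx, pz - sz, qx - dx, qz - dz
        a += px * qx + pz * qz   # "cosine" accumulator
        b += px * qz - pz * qx   # "sine" accumulator
    theta = math.atan2(b, a)
    c, s = math.cos(theta), math.sin(theta)
    # Translation maps the rotated source centroid onto the target centroid.
    tx = dx - (c * sx - s * sz)
    tz = dz - (s * sx + c * sz)
    return theta, tx, tz

def apply_transform(theta, tx, tz, p):
    """Apply the fitted rotation and translation to a single (x, z) point."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + tz)
```

In practice, samples from both systems would first be matched by timestamp, and a full 3D alignment would use an SVD-based (Kabsch) solution rather than this 2D closed form.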
The procedure describes ambient behaviors for data gathering, system calibration and fault detection. Data is ambiently collected from in-application, self-directed movements, and calibration is automatically performed between dependent sensor systems. Sensor fusion is then performed by taking the most accurate data for a given body part from among all systems. The procedure was applied to aligning a Kinect v2 with an HTC Vive and an Oculus Rift in a variety of common virtual reality scenarios, and the results were compared to alignment performed with a gold-standard OptiTrack motion capture system. Typical results were 20 cm and 4° of error compared to the ground truth, which compares favorably with the accepted accuracy of the Kinect v2. Data collection for full calibration took on average 13 seconds of in-application, self-directed movement. This work represents an essential step towards plug-and-play sensor fusion for virtual reality technology.

Item: The Effect of User Embodiment in AV Cinematic Experience (The Eurographics Association, 2017)
Chen, Joshua; Lee, Gun A.; Billinghurst, Mark; Lindeman, Robert W.; Bartneck, Christoph
Virtual Reality (VR) is becoming a popular medium for viewing immersive cinematic experiences using 360° panoramic movies and head-mounted displays. There is previous research on user embodiment in real-time rendered VR, but not in relation to cinematic VR based on 360° panoramic video. In this paper we explore the effects of introducing the user's real body into cinematic VR experiences. We conducted a study evaluating how the type of movie and user embodiment affect the sense of presence and user engagement. We found that when participants were able to see their own body in the VR movie, there was a significant increase in the sense of presence, yet user engagement was not significantly affected.
We discuss the implications of these results and how the work can be extended in the future.

Item: Enjoyment, Immersion, and Attentional Focus in a Virtual Reality Exergame with Differing Visual Environments (The Eurographics Association, 2017)
Abernathy, Michael; Shaw, Lindsay A.; Lutteroth, Christof; Buckley, Jude; Corballis, Paul M.; Wünsche, Burkhard C.
Virtual reality exergames provide a compelling distraction from the possible discomfort and negative perception of exercise by immersing users in three-dimensional virtual worlds. Prior studies have looked at the effects of immersion in exergames, from the technologies used, to gameplay elements, to sensory stimulation. This study examines the level of immersion and distraction caused by various visual environments, including urban, rural, and desert landscapes, and their effects on users' performance, enjoyment, and motivation. The environments were found to have little effect on the user. It appears that the core gameplay elements have a far greater effect, being essential for the immersion a user experiences.

Item: Real-time Visual Representations for Mixed Reality Remote Collaboration (The Eurographics Association, 2017)
Gao, Lei; Bai, Huidong; Piumsomboon, Thammathip; Lee, Gun A.; Lindeman, Robert W.; Billinghurst, Mark
We present a prototype Mixed Reality (MR) system with a hybrid interface to support remote collaboration between a local worker and a remote expert in a large-scale workspace. By combining a low-resolution 3D point cloud of the environment surrounding the local worker with a high-resolution real-time view of small focused details, the remote expert can see a virtual copy of the local workspace with independent viewpoint control. Meanwhile, the expert can also check the current actions of the local worker through a real-time feedback view.
We conducted a pilot study to evaluate the usability of our system by comparing the performance of three different interface designs, showing the real-time view as a 2D first-person view, a 2D third-person view, or a 3D point cloud view. We found no difference in average task performance time between the three interfaces, but there was a difference in user preference.

Item: Facial Performance Capture by Embedded Photo Reflective Sensors on A Smart Eyewear (The Eurographics Association, 2017)
Asano, Nao; Masai, Katsutoshi; Sugiura, Yuta; Sugimoto, Maki
Facial performance capture is used in animation production to project a performer's facial expression onto a computer graphics model. Retro-reflective markers and cameras are widely used for such capture: markers must be placed on the performer's face, and the intrinsic and extrinsic parameters of the cameras must be calibrated in advance. Moreover, the measurable space is limited to the calibrated area. In this paper, we propose a system that captures facial performance using smart eyewear with embedded photo reflective sensors and machine learning techniques.

Item: Collaborative View Configurations for Multi-user Interaction with a Wall-size Display (The Eurographics Association, 2017)
Kim, Hyungon; Kim, Yeongmi; Lee, Gun A.; Billinghurst, Mark; Bartneck, Christoph
This paper explores the effects of different collaborative view configurations on face-to-face collaboration using a wall-size display, and the relationship between view configuration and multi-user interaction. Three view configurations (shared view, split screen, and split screen with navigation information) for multi-user collaboration on a wall-size display were introduced and evaluated in a user study. From the experimental results, several insights for designing a virtual environment with a wall-size display were discussed.
The shared view configuration does not disturb collaboration despite control conflicts, and can support effective collaboration. The split-screen configuration supports independent collaboration, although it can divide users' attention. The navigation information can reduce the interaction required for the navigational task, although overall interaction performance may not increase.

Item: Assessing the Relevance of Eye Gaze Patterns During Collision Avoidance in Virtual Reality (The Eurographics Association, 2017)
Varma, Kamala; Guy, Stephen J.; Interrante, Victoria
Increasing presence in virtual reality environments requires meticulous imitation of human behavior in virtual agents. In the specific case of collision avoidance, agents' interaction will feel more natural if they are able to both display and respond to non-verbal cues. This study informs agent behavior by analyzing participants' reactions to non-verbal cues. Its aim is to confirm previous work showing head orientation to be a primary factor in collision avoidance negotiation, and to extend this by investigating the additional contribution of eye gaze direction as a cue. Fifteen participants were directed to walk towards an oncoming agent in a virtual hallway, who would exhibit various combinations of head orientation and eye gaze direction cues. Just before the potential collision, the display turned black and the participant had to move to avoid the agent as if she were still present. Meanwhile, participants' own eye gaze was tracked to identify where their focus was directed and how it related to their response. Results show that the natural tendency was to avoid the agent by moving right.
However, participants showed a greater compulsion to move leftward if the agent cued her own movement to the participant's right, whether through head orientation cues (consistent with previous work) or through eye gaze direction cues (extending previous work). The implications of these findings are discussed.

Item: Reference Framework on vSRT-method Benchmarking for MAR (The Eurographics Association, 2017)
Ichikari, Ryosuke; Kurata, Takeshi; Makita, Koji; Taketomi, Takafumi; Uchiyama, Hideaki; Kondo, Tomotsugu; Mori, Shohei; Shibata, Fumihisa
This paper presents a reference framework for benchmarking vision-based spatial registration and tracking (vSRT) methods for Mixed and Augmented Reality (MAR). The framework provides typical benchmarking processes, benchmark indicators, and trial set elements that are necessary to successfully identify, define, design, select, and apply benchmarking of vSRT methods for MAR. In addition, we summarize findings from our benchmarking activities to share how to organize and conduct on-site and off-site competitions.

Item: Effects of Personalized Avatar Texture Fidelity on Identity Recognition in Virtual Reality (The Eurographics Association, 2017)
Thomas, Jerald; Azmandian, Mahdi; Grunwald, Sonia; Le, Donna; Krum, David; Kang, Sin-Hwa; Rosenberg, Evan Suma
Recent advances in 3D scanning, reconstruction, and animation techniques have made it possible to rapidly create photorealistic avatars based on real people. While it is now possible to create personalized avatars automatically with consumer-level technology, their visual fidelity still falls far short of 3D avatars created with professional cameras and manual artist effort.
To evaluate the importance of investing resources in the creation of high-quality personalized avatars, we conducted an experiment investigating the effects of varying avatars' visual texture fidelity, focusing specifically on identity recognition of specific individuals. We designed two virtual reality experimental scenarios: (1) selecting a specific avatar from a virtual lineup and (2) searching for an avatar in a virtual crowd. Our results showed that visual fidelity had a significant impact on participants' ability, while wearing a head-mounted display, to identify specific avatars in the lineup. We also investigated gender effects for both the participants and the confederates from whom the avatars were created.

Item: A New Approach to Utilize Augmented Reality on Precision Livestock Farming (The Eurographics Association, 2017)
Zhao, Zongyuan; Yang, Wenli; Chinthammit, Winyu; Rawnsley, Richard; Neumeyer, Paul; Cahoon, Stephen
This paper proposes a new method that utilizes AR to help pasture-based dairy farmers identify and locate animals within large herds. Our proposed method uses GPS collars on cows, together with the digital camera and on-board GPS of a mobile device, to locate a selected cow and show its behavioral and other associated key metrics in our mobile application. The augmented cow information shown over the real-scene video stream will help users (farmers) manage their animals with respect to welfare, health, and management interventions. By integrating GPS data with computer vision (CV) and machine learning, our mobile AR application has two major functions: (1) searching for a cow by its unique ID, and (2) displaying information associated with a selected cow visible on screen.
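As a hypothetical illustration of the geometry such an application relies on (not the authors' implementation), deciding whether a GPS-collared cow falls within the mobile camera's horizontal field of view reduces to comparing the GPS bearing to the cow against the device's compass heading; the function names and the default FOV below are assumptions:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def cow_on_screen(dev_lat, dev_lon, heading_deg, cow_lat, cow_lon, fov_deg=60.0):
    """True if the cow's GPS fix lies within the camera's horizontal FOV.

    heading_deg: device compass heading; fov_deg: assumed horizontal FOV.
    """
    # Signed angle from the camera axis to the cow, wrapped to (-180, 180].
    rel = (bearing_deg(dev_lat, dev_lon, cow_lat, cow_lon)
           - heading_deg + 180.0) % 360.0 - 180.0
    return abs(rel) <= fov_deg / 2.0
```

A real implementation would additionally fuse this GPS-only estimate with the computer-vision detections mentioned in the abstract, since consumer GPS and compass readings are individually too noisy to place a label on a specific animal.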
Our proof-of-concept application shows the potential of utilizing AR in precision livestock farming.

Item: Exploring Pupil Dilation in Emotional Virtual Reality Environments (The Eurographics Association, 2017)
Chen, Hao; Dey, Arindam; Billinghurst, Mark; Lindeman, Robert W.
Previous investigations have shown that pupil dilation can be affected by emotive pictures, audio clips, and videos. In this paper, we explore how emotive Virtual Reality (VR) content can also cause pupil dilation. VR has been shown to be able to evoke negative and positive arousal in users immersed in different virtual scenes. In our research, VR scenes were used as emotional triggers. Five emotional VR scenes were designed for our study, and each scene had five emotion segments: happiness, fear, anxiety, sadness, and disgust. While participants experienced the VR scenes, their pupil dilation and the brightness in the headset were captured. We found that both the negative and positive emotion segments produced pupil dilation in the VR environments. We also explored the effect of showing heartbeat cues to users, and whether this could cause differences in pupil dilation. In our study, three different heartbeat cues were presented to users using a combination of three channels: haptic, audio, and visual. The results showed that the haptic-visual cue caused the most significant pupil dilation change from the baseline.

Item: A Mutual Motion Capture System for Face-to-face Collaboration (The Eurographics Association, 2017)
Nakamura, Atsuyuki; Kiyokawa, Kiyoshi; Ratsamee, Photchara; Mashita, Tomohiro; Uranishi, Yuki; Takemura, Haruo
In recent years, motion capture technology for measuring body movement has been used in many fields, and motion capture targeting multiple people is becoming necessary in multi-user virtual reality (VR) and augmented reality (AR) environments.
Ideally, motion capture should require no wearable devices, so that natural motion can be captured easily. Some systems achieve this with an RGB-D camera fixed in the environment, but the user then has to stay in front of the fixed camera. Therefore, this research proposes a motion capture technique for multi-user VR/AR environments using head-mounted displays (HMDs) that neither limits the working range of the user nor requires any wearable devices. In the proposed technique, an RGB-D camera is attached to each HMD and motion capture is carried out mutually, with capture accuracy improved by correcting the depth image. A prototype system was implemented to evaluate the effectiveness of the proposed method, and motion capture accuracy was compared under two conditions, with and without depth image correction, while rotating the RGB-D camera. The results confirmed that the proposed method decreased the number of frames with erroneous motion capture by 49% to 100% compared with the case without depth image correction.

Item: Development of Olfactory Display Using Solenoid Valves Controlled Atomization for High Concentration Scent Emission (The Eurographics Association, 2017)
Ariyakul, Yossiri
This paper reports on an atomization technique controlled by high-speed switching solenoid valves for presenting smells. Even though atomization has been widely used to release smells in commercial aroma diffusers, the intensity of the released odor normally cannot be controlled. In our approach, high-speed ON/OFF switching of the solenoid valves makes it possible to control odor intensity precisely and rapidly, while atomization enables the emission of odors at higher concentrations than those generated by natural evaporation.
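The intensity control described here amounts to pulse-width modulation of the valve: the fraction of each switching period the valve is open sets the emitted concentration. A minimal sketch, assuming a fixed switching period and a minimum resolvable pulse width (both values invented for illustration):

```python
def pulse_plan(intensity, period_ms=50.0, min_pulse_ms=2.0):
    """ON/OFF plan for one switching period of a solenoid valve.

    intensity: target odor intensity in [0, 1], realised as the duty cycle.
    min_pulse_ms: shortest pulse the valve can physically resolve; commands
    shorter than this are clamped to fully closed or fully open.
    Returns (on_ms, off_ms) for the period.
    """
    duty = max(0.0, min(1.0, intensity))
    on_ms = duty * period_ms
    if on_ms < min_pulse_ms:
        on_ms = 0.0                     # too short to open: stay closed
    elif period_ms - on_ms < min_pulse_ms:
        on_ms = period_ms               # too short to close: stay open
    return on_ms, period_ms - on_ms
```

A driver would replay this plan every period, e.g. `pulse_plan(0.5)` keeps the valve open half of each 50 ms cycle.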
The proposed olfactory display was evaluated using an odor sensing system built around a quartz crystal microbalance (QCM) gas sensor. The results confirmed the reproducibility of the proposed olfactory display and its ability to present high-concentration odors with adjustable intensity.

Item: Archives of Thrill: The V-Armchair Experience (The Eurographics Association, 2017)
Passmore, Peter J.; Tennent, Paul; Walker, Brendan; Philpot, Adam; Le, Ha; Markowski, Marianne; Karamanoglu, Mehmet
Technology for older people is typically concerned either with health care or with the accessibility of existing systems. In this paper we take a more entertainment-oriented approach to developing experiences aimed at older users. We describe the design, development and user study of the V-Armchair, a roller coaster experience based on virtual reality and a motion platform. The V-Armchair constitutes a blueprint for the digital archiving of physical ride experiences through the simultaneous capture of 360° video, sound and motion. It gives access to thrill experiences to those who may not be able to go on real thrill rides, such as older riders, and it can be considered a class of technology that could help support 'active aging' as defined by the World Health Organisation. We discuss strategies for capturing and then 'toning down' motion experiences to make them accessible to older users.
We present a study exploring the user experience of the V-Armchair with an older group (median age 63) using a DK2 headset and a younger group (median age 25) using a CV1 headset, via thematic analysis of semi-structured interviews and a modified version of the Game Experience Questionnaire, and we discuss emergent themes such as the role of the presenter, reminiscence, presence and immersion.

Item: Ethical Considerations for the Use of Virtual Reality: An Evaluation of Practices in Academia and Industry (The Eurographics Association, 2017)
Luro, Francisco Lopez; Prada, Diego Navarro; Sundstedt, Veronica
This article offers a set of recommendations considered relevant for designing and executing experiences with Virtual Reality (VR) technology. It presents a brief review of the history and evolution of VR, along with the physiological issues related to its use. Additionally, typical VR practices in both academia and industry are discussed and contrasted. These were further analysed from an ethical perspective, guided by legal and Corporate Social Responsibility (CSR) frameworks, to understand their motivations and goals, and the rights and responsibilities involved in exposing research participants and final consumers to VR. Our results showed that there is a significant disparity between practices in academia and industry, and that in industry specifically there can be breaches of user protection regulations and poor ethical practices. The differences found mainly concern the type of content presented, the overall setup of VR experiences, and the amount of information provided to participants and consumers respectively.
To address this issue, this study highlights relevant ethical aspects and offers practical considerations that aim not only to foster more appropriate practices with VR in public spaces, but also to motivate discussion and reflection that can ease the adoption of this technology in the consumer market.

Item: Dwarf or Giant: The Influence of Interpupillary Distance and Eye Height on Size Perception in Virtual Environments (The Eurographics Association, 2017)
Kim, Jangyoon; Interrante, Victoria
This paper addresses the question: to what extent can deliberate manipulations of interpupillary distance (IPD) and eye height be used in a virtual reality (VR) experience to influence users' sense of their own scale with respect to their surrounding environment, evoking, for example, the illusion of being miniaturized, or of being a giant? In particular, we report the results of an experiment in which we separately study the effect of each of these body scale manipulations on users' perception of object size in a highly detailed, photorealistically rendered immersive virtual environment, using both absolute numeric measures and body-relative actions. Following a real-world training session, in which participants learned to accurately report the metric sizes of individual white cubes (3''-20'') presented one at a time on a table in front of them, we conducted two blocks of VR trials using nine different combinations of IPD and eye height. In the first block of trials, participants reported the perceived metric size of a virtual white cube sitting on a virtual table, at the same distance used in the real-world training, within a realistic virtual living room filled with many objects capable of providing familiar size cues. In the second block of trials, participants used their hands to indicate the perceived size of the cube.
We found that size judgments were moderately correlated (r = 0.4) between the two response methods, and that neither altered eye height (±50 cm) nor reduced (10 mm) IPD had a significant effect on size judgments, but that a wider (150 mm) IPD caused a significant (μ = 38%, p < 0.01) decrease in perceived cube size. These findings add new insights to our understanding of how eye height and IPD manipulations can affect people's perception of scale in highly realistic immersive VR scenarios.
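For reference, the moderate correlation reported between the two response methods is a standard Pearson coefficient over paired size judgments; a minimal stdlib-only helper (illustrative, not taken from the paper) is:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length paired samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Covariance and variances of the centred samples (unnormalised).
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)
```

Python 3.10+ offers the same computation as `statistics.correlation`.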