ICAT-EGVE 2019 - Posters and Demos
ISBN 978-3-03868-097-0
https://diglib.eg.org:443/handle/10.2312/2632805

VR Sickness Reduction in Stereoscopic Video Streaming System 'TwinCam' for a Remote Experience
https://diglib.eg.org:443/handle/10.2312/egve20191301
Yagi, Ryunosuke; Fujie, Toi; Amemiya, Tomohiro; Kitazaki, Michiteru; Yem, Vibol; Ikei, Yasushi
Editors: Kakehi, Yasuaki; Hiyama, Atsushi
In this paper, we discuss a method for presenting remote stereoscopic vision with reduced VR sickness. We describe our omnidirectional stereoscopic video streaming system (TwinCam) and the merits of its design. One of its important features is VR sickness reduction, which we evaluated with the Simulator Sickness Questionnaire, comparing TwinCam against a conventional parallel-camera design. The results revealed that TwinCam significantly suppressed VR sickness relative to the conventional parallel cameras, to the same level as a fixed monocular camera.
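The Simulator Sickness Questionnaire comparison mentioned above is conventionally scored with Kennedy et al.'s standard subscale weights. A minimal sketch of that scoring (assuming the raw cluster sums of the 0-3 item ratings have already been computed; this is not the authors' code):

```python
def ssq_scores(nausea_raw, oculomotor_raw, disorientation_raw):
    """Compute weighted SSQ subscale and total scores from raw sums.

    Inputs are the unweighted sums of the 0-3 item ratings belonging to
    each symptom cluster (Nausea, Oculomotor, Disorientation).
    """
    nausea = nausea_raw * 9.54
    oculomotor = oculomotor_raw * 7.58
    disorientation = disorientation_raw * 13.92
    total = (nausea_raw + oculomotor_raw + disorientation_raw) * 3.74
    return {"N": nausea, "O": oculomotor, "D": disorientation, "TS": total}

# Example: raw cluster sums of 3, 2, and 1 from one participant
print(ssq_scores(3, 2, 1))
```

Conditions (here, the two camera designs) are then compared on these weighted scores.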
Published: 2019-01-01

Expanding the Freedom of Eye-gaze Input Interface using Round-Trip Eye Movement under HMD Environment
https://diglib.eg.org:443/handle/10.2312/egve20191298
Matsuno, Shogo; Sato, Hironobu; Abe, Kiyohiko; Ohyama, Minoru
Editors: Kakehi, Yasuaki; Hiyama, Atsushi
In this paper, we propose a gaze-movement detection algorithm necessary for implementing a gaze-movement input interface using an HMD's built-in eye tracking system. Most input methods used in current virtual and augmented reality are hand-held devices, hand gestures, head tracking, or voice input, despite the head-mounted form factor. To use eye movement as a hands-free input modality, we therefore consider a gaze input interface that does not depend on the measurement accuracy of the eye tracking device. The proposed method assumes eye-movement input, as distinct from the gaze-position input usually implemented with an eye tracking system. Specifically, by using round-trip eye movement in an oblique direction as an input channel, it aims to realize an input method that neither blocks the view with a screen display nor hinders the acquisition of other gaze meta-information. We implemented the proposed algorithm in an HMD and experimentally evaluated the detection accuracy of the round-trip eye movement. Averaging the results of five subjects yielded 90% detection accuracy, showing that the method is accurate enough to develop an input interface based on eye movement.
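The abstract does not give the detection algorithm itself, but the idea of an out-and-back gaze excursion along an oblique axis can be sketched roughly as follows. All thresholds, the function name, and the geometric test are assumptions for illustration, not the authors' published method:

```python
import math

def detect_round_trip(gaze_samples, direction_deg=45.0,
                      angle_tol_deg=20.0, min_amplitude=0.15,
                      max_samples=60):
    """Detect one out-and-back (round-trip) gaze excursion.

    gaze_samples: sequence of (x, y) gaze positions, normalized and
    relative to the starting fixation. Returns True when the gaze moves
    at least `min_amplitude` along the oblique `direction_deg` axis and
    then returns near the origin within `max_samples` samples.
    """
    ux = math.cos(math.radians(direction_deg))
    uy = math.sin(math.radians(direction_deg))
    peak = 0.0
    reached = False
    for x, y in gaze_samples[:max_samples]:
        dist = math.hypot(x, y)
        if dist > 1e-6:
            # angular deviation of this sample from the oblique axis
            cos_dev = (x * ux + y * uy) / dist
            cos_dev = max(-1.0, min(1.0, cos_dev))
            if math.degrees(math.acos(cos_dev)) > angle_tol_deg:
                continue  # off-axis sample; ignore it
        along = x * ux + y * uy  # signed distance along the axis
        peak = max(peak, along)
        if peak >= min_amplitude:
            reached = True
        # round trip complete: excursion made, gaze back near origin
        if reached and dist < min_amplitude * 0.3:
            return True
    return False
```

A real implementation on HMD eye-tracking data would additionally need blink rejection and per-user calibration of the amplitude threshold.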
Published: 2019-01-01

Narrowcasting for Stereoscopic Photospherical Cinemagraphy
https://diglib.eg.org:443/handle/10.2312/egve20191303
Cohen, Michael; Iida, Takato; Sato, Rintaro
Editors: Kakehi, Yasuaki; Hiyama, Atsushi
We have developed an application that blurs the distinction between static and dynamic imagery in a stereoscopic omnidirectional browser. A "cinemagraph" is a living picture, interpolating between a still photo and a video. A stereo omnidirectional camera can capture stereographic contents; combining such functionality yields a photospherical cinemagraph. Runtime control of activation fields allows selective alternation between frozen and animated scene elements. Narrowcasting, a user interface idiom for selective activation, is used to alternate between static and moving imagery. Presentation includes stereoscopic display (binocular channels) and spatial sound.
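Narrowcasting selection is conventionally expressed with mute/solo-style predicates, as in audio mixing: an element is included unless muted, and when anything is soloed, only soloed elements are included. A sketch of that inclusion rule applied to scene elements (the attribute and element names are hypothetical, not taken from the application):

```python
def is_animated(element, elements):
    """Narrowcasting-style inclusion predicate: an element animates
    unless it is muted, and, when any element is soloed (selected),
    only soloed elements animate."""
    any_solo = any(e["solo"] for e in elements)
    return (not element["mute"]) and (element["solo"] or not any_solo)

scene = [
    {"name": "waterfall", "mute": False, "solo": False},
    {"name": "flag",      "mute": True,  "solo": False},
    {"name": "clouds",    "mute": False, "solo": False},
]
# With no solo flags set, every unmuted element animates.
print([e["name"] for e in scene if is_animated(e, scene)])  # ['waterfall', 'clouds']
```

Setting `solo` on one element would freeze all the others, which matches the selective frozen/animated alternation the abstract describes.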
Published: 2019-01-01

Augmented Dodgeball AR Viewer for Spectators
https://diglib.eg.org:443/handle/10.2312/egve20191300
Azuma, Shota; Hertzog, Clara; Sakurai, Sho; Hirota, Koichi; Nojima, Takuya
Editors: Kakehi, Yasuaki; Hiyama, Atsushi
In recent years, many systems and methods have been developed to provide spectators with information about sports games such as baseball, basketball, and soccer. Among them, Augmented Sport is an emerging area that aims to merge video game concepts into physical sports. This project focuses on merging game elements such as Health Points (HP), Attack Power (AP), and Defense Power (DP) into dodgeball to improve enjoyment and variety for players. During Augmented Dodgeball games, spectators can visualize additional parameters such as the HP, AP, and DP of each player via a Mixed Reality device. These data are superimposed onto each physical player by virtue of the AR markers they wear. To avoid marker occlusion issues, fixed cameras are also used to acquire the players' physical information and share it via a database. Studies have been conducted to find the best display design, methods, and limits of the system.
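The occlusion fallback described above, with marker tracking preferred and fixed-camera data shared through a database as backup, might be organized like this sketch (the in-memory "database", field names, and function are all illustrative assumptions, not the project's actual code):

```python
# Hypothetical shared store standing in for the networked database
# that the fixed cameras populate with per-player data.
camera_db = {
    "player_1": {"pos": (2.0, 3.5), "HP": 80, "AP": 12, "DP": 7},
    "player_2": {"pos": (5.1, 1.2), "HP": 45, "AP": 15, "DP": 4},
}

def player_overlay(player_id, marker_detection):
    """Return the stats and anchor position for a player's AR overlay.

    Prefer the pose from the spectator device's own marker detection;
    when the marker is occluded (detection is None), fall back to the
    position reported by the fixed cameras via the shared database.
    """
    record = camera_db[player_id]
    pos = marker_detection if marker_detection is not None else record["pos"]
    return {"anchor": pos, "HP": record["HP"],
            "AP": record["AP"], "DP": record["DP"]}

# Marker visible: anchor the overlay at the detected marker pose.
print(player_overlay("player_1", (2.1, 3.4)))
# Marker occluded: anchor falls back to the fixed-camera estimate.
print(player_overlay("player_2", None))
```

The same lookup also serves the game-state parameters (HP, AP, DP), so the overlay survives even when the spectator's view of the marker is blocked.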
Published: 2019-01-01