Perception of Highlight Disparity at a Distance in Consumer Head-Mounted Displays

dc.contributor.author    Toth, Robert    en_US
dc.contributor.author    Hasselgren, Jon    en_US
dc.contributor.author    Akenine-Möller, Tomas    en_US
dc.contributor.editor    Petrik Clarberg and Elmar Eisemann    en_US
dc.date.accessioned    2016-01-19T10:32:48Z
dc.date.available    2016-01-19T10:32:48Z
dc.date.issued    2015    en_US
dc.description.abstract    Stereo rendering for 3D displays and for virtual reality headsets provides several visual cues, including convergence angle and highlight disparity. The human visual system interprets these cues to estimate surface properties of the displayed environment. Naïve stereo rendering effectively doubles the computational burden of image synthesis, so it is desirable to reuse as many computations as possible between the stereo image pair. Computing a single radiance value for a point on a surface, to be used when synthesizing both the left and right images, results in the loss of highlight disparity. Our hypothesis is that the absence of highlight disparity does not impair perception of surface properties at larger distances. This is because the angle subtended by the two viewpoints, as seen from the surface, decreases as the distance to the surface increases. The effect is amplified by the limited resolution of consumer head-mounted displays. We verify this hypothesis with a user study and provide rendering guidelines that leverage our findings.    en_US
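As a rough sketch of the abstract's geometric argument (the symbols b and d and the numbers below are illustrative assumptions, not values taken from the paper), the angle subtended at a shaded surface point by the two eye positions can be written as:

% b: assumed interpupillary distance; d: distance from the eyes to the surface point
\theta = 2\arctan\!\left(\frac{b}{2d}\right) \approx \frac{b}{d} \quad \text{for } d \gg b
% Example (assumed values): b = 65 mm and d = 5 m give \theta \approx 0.013 rad (about 0.74 degrees),
% and \theta keeps shrinking as d grows, which is the decreasing angular difference the abstract refers to.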
dc.description.sectionheaders    Rendering and Display    en_US
dc.description.seriesinformation    High-Performance Graphics    en_US
dc.identifier.doi    10.1145/2790060.2790062    en_US
dc.identifier.isbn    978-1-4503-3707-6    en_US
dc.identifier.pages    61-66    en_US
dc.identifier.uri    https://doi.org/10.1145/2790060.2790062    en_US
dc.publisher    ACM Siggraph    en_US
dc.subject    computer graphics    en_US
dc.subject    virtual reality    en_US
dc.subject    stereoscopic rendering    en_US
dc.subject    psychophysical user study    en_US
dc.title    Perception of Highlight Disparity at a Distance in Consumer Head-Mounted Displays    en_US