Title: Adaptive Filtering of Physical-Virtual Artifacts for Synthetic Animatronics
Authors: Schubert, Ryan; Bruder, Gerd; Welch, Gregory
Editors: Bruder, Gerd; Yoshimoto, Shunsuke; Cobb, Sue
Published: 2018-11-06
ISBN: 978-3-03868-058-1
ISSN: 1727-530X
DOI: https://doi.org/10.2312/egve.20181316
URI: https://diglib.eg.org:443/handle/10.2312/egve20181316
Pages: 65-72

Abstract: Spatial Augmented Reality (SAR), e.g., based on monoscopic projected imagery on physical three-dimensional (3D) surfaces, can be particularly well suited for ad hoc group or multi-user augmented reality experiences, since it does not encumber users with head-worn or carried devices. However, conveying a notion of realistic 3D shapes and movements on SAR surfaces using monoscopic imagery is a difficult challenge. While previous work focused on physical actuation of such surfaces to achieve geometrically dynamic content, we introduce a different concept, which we call "Synthetic Animatronics": conveying geometric movement or deformation purely through manipulation of the imagery shown on a static display surface. We present a model for the distribution of the viewpoint-dependent distortion that occurs when there are discrepancies between the physical display surface and the virtual object being represented, and we describe a real-time implementation of a method for adaptively filtering the imagery based on an approximation of the expected potential error. Finally, we describe an existing physical SAR setup well suited for synthetic animatronics, along with a corresponding Unity-based SAR simulator that allows flexible exploration and validation of the technique and its parameters.

CCS Concepts: Computing methodologies → Rendering; Mixed / augmented reality; Perception; Simulation support systems
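
The abstract does not give the model's formulas, but the core idea can be loosely illustrated: for each point on the physical display surface, estimate the expected viewpoint-dependent distortion as the angular discrepancy, averaged over sampled viewpoints, between the ray to the physical surface point and the ray to the virtual point it represents, then low-pass filter the imagery more strongly where that expected error is high. The Python sketch below is a minimal illustration under those assumptions; all names (expected_distortion, adaptive_filter, surface_pts, virtual_pts, max_sigma) are hypothetical and it is not the authors' implementation. It assumes image is an H x W x 3 float array and error_map is an H x W array aligned to the image.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def expected_distortion(surface_pts, virtual_pts, viewpoints):
        # Hypothetical sketch: mean angular discrepancy (radians), over
        # sampled viewpoints, between each physical surface point and the
        # virtual point it is meant to represent.
        err = np.zeros(len(surface_pts))
        for v in viewpoints:
            a = surface_pts - v  # rays from viewpoint to the display surface
            b = virtual_pts - v  # rays from viewpoint to the virtual geometry
            a = a / np.linalg.norm(a, axis=1, keepdims=True)
            b = b / np.linalg.norm(b, axis=1, keepdims=True)
            err += np.arccos(np.clip(np.sum(a * b, axis=1), -1.0, 1.0))
        return err / len(viewpoints)

    def adaptive_filter(image, error_map, max_sigma=8.0):
        # Blend the sharp rendering with a low-pass-filtered copy, weighting
        # the blur by normalized expected distortion so high-error regions
        # are smoothed while low-error regions stay crisp.
        blurred = gaussian_filter(image, sigma=(max_sigma, max_sigma, 0))
        w = np.clip(error_map / max(error_map.max(), 1e-9), 0.0, 1.0)[..., None]
        return (1.0 - w) * image + w * blurred

The per-pixel blend is one plausible reading of "adaptively filtering the imagery based on an approximation of expected potential error"; the paper itself should be consulted for the actual distortion model and filtering method.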