Practical Radiometric Compensation for Projection Display on Textured Surfaces using a Multidimensional Model

dc.contributor.authorLi, Yuqien_US
dc.contributor.authorMajumder, Aditien_US
dc.contributor.authorGopi, Meenakshisundaramen_US
dc.contributor.authorWang, Chongen_US
dc.contributor.authorZhao, Jieyuen_US
dc.contributor.editorGutierrez, Diego and Sheffer, Allaen_US
dc.date.accessioned2018-04-14T18:24:56Z
dc.date.available2018-04-14T18:24:56Z
dc.date.issued2018
dc.description.abstractRadiometric compensation methods remove the effect of the spatially varying surface reflectance of the underlying texture when projecting on textured surfaces. All prior work samples the surface-reflectance-dependent radiometric transfer function from the projector to the camera at every pixel, which requires the camera to observe tens or hundreds of images projected by the projector. In this paper, we cast the radiometric compensation problem as the sampling and reconstruction of a multi-dimensional radiometric transfer function that models the color transfer function from the projector to an observing camera and the surface reflectance in a unified manner. Such a multi-dimensional representation makes no assumption about the linearity of the projector-to-camera color transfer function and can therefore handle projectors with non-linear color transfer functions (e.g. DLP, LCOS, LED-based or laser-based). We show that a well-curated sampling of this multi-dimensional function, achieved by exploiting the following key properties, is adequate for its accurate representation: (a) the spectral reflectance of most real-world materials is smooth and can be well represented by a lower-dimensional function; (b) the reflectance properties of the underlying texture have strong redundancies; for example, multiple pixels or even regions can have similar surface reflectance; (c) the color transfer function from the projector to the camera has strong input coherence. The proposed sampling allows us to reduce the number of projected images that need to be observed by a camera by up to two orders of magnitude, the minimum being only two. We then present a new multi-dimensional scattered data interpolation technique to reconstruct the radiometric transfer function at high spatial density (i.e. at every pixel) to compute the compensation image. We show that the accuracy of our interpolation technique is higher than that of any existing method.en_US
dc.description.number2
dc.description.sectionheadersImage Magic
dc.description.seriesinformationComputer Graphics Forum
dc.description.volume37
dc.identifier.doi10.1111/cgf.13368
dc.identifier.issn1467-8659
dc.identifier.pages365-375
dc.identifier.urihttps://doi.org/10.1111/cgf.13368
dc.identifier.urihttps://diglib.eg.org:443/handle/10.1111/cgf13368
dc.publisherThe Eurographics Association and John Wiley & Sons Ltd.en_US
dc.subjectComputing methodologies
dc.subjectImage processing
dc.subjectMixed / augmented reality
dc.titlePractical Radiometric Compensation for Projection Display on Textured Surfaces using a Multidimensional Modelen_US