Appearance Bending: A Perceptual Editing Paradigm for Data-Driven Material Models

Authors: Mylo, Marlon; Giesel, Martin; Zaidi, Qasim; Hullin, Matthias; Klein, Reinhard
Editors: Matthias Hullin, Reinhard Klein, Thomas Schultz, Angela Yao
Date: 2017-09-25
ISBN: 978-3-03868-049-9
DOI: https://doi.org/10.2312/vmv.20171254
Handle: https://diglib.eg.org:443/handle/10.2312/vmv20171254
Pages: 9-16

Abstract: Data-driven representations of material appearance play an important role in a wide range of applications. Unlike with analytical models, however, the intuitive and efficient editing of tabulated reflectance data is still an open problem. In this work, we introduce appearance bending, a set of image-based manipulation operators, such as thicken, inflate, and roughen, that implement recent insights from perceptual studies. In particular, we exploit a link between certain perceived visual properties of a material and specific bands in its spectrum of spatial frequencies, or octaves of a wavelet decomposition. The result is an editing interface that produces plausible results at interactive rates, even for drastic manipulations. We demonstrate the effectiveness of our method on a database of bidirectional texture functions (BTFs) for a variety of material samples.

Keywords: Computing methodologies; Image processing; Image-based rendering
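The abstract's central idea, linking perceived material qualities to specific spatial-frequency bands or wavelet octaves, can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration, not the paper's implementation: it uses PyWavelets (pywt) to rescale one detail octave of a single-channel texture, whereas the paper's operators act on full BTF data, and the function name scale_octave, the octave-indexing convention, and the gain value are all hypothetical.

```python
# Illustrative sketch only: rescale one octave of a 2D wavelet decomposition
# of a grayscale texture, in the spirit of band-targeted edits such as
# "roughen" (gain > 1) or smoothing (gain < 1). Not the paper's method.
import numpy as np
import pywt

def scale_octave(image, octave, gain, wavelet="db2", levels=4):
    """Multiply the detail coefficients of one octave by `gain`."""
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    # coeffs[0] is the coarsest approximation; coeffs[1:] are detail tuples
    # ordered coarse-to-fine, so octave 0 here means the coarsest detail band.
    idx = 1 + octave
    cH, cV, cD = coeffs[idx]
    coeffs[idx] = (gain * cH, gain * cV, gain * cD)
    return pywt.waverec2(coeffs, wavelet)

# Usage: boost a fine detail band of a placeholder texture to exaggerate
# high-frequency structure, loosely analogous to a "roughen" edit.
texture = np.random.rand(256, 256).astype(np.float32)
edited = scale_octave(texture, octave=3, gain=1.5)
```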