Authors: Sztrajman, Alejandro; Rainer, Gilles; Ritschel, Tobias; Weyrich, Tim
Editors: Benes, Bedrich and Hauser, Helwig
Date available: 2021-10-08
Year: 2021
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.14335
URI: https://diglib.eg.org:443/handle/10.1111/cgf14335

Abstract: Controlled capture of real-world material appearance yields tabulated sets of highly realistic reflectance data. In practice, however, its high memory footprint requires compressing it into a representation that can be used efficiently in rendering while remaining faithful to the original. Previous works in appearance encoding often prioritized one of these requirements at the expense of the other, by either applying high-fidelity array compression strategies not suited for efficient queries during rendering, or by fitting a compact analytic model that lacks expressiveness. We present a compact neural network-based representation of BRDF data that combines high-accuracy reconstruction with efficient practical rendering via built-in interpolation of reflectance. We encode BRDFs as lightweight networks, and propose a training scheme with adaptive angular sampling, critical for the accurate reconstruction of specular highlights. Additionally, we propose a novel approach to make our representation amenable to importance sampling: rather than inverting the trained networks, we learn to encode them in a more compact embedding that can be mapped to parameters of an analytic BRDF for which importance sampling is known. We evaluate encoding results on isotropic and anisotropic BRDFs from multiple real-world datasets, and importance sampling performance for isotropic BRDFs mapped to two different analytic models.

Title: Neural BRDF Representation and Importance Sampling
DOI: 10.1111/cgf.14335
Pages: 332-346
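
The abstract describes encoding each BRDF as a lightweight network queried directly at render time. The sketch below is illustrative only and is not the authors' implementation: the layer widths, the 6-dimensional angular input, the exponential output mapping, and the PyTorch framing are all assumptions not taken from this record, and the training loop fits a synthetic constant target rather than measured data with the paper's adaptive angular sampling.

```python
# Minimal sketch (assumptions throughout): a small per-material MLP mapping an
# angular parameterization of (incoming, outgoing) directions to RGB reflectance.
import torch
import torch.nn as nn

class TinyBRDFNet(nn.Module):
    def __init__(self, in_dim: int = 6, hidden: int = 21):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),  # RGB output
        )

    def forward(self, dirs: torch.Tensor) -> torch.Tensor:
        # Exponential output keeps predicted reflectance positive and pairs with
        # a log-transformed training target to tame the dynamic range of highlights.
        return torch.exp(self.net(dirs))

# Toy fit against a constant (Lambertian-like) target; real measured BRDF data
# and the paper's adaptive angular sampling are omitted here.
model = TinyBRDFNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):
    dirs = torch.rand(512, 6)                      # stand-in angular coordinates
    target = torch.full((512, 3), 0.5 / torch.pi)  # constant diffuse albedo
    loss = nn.functional.mse_loss(model(dirs), target)
    opt.zero_grad(); loss.backward(); opt.step()
```

Because the network is small, it can be evaluated per shading query in a renderer, which is the efficiency property the abstract emphasizes.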