Search Results
Now showing 1 - 2 of 2
Item: Downsampling and Storage of Pre-Computed Gradients for Volume Rendering (The Eurographics Association, 2017)
Díaz-García, Jesús; Brunet, Pere; Navazo, Isabel; Vázquez, Pere-Pau; Fco. Javier Melero and Nuria Pelechano
The way in which gradients are computed in volume datasets influences both the quality of the shading and the performance of rendering algorithms. In particular, the visualization of coarse datasets in multi-resolution representations suffers when gradients are evaluated on the fly in the shader code by accessing neighbouring positions. Not only is this a costly computation that compromises the performance of the visualization process, but the resulting gradients are also of low quality and, because of the altered topology of downsampled datasets, do not resemble the originals as closely as desired. An obvious solution is to pre-compute the gradients and store them. Unfortunately, this raises two problems: first, the downsampling process itself is prone to generating artifacts; second, the limited bit size of the storage causes the gradients to lose precision. To solve these issues, we first propose a downsampling filter for pre-computed gradients that produces improved gradients, closer to the originals, so that the aforementioned artifacts disappear. Second, to address the storage problem, we present a method for the efficient storage of gradient directions that minimizes the minimum angle achieved among all representable vectors within 3 bytes. We also provide several examples that show the advantages of the proposed approaches.

Item: Improved Intuitive Appearance Editing based on Soft PCA (The Eurographics Association, 2017)
Malpica, Sandra; Barrio, Miguel; Gutierrez, Diego; Serrano, Ana; Masia, Belen; Fco. Javier Melero and Nuria Pelechano
During the last few years, many different techniques for measuring material appearance have arisen. These advances have allowed the creation of large public datasets, and new methods have been proposed for editing the BRDFs of captured appearance. However, these methods lack intuitiveness and are hard for novice users to use. To overcome these limitations, Serrano et al. [SGM 16] recently proposed an intuitive space for editing captured appearance. They represent the BRDF as a combination of principal components (PCA) to reduce dimensionality, and then map these components to perceptual attributes. This PCA representation is biased towards specular materials and fails to represent very diffuse BRDFs, which produces unpleasant artifacts when editing. In this paper, we build on their work and propose to use two separate PCA bases to represent specular and diffuse BRDFs, mapping each basis to the perceptual attributes. This allows us to avoid artifacts when editing towards diffuse BRDFs. We then propose a new method for effectively navigating between both bases while editing, based on a new measure of the specularity of measured materials. Finally, we integrate the proposed method into an intuitive BRDF editing framework and show how some of the limitations of the previous model are overcome with our representation. Moreover, our new measure of specularity can be used on any measured BRDF, as it is not limited to MERL BRDFs [MPBM03].
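As a rough illustration of the kind of 3-byte gradient-direction storage discussed in the first item, the sketch below packs a unit gradient into two 12-bit octahedral coordinates (24 bits total). This is a generic encoding written for illustration only, assuming numpy; it is not the authors' scheme, which is instead designed around the angular criterion stated in the abstract.

```python
import numpy as np

def _sign_not_zero(x):
    """Return +1 for x >= 0 and -1 for x < 0 (never 0)."""
    return np.where(x >= 0.0, 1.0, -1.0)

def encode_gradient_3bytes(g):
    """Pack a unit gradient direction into 3 bytes as two 12-bit
    octahedral coordinates (illustrative encoding, not the paper's)."""
    g = np.asarray(g, dtype=np.float64)
    g = g / np.linalg.norm(g)
    p = g[:2] / (abs(g[0]) + abs(g[1]) + abs(g[2]))      # project onto the octahedron
    if g[2] < 0.0:                                       # fold the lower hemisphere
        p = (1.0 - np.abs(p[::-1])) * _sign_not_zero(p)
    q = np.round((p * 0.5 + 0.5) * 4095.0).astype(np.uint32)  # [-1,1] -> 12-bit ints
    packed = int((q[0] << 12) | q[1])                    # 24 bits = 3 bytes
    return packed.to_bytes(3, "big")

def decode_gradient_3bytes(b):
    """Recover an approximate unit gradient from the 3-byte encoding above."""
    packed = int.from_bytes(b, "big")
    u = ((packed >> 12) & 0xFFF) / 4095.0 * 2.0 - 1.0
    v = (packed & 0xFFF) / 4095.0 * 2.0 - 1.0
    z = 1.0 - abs(u) - abs(v)
    if z < 0.0:                                          # unfold the lower hemisphere
        u, v = (1.0 - abs(v)) * _sign_not_zero(u), (1.0 - abs(u)) * _sign_not_zero(v)
    g = np.array([u, v, z])
    return g / np.linalg.norm(g)
```

A round trip such as `decode_gradient_3bytes(encode_gradient_3bytes([0.3, -0.5, 0.8]))` returns the input direction up to quantization error; how representable directions are distributed, and hence the worst-case angular error, is exactly the degree of freedom the paper's storage method optimizes.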
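For the second item, the sketch below shows the general idea of two separate PCA bases (specular and diffuse) whose reconstructions are softly blended by a scalar specularity value. The function names, the sigmoid gating, and the parameters k and t are hypothetical and chosen only for illustration; the paper's actual specularity measure and navigation method are not described in the abstract.

```python
import numpy as np

def fit_pca_basis(brdfs, n_components):
    """Fit a simple PCA basis (mean + principal directions) on a
    (num_brdfs, num_samples) matrix of vectorized BRDF samples."""
    mean = brdfs.mean(axis=0)
    centered = brdfs - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)  # rows of vt are directions
    return mean, vt[:n_components]

def project(brdf, mean, basis):
    """Coefficients of a single BRDF vector in the given PCA basis."""
    return basis @ (brdf - mean)

def reconstruct(coeffs, mean, basis):
    """Rebuild a BRDF vector from its PCA coefficients."""
    return mean + coeffs @ basis

def soft_blend(brdf, spec_model, diff_model, specularity, k=10.0, t=0.5):
    """Blend the specular-basis and diffuse-basis reconstructions with a
    sigmoid weight driven by a scalar specularity value in [0, 1].
    spec_model and diff_model are (mean, basis) pairs from fit_pca_basis."""
    w = 1.0 / (1.0 + np.exp(-k * (specularity - t)))     # soft gating weight
    rec_s = reconstruct(project(brdf, *spec_model), *spec_model)
    rec_d = reconstruct(project(brdf, *diff_model), *diff_model)
    return w * rec_s + (1.0 - w) * rec_d
```

The soft weight is what keeps edits continuous when a material moves between the two bases, which is the behaviour the abstract attributes to its navigation method; a hard switch at a threshold would reintroduce visible discontinuities.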