Search Results
Now showing 1 - 10 of 10
Item Applications of High Precision Imaging Polarimetry (The Eurographics Association, 2008)
Authors: Neumann, Laszlo; Hegedus, Ramon; Horváth, Gábor; Garcia, Rafael
Editors: Douglas W. Cunningham; Victoria Interrante; Paul Brown; Jon McCormack
We propose the use of imaging polarimetry, a relatively young technique that determines the polarized components of the light coming from extended objects or scenes, for general photography. In this paper, high-resolution and accurate methods are introduced to determine the two linearly polarized components (Q, U) of light. The CIE Luv color space is used in this work to visualize the triplet (I, Q, U) of polarization image components. The structure of this color space is also well suited to representing other attributes of linearly polarized light, such as the polarized intensity, the degree of polarization, and the angle of polarization. The accurately measured polarization components can also be used efficiently for image enhancement. In this direction, a new polarization-based de-reflection method is proposed. This method is an optimal pixel-wise extension of the polarization filtering widely used in photography. Our method is also capable of amplifying specular effects. Another application is de-hazing, which removes the linearly polarized component of the haze present in natural scenes and results in a sharp, color-corrected image. Furthermore, different combinations of visible and infrared polarization channels open up further possibilities for de-hazing and for creating artistic images.

Item An Analysis of Quasi-Monte Carlo Integration Applied to the Transillumination Radiosity Method (Blackwell Publishers Ltd and the Eurographics Association, 1997)
Authors: Szirmay-Kalos, Laszlo; Foris, Tibor; Neumann, Laszlo; Csebfalvi, Balazs
This paper presents an enhanced transillumination radiosity method that can provide accurate solutions at relatively low computational cost.
The proposed algorithm breaks the double integral of the gathered power down into an area integral, which is computed analytically, and a directional integral, which is evaluated by quasi-Monte Carlo techniques. Since the analytical integration results in a continuous function of finite variation, the quasi-Monte Carlo integration that follows it is efficient, and its error can be bounded by the Koksma-Hlawka inequality. The paper also analyses the requirements for convergence, presents theoretical error bounds, and proposes error reduction techniques. The theoretical bounds are compared with simulation results.

Item Incident Light Metering in Computer Graphics (Blackwell Publishers Ltd and the Eurographics Association, 1998)
Authors: Neumann, Laszlo; Matkovic, Kresimir; Neumann, Attila; Purgathofer, Werner
Every rendering process consists of two steps. The first is the computation of luminance values by methods such as ray tracing or radiosity; the second is the mapping of the computed values to values appropriate for display. In recent years, as alternatives to simple linear scaling, which maps the average value to the medium luminance, new mapping methods have been introduced. These methods are based on photographic analogies and on models of human vision. All existing methods follow, implicitly or explicitly, the reflected light metering principle. The method introduced in this paper is the first to follow the incident light metering used in professional photography and in the movie industry. Irradiances are measured using a set of diffusors that are placed automatically in the scene, and a linear scale factor based on these measurements is used to map the computed radiances to the display device. The diffusors act as half-space integrators: they collect the light energy from all directions of the half space. The light comes from the primary light sources or results from various interreflections.
The newly introduced method reproduces original colors faithfully, even for scenes with very low or very high average reflectivity.

Item Perception Based Color Image Difference (Blackwell Publishers Ltd and the Eurographics Association, 1998)
Authors: Neumann, Laszlo; Matkovic, Kresimir; Purgathofer, Werner
A good image metric is often needed in digital image synthesis. It can be used to check the convergence behavior of progressive methods, to compare images rendered with various rendering methods, and so on. Since images are rendered to be observed by humans, an image metric should correspond to human perception. We propose a new algorithm that operates in the original image space; there is no need for Fourier or wavelet transforms. Furthermore, the new metric depends on the viewing distance and uses the contrast sensitivity function. The main idea is to place a number of rectangles of various sizes in the images and to compute the CIE LUV average color difference between corresponding rectangles. The errors are then weighted according to the rectangle size and the contrast sensitivity function.

Item Reflectance Models with Fast Importance Sampling (Blackwell Publishers Ltd and the Eurographics Association, 1999)
Authors: Neumann, Laszlo; Neumann, Attila; Szirmay-Kalos, Laszlo
We introduce a physically plausible mathematical model for a large class of BRDFs. The new model is as simple as the well-known Phong model but eliminates its disadvantages. It gives a good visual approximation for many practical materials: coated metals, plastics, ceramics, retro-reflective paints, anisotropic materials, etc. Because of its illustrative properties it can be used easily in most commercial software, and because of its low computational cost it is practical for virtual reality. The model is based on a special basic BRDF definition that meets the requirements of reciprocity and energy conservation.
A class of BRDFs is then constructed from this basic BRDF with different weight functions. To define such a weight function, the user specifies the profile of the highlights, from which the weight function is obtained by differentiation. It is also demonstrated how importance sampling can be used with the new BRDFs.

Item Computational Aesthetics 2005: Eurographics Workshop on Computational Aesthetics in Graphics, Visualization and Imaging, Girona, Spain, 18-20 May 2005 (The Eurographics Association and Blackwell Publishing Ltd., 2006)
Editors: Neumann, Laszlo; Sbert, Mateu; Gooch, Bruce; Purgathofer, Werner

Item An Efficient Perception-based Adaptive Color to Gray Transformation (The Eurographics Association, 2007)
Authors: Neumann, Laszlo; Cadik, Martin; Nemcsics, Antal
Editors: Douglas W. Cunningham; Gary Meyer; Laszlo Neumann
The visualization of color images in gray scale is of high practical and theoretical importance. Neither the existing local, gradient-based methods nor the fast global techniques give satisfying results. We present a new color-to-grayscale transformation based on the experimental background of the Coloroid system observations. We regard the color and luminance contrasts as a gradient field and introduce a new, simple, yet very efficient method to resolve the inconsistency of this field. Having a consistent gradient field, we obtain the resulting image via fast direct integration. The complexity of the method is linear in the number of pixels, making it fast and suitable for high-resolution images.

Item Compact Metallic Reflectance Models (Blackwell Publishers Ltd and the Eurographics Association, 1999)
Authors: Neumann, Laszlo; Neumann, Attila; Szirmay-Kalos, Laszlo
The paper presents simple, physically plausible, but not physically based reflectance models for metals and other specular materials. So far there has been no metallic BRDF model that is easy to compute, suitable for fast importance sampling, and physically plausible.
This gap is filled by appropriate modifications of the Phong, Blinn, and Ward models. The Phong and Blinn models are known not to have metallic characteristics. The paper also shows that the Cook-Torrance and Ward models are not physically plausible because of their behavior at grazing angles. We compare the previous and the newly proposed models, and the generated images demonstrate how a metallic impression can be achieved with the new models.

Item Gradient Estimation in Volume Data using 4D Linear Regression (Blackwell Publishers Ltd and the Eurographics Association, 2000)
Authors: Neumann, Laszlo; Csebfalvi, Balazs; Konig, Andreas; Groller, Eduard
In this paper a new gradient estimation method is presented that is based on linear regression. Previous contextual shading techniques try to fit an approximating function to a set of surface points in the neighborhood of a given voxel; a system of linear equations then has to be solved using computationally expensive Gaussian elimination. In contrast, our method approximates the density function itself in a local neighborhood with a 3D regression hyperplane. This approach also leads to a system of linear equations, but we show that it can be solved with an efficient convolution. At each voxel location our method provides the normal vector and the translation of the regression hyperplane, which are taken as the gradient and a filtered density value, respectively. The technique can therefore be used for surface smoothing and gradient estimation at the same time.

Item Radiosity with Well Distributed Ray Sets (Blackwell Publishers Ltd and the Eurographics Association, 1997)
Authors: Neumann, Laszlo; Neumann, Attila; Bekaert, Philippe
In this paper we present a new radiosity algorithm based on the notion of a well distributed ray set (WDRS).
A WDRS is a set of rays, connecting mutually visible points and patches, that forms an approximate representation of the radiosity operator and the radiosity distribution. We propose an algorithm that constructs an optimal WDRS for a given accuracy and mesh. The construction is based on discrete importance sampling, as in previously proposed stochastic radiosity algorithms, and on quasi-Monte Carlo sampling. Quasi-Monte Carlo sampling leads to faster convergence, and because the sampling is deterministic, the well distributed ray set can be represented very efficiently in computer memory. Like previously proposed stochastic radiosity algorithms, the new algorithm is well suited for computing the radiance distribution in very complex diffuse scenes, where it is not feasible to explicitly compute and store form factors as in classical radiosity algorithms. Experiments show that the new algorithm is often more efficient than previously proposed Monte Carlo radiosity algorithms by half an order of magnitude or more.
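The (I, Q, U) imagery described in the imaging-polarimetry abstract above can be illustrated with the standard three-angle polarizer relations. This is a textbook estimator, not the paper's high-precision acquisition method, and all function names are ours:

```python
import numpy as np

def stokes_from_polarizer(i0, i45, i90):
    """Linear Stokes components (I, Q, U) from intensity images taken
    through a linear polarizer oriented at 0, 45 and 90 degrees."""
    I = i0 + i90
    Q = i0 - i90
    U = 2.0 * i45 - I
    return I, Q, U

def degree_and_angle(I, Q, U, eps=1e-12):
    """Per-pixel degree of linear polarization and angle of polarization."""
    dolp = np.sqrt(Q**2 + U**2) / np.maximum(I, eps)
    aop = 0.5 * np.arctan2(U, Q)   # radians
    return dolp, aop
```

For fully horizontally polarized light the estimator yields Q = I and a zero polarization angle; for unpolarized light both Q and U vanish and the degree of polarization is zero.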
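The quasi-Monte Carlo machinery underlying the transillumination-radiosity analysis can be sketched with a one-dimensional low-discrepancy sequence. The van der Corput sequence below is a standard construction, used here only to show how deterministic samples integrate a smooth, finite-variation integrand:

```python
import numpy as np

def van_der_corput(n, base=2):
    """First n points of the van der Corput low-discrepancy sequence:
    the radical inverse of 0, 1, 2, ... in the given base."""
    pts = np.empty(n)
    for i in range(n):
        x, f, k = 0.0, 1.0 / base, i
        while k > 0:
            k, r = divmod(k, base)
            x += r * f
            f /= base
        pts[i] = x
    return pts

def qmc_integrate(f, n):
    """Quasi-Monte Carlo estimate of the integral of f over [0, 1]."""
    return float(f(van_der_corput(n)).mean())
```

By the Koksma-Hlawka inequality the error is at most V(f) times the star discrepancy of the point set; for the first 2^m points the sequence is a permutation of a uniform grid, so smooth integrands are handled far better than by random sampling.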
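A minimal sketch of the incident-light-metering idea from the abstract above, assuming (our assumption, not stated in the abstract) that the linear scale factor is chosen so that a perfect diffuse white surface under the average measured irradiance maps to display white:

```python
import math

def incident_light_scale(irradiances, display_white=100.0):
    """Linear scale factor for tone mapping based on incident light metering.

    irradiances   -- irradiance values measured by diffusors placed in the
                     scene, each acting as a half-space integrator
    display_white -- display luminance assigned to an ideal diffuse white
                     reflector under the average measured irradiance
    """
    e_avg = sum(irradiances) / len(irradiances)
    # Radiance of an ideal Lambertian white surface under irradiance E: L = E / pi
    l_white = e_avg / math.pi
    return display_white / l_white

def map_radiance(radiance, scale, display_white=100.0):
    """Map a computed scene radiance to a display value (clipped to white)."""
    return min(radiance * scale, display_white)
```

Because the scale depends only on incident light, the mapping is unaffected by the average reflectivity of the scene, which is the point the abstract makes about very dark or very bright scenes.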
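The rectangle-based image difference from the perception-metric abstract can be sketched as follows. The CSF weight below is an illustrative stand-in for a real contrast sensitivity curve, and the images are assumed to be already converted to a roughly uniform color space such as CIE LUV:

```python
import numpy as np

def csf_weight(size):
    """Toy contrast-sensitivity weight: mid-sized rectangles matter most.
    (An illustrative stand-in for a real, viewing-distance-dependent CSF.)"""
    return size * np.exp(-0.05 * size)

def rect_image_difference(a, b, n_rects=200, min_size=2, max_size=16, seed=0):
    """Average the color difference of randomly placed corresponding
    rectangles, weighted by rectangle size through the CSF-like function."""
    rng = np.random.default_rng(seed)
    h, w = a.shape[:2]
    total, weight_sum = 0.0, 0.0
    for _ in range(n_rects):
        s = int(rng.integers(min_size, max_size + 1))
        y = int(rng.integers(0, h - s + 1))
        x = int(rng.integers(0, w - s + 1))
        da = a[y:y+s, x:x+s].mean(axis=(0, 1))   # mean color in rectangle of a
        db = b[y:y+s, x:x+s].mean(axis=(0, 1))   # same rectangle in b
        wgt = float(csf_weight(s))
        total += wgt * float(np.linalg.norm(da - db))
        weight_sum += wgt
    return total / weight_sum
```

Identical images give a difference of exactly zero, and a constant offset of the whole image shows up as the Euclidean color distance of that offset.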
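Importance sampling of a Phong-style cosine-power lobe, of the kind the importance-sampling abstract refers to, can be sketched with the standard inversion method. This samples the classical lobe, not the paper's new model:

```python
import numpy as np

def sample_power_cosine(n_exp, n_samples, rng):
    """Sample directions around the lobe axis (+z) with density
    p(omega) = (n+1)/(2*pi) * cos(theta)**n on the upper hemisphere,
    using inversion of the marginal CDF of theta."""
    u1 = rng.random(n_samples)
    u2 = rng.random(n_samples)
    cos_t = u1 ** (1.0 / (n_exp + 1.0))
    sin_t = np.sqrt(np.maximum(0.0, 1.0 - cos_t**2))
    phi = 2.0 * np.pi * u2
    return np.stack([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t], axis=1)
```

Under this density the expected value of cos(theta) is (n+1)/(n+2), which gives a simple numerical sanity check of the sampler.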
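The "consistent gradient field, then fast direct integration" step of the color-to-gray abstract can be shown in miniature. The sketch below skips the paper's actual contribution (resolving the inconsistency of a perceptual contrast field) and only demonstrates that a field which is already consistent integrates back exactly by cumulative sums:

```python
import numpy as np

def integrate_gradient_field(gx, gy):
    """Direct integration of a consistent gradient field.

    gx[i, j] ~ G[i, j+1] - G[i, j]  (shape (h, w-1))
    gy[i, j] ~ G[i+1, j] - G[i, j]  (shape (h-1, w))

    Integrates gy down the first column, then gx along each row;
    the result is fixed by choosing G[0, 0] = 0."""
    h, w = gy.shape[0] + 1, gx.shape[1] + 1
    g = np.zeros((h, w))
    g[1:, 0] = np.cumsum(gy[:, 0])                 # first column
    g[:, 1:] = g[:, :1] + np.cumsum(gx, axis=1)    # each row
    return g
```

If the field comes from an actual scalar image (as after the paper's consistency step), path integration recovers that image up to a constant.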
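The plausibility checks the metallic-BRDF abstract alludes to, reciprocity and bounded reflected energy, can be run numerically. The classical normalized Phong lobe serves here only as the test subject; it is not the paper's proposed model:

```python
import numpy as np

def reflect(w, n):
    """Mirror direction of unit vector w about unit normal n."""
    return 2.0 * np.dot(w, n) * n - w

def phong_brdf(wi, wo, n, ks=0.8, shininess=20.0):
    """Normalized Phong lobe: f = ks*(s+2)/(2*pi) * max(0, r(wi).wo)**s.
    Reciprocal, since r(wi).wo = 2(n.wi)(n.wo) - wi.wo is symmetric."""
    c = max(0.0, float(np.dot(reflect(wi, n), wo)))
    return ks * (shininess + 2.0) / (2.0 * np.pi) * c**shininess

def directional_albedo(wi, ks=0.8, shininess=20.0, n_samples=200000, seed=0):
    """Monte Carlo estimate of the directional albedo: the integral of
    f(wi, wo)*cos(theta_o) over the hemisphere z > 0, using uniform
    hemisphere sampling (pdf = 1/(2*pi))."""
    rng = np.random.default_rng(seed)
    cos_t = rng.random(n_samples)            # uniform hemisphere: cos(theta) ~ U(0,1)
    sin_t = np.sqrt(1.0 - cos_t**2)
    phi = 2.0 * np.pi * rng.random(n_samples)
    wo = np.stack([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t], axis=1)
    r = reflect(wi, np.array([0.0, 0.0, 1.0]))
    c = np.maximum(0.0, wo @ r)
    f = ks * (shininess + 2.0) / (2.0 * np.pi) * c**shininess
    return 2.0 * np.pi * float(np.mean(f * cos_t))
```

At normal incidence the albedo of this lobe equals ks analytically, so a plausible model must report an estimate at or below 1 there; the abstract's point is that such checks fail for some popular models at grazing angles.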
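The regression-as-convolution idea of the gradient-estimation abstract reduces, for a symmetric 3x3x3 neighborhood, to a fixed-weight sum per gradient component. A direct, unoptimized sketch for one interior voxel:

```python
import numpy as np

def regression_gradient(vol, x, y, z):
    """Gradient at an interior voxel from a 3D linear-regression hyperplane
    fitted to the 3x3x3 neighborhood.  For a symmetric neighborhood the
    least-squares slope reduces to the convolution-like sum
        g_d = sum_k vol[v + k] * k_d / sum_k k_d**2."""
    g = np.zeros(3)
    norm = np.zeros(3)
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                f = vol[x + dx, y + dy, z + dz]
                k = np.array([dx, dy, dz], dtype=float)
                g += f * k
                norm += k**2
    return g / norm

def regression_density(vol, x, y, z):
    """Translation of the regression hyperplane: for a symmetric
    neighborhood this is simply the neighborhood mean (a filtered density)."""
    return float(vol[x-1:x+2, y-1:y+2, z-1:z+2].mean())
```

For an exactly linear density field the fitted slope reproduces the true gradient, which makes the estimator easy to verify.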
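A rough sketch of the discrete-importance flavor of the WDRS construction: splitting a fixed ray budget among patches deterministically, in proportion to their power. The largest-remainder allocation below is our illustrative choice, not the construction from the paper:

```python
def allocate_rays(powers, budget):
    """Deterministically allocate `budget` rays among patches in proportion
    to their power, using largest-remainder rounding so the total is exact."""
    total = sum(powers)
    ideal = [p / total * budget for p in powers]   # fractional ray counts
    counts = [int(q) for q in ideal]               # round down first
    # hand the remaining rays to the largest fractional parts
    remaining = budget - sum(counts)
    order = sorted(range(len(powers)),
                   key=lambda i: ideal[i] - counts[i], reverse=True)
    for i in order[:remaining]:
        counts[i] += 1
    return counts
```

Because the allocation is deterministic, the same compact representation argument the abstract makes for quasi-Monte Carlo sampling applies: the ray set is reproducible from the patch powers and the budget alone.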