Search Results

Now showing 1 - 10 of 13
  • Item
    Efficient Rendering of Local Subsurface Scattering
    (The Eurographics Association and Blackwell Publishing Ltd., 2005) Mertens, Tom; Kautz, Jan; Bekaert, Philippe; Van Reeth, Frank; Seidel, Hans-Peter
    A novel approach is presented to efficiently render local subsurface scattering effects. We introduce an importance sampling scheme for a practical subsurface scattering model. It leads to a simple and efficient rendering algorithm, which operates in image space and is even amenable to implementation on graphics hardware. We demonstrate the applicability of our technique to the problem of skin rendering, for which the subsurface transport of light typically remains local. Our implementation shows that plausible images can be rendered interactively using hardware acceleration.
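    A minimal Python sketch of the general idea: importance sampling of radial offsets around a shading point in image space. The exponential falloff and the constant irradiance are stand-ins, since the abstract does not specify the actual scattering profile or sampling scheme.
      import numpy as np

      def sample_offsets(sigma_tr, n, rng):
          # Polar offsets (r, phi) with pdf p(r, phi) = sigma_tr * exp(-sigma_tr * r) / (2 * pi).
          r = rng.exponential(scale=1.0 / sigma_tr, size=n)
          phi = rng.uniform(0.0, 2.0 * np.pi, size=n)
          pdf = sigma_tr * np.exp(-sigma_tr * r) / (2.0 * np.pi)
          return r, phi, pdf

      def scattered_radiance(profile, irradiance, sigma_tr, n=4096, seed=0):
          # Monte Carlo estimate of the disc integral of profile(r) * irradiance(x, y) dA,
          # written in polar coordinates (dA = r dr dphi) and importance sampled in r.
          rng = np.random.default_rng(seed)
          r, phi, pdf = sample_offsets(sigma_tr, n, rng)
          x, y = r * np.cos(phi), r * np.sin(phi)
          return np.mean(profile(r) * irradiance(x, y) * r / pdf)

      sigma_tr = 8.0
      profile = lambda r: np.exp(-sigma_tr * r)      # toy falloff profile
      irradiance = lambda x, y: np.ones_like(x)      # constant incident light
      # Analytic reference: 2 * pi * integral_0^inf exp(-s * r) * r dr = 2 * pi / s**2
      print(scattered_radiance(profile, irradiance, sigma_tr), 2.0 * np.pi / sigma_tr**2)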
  • Item
    Accelerating Path Tracing by Re-Using Paths
    (The Eurographics Association, 2002) Bekaert, Philippe; Sbert, Mateu; Halton, John; P. Debevec and S. Gibson
    This paper describes a new acceleration technique for rendering algorithms, such as path tracing, that use so-called gathering random walks. In path tracing, each traced path usually contributes to only a single point on the virtual screen. We propose to combine paths traced through nearby screen points in such a way that each path contributes to multiple screen points in a provably good way. Our approach is unbiased and is not restricted to diffuse light scattering. It complements previous image noise reduction techniques for Monte Carlo ray tracing. We observe speed-ups of one order of magnitude in the computation of indirect illumination.
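    A toy Monte Carlo experiment in Python illustrating the combination principle (the integrands and sampling density are placeholders, not the paper's estimator): every "path" drawn inside a tile of pixels is evaluated against every pixel's integrand, so each pixel averages over all paths in the tile while each individual term stays unbiased.
      import numpy as np

      rng = np.random.default_rng(1)
      tile = 8                                                              # pixels per tile
      f = [lambda x, k=k: np.cos(x + 0.1 * k) ** 2 for k in range(tile)]    # per-pixel integrands
      # Each pixel i estimates integral_0^pi f_i(x) dx (= pi/2) by uniform sampling (1/p = pi).

      def no_reuse(n_tiles):
          est = np.zeros(tile)
          for _ in range(n_tiles):
              for i in range(tile):
                  est[i] += np.pi * f[i](rng.uniform(0.0, np.pi))   # one private path per pixel
          return est / n_tiles

      def with_reuse(n_tiles):
          est = np.zeros(tile)
          for _ in range(n_tiles):
              xs = rng.uniform(0.0, np.pi, size=tile)               # same budget: tile paths
              for i in range(tile):
                  est[i] += np.pi * np.mean(f[i](xs))               # every path feeds every pixel
          return est / n_tiles

      print(no_reuse(200))     # noisier per-pixel estimates
      print(with_reuse(200))   # same cost, lower per-pixel variance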
  • Item
    Interactive Rendering of Translucent Deformable Objects
    (The Eurographics Association, 2003) Mertens, Tom; Kautz, Jan; Bekaert, Philippe; Seidel, Hans-Peter; Reeth, Frank Van; Philip Dutre and Frank Suykens and Per H. Christensen and Daniel Cohen-Or
    Realistic rendering of materials such as milk, fruit, wax and marble requires the simulation of subsurface scattering of light. This paper presents an algorithm for plausible reproduction of subsurface scattering effects. Unlike previously proposed work, our algorithm allows lighting, viewpoint, subsurface scattering properties and object geometry to be changed interactively. The key idea of our approach is to use a hierarchical boundary element method to solve the integral describing subsurface scattering when using a recently proposed analytical BSSRDF model. Our approach is inspired by hierarchical radiosity with clustering. The success of our approach is due in part to a semi-analytical integration method that allows the needed point-to-patch form-factor-like transport coefficients to be computed efficiently and accurately where other methods fail. Our experiments show that high-quality renderings of translucent objects consisting of tens of thousands of polygons can be obtained from scratch in fractions of a second. An incremental update algorithm further speeds up rendering after material or geometry changes.
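    A 1D, Barnes-Hut-style Python analogue of hierarchical gathering with clustering (purely illustrative; it is not the paper's boundary element solver or BSSRDF model): contributions of many source samples through a smooth kernel are accumulated through a tree, and a whole cluster is replaced by its cached total when it is small relative to its distance.
      import numpy as np

      def build_tree(lo, hi, pos, val, leaf_size=8):
          # Binary tree over 1D source samples; each node caches its total energy
          # and an energy-weighted representative position.
          idx = np.where((pos >= lo) & (pos < hi))[0]
          total = val[idx].sum()
          node = {"lo": lo, "hi": hi, "total": total,
                  "rep": (pos[idx] * val[idx]).sum() / max(total, 1e-12)}
          if len(idx) > leaf_size:
              mid = 0.5 * (lo + hi)
              node["children"] = [build_tree(lo, mid, pos, val, leaf_size),
                                  build_tree(mid, hi, pos, val, leaf_size)]
          else:
              node["leaf"] = idx
          return node

      def gather(x, node, kernel, pos, val, theta=0.5):
          # Hierarchically accumulate sum_j kernel(|x - pos_j|) * val_j.
          if "leaf" in node:
              return sum(kernel(abs(x - pos[j])) * val[j] for j in node["leaf"])
          size, dist = node["hi"] - node["lo"], abs(x - node["rep"])
          if size < theta * dist:                     # far cluster: use its cached total
              return kernel(dist) * node["total"]
          return sum(gather(x, c, kernel, pos, val, theta) for c in node["children"])

      rng = np.random.default_rng(5)
      pos, val = rng.random(2000), rng.random(2000)
      kernel = lambda r: np.exp(-10.0 * r)
      tree = build_tree(0.0, 1.0, pos, val)
      print(np.sum(kernel(np.abs(0.3 - pos)) * val), gather(0.3, tree, kernel, pos, val))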
  • Item
    Information Theory Tools for Scene Discretization
    (The Eurographics Association, 1999) Feixas, Miquel; Acebo, Esteve del; Bekaert, Philippe; Sbert, Mateu; Dani Lischinski and Greg Ward Larson
    Finding an optimal discretization of a scene is an important but difficult problem in radiosity. The efficiency of hierarchical radiosity, for instance, depends entirely on the subdivision criterion and strategy that is used. We study the problem of adaptive scene discretization from the point of view of information theory. In previous work, we introduced the concept of mutual information, which represents the information transfer or correlation in a scene, as a complexity measure, and presented some intuitive arguments and preliminary results concerning the relation between mutual information and scene discretization. In this paper, we present a more general treatment that supports and extends our previous findings to the point where the development of practical information-theory-based tools for optimal scene discretization becomes feasible.
  • Item
    Gathering for Free in Random Walk Radiosity
    (The Eurographics Association, 1999) Sbert, Mateu; Brusi, Alex; Bekaert, Philippe; Dani Lischinski and Greg Ward Larson
    We present a simple technique that improves the efficiency of random walk algorithms for radiosity. Each generated random walk is used to simultaneously sample two distinct radiosity estimators. The first estimator is the commonly used shooting estimator, in which the radiosity due to self-emitted light at the origin of the random walk is recorded at each subsequently visited patch. With the second estimator, the radiosity due to self-emitted light at subsequent destinations is recorded at each visited patch. Closed formulae for the variance of the involved estimators allow us to derive a cheap heuristic for combining the resulting radiosity estimates. Empirical results agree well with the heuristic prediction. A fair error reduction is obtained at a negligible additional cost.
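    A toy Python illustration of the combination step; the paper's heuristic is derived from closed-form variance expressions, whereas here inverse-variance weights are simply estimated from two placeholder estimators of the same quantity.
      import numpy as np

      rng = np.random.default_rng(2)
      true_value = 1.0
      # Two unbiased estimators of the same radiosity value with different variances
      # (stand-ins for the shooting and gathering estimators sampled by one walk).
      shoot = true_value + rng.normal(0.0, 0.5, size=10000)
      gather = true_value + rng.normal(0.0, 0.2, size=10000)

      vs, vg = shoot.var(), gather.var()
      w = (1.0 / vs) / (1.0 / vs + 1.0 / vg)        # inverse-variance weight for 'shoot'
      combined = w * shoot + (1.0 - w) * gather     # still unbiased, lower variance

      for name, e in [("shooting", shoot), ("gathering", gather), ("combined", combined)]:
          print(f"{name:9s} mean={e.mean():.4f}  std={e.std():.4f}")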
  • Item
    Information-Theoretic Oracle Based on Kernel Smoothness for Hierarchical Radiosity
    (Eurographics Association, 2002) Feixas, Miquel; Rigau, Jaume; Bekaert, Philippe; Sbert, Mateu
    One of the main problems in the radiosity method is how to discretise the surfaces of a scene into mesh elements that allow us to accurately represent illumination. In this paper we present a robust information-theoretic refinement criterion (oracle) based on kernel smoothness for hierarchical radiosity. This oracle improves on previous ones in that, at equal cost, it gives a better discretisation, approaching the optimal one from an information theory point of view, and also needs fewer visibility computations for a similar image quality.
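    A generic hierarchical refinement loop of the kind such an oracle plugs into, sketched in Python; the oracle, subdivision rule and threshold below are placeholders, not the information-theoretic, kernel-smoothness criterion of the paper.
      def refine(link, oracle, subdivide, eps, max_depth, depth=0):
          # Keep a transport link if the oracle deems it accurate enough;
          # otherwise subdivide it and refine the child links recursively.
          if depth >= max_depth or oracle(link) <= eps:
              return [link]
          links = []
          for child in subdivide(link):
              links.extend(refine(child, oracle, subdivide, eps, max_depth, depth + 1))
          return links

      # Toy usage: "links" are intervals and the oracle is simply their length.
      halve = lambda l: [(l[0], 0.5 * (l[0] + l[1])), (0.5 * (l[0] + l[1]), l[1])]
      print(refine((0.0, 1.0), oracle=lambda l: l[1] - l[0],
                   subdivide=halve, eps=0.25, max_depth=8))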
  • Item
    Deblurring by Matching
    (The Eurographics Association and Blackwell Publishing Ltd, 2009) Ancuti, Cosmin; Ancuti, Codruta Orniana; Bekaert, Philippe
    Restoration of photographs degraded by camera shake is a challenging task that has attracted increasing attention in recent years. Despite the important progress of blind deconvolution techniques, the finest details of the blur kernel cannot be recovered entirely due to the ill-posed nature of the problem. Moreover, the additional constraints and prior assumptions make these approaches relatively limited. In this paper we introduce a novel technique that removes undesired blur artifacts from photographs taken with hand-held digital cameras. Our approach is based on the observation that, in general, several consecutive photographs taken by a user share image regions showing the same scene content, so we take advantage of additional sharp photographs of the same scene. Our approach matches invariant local feature points filtered from the given blurred/non-blurred images and estimates the blur kernel using additional statistical constraints. We also present a simple deconvolution technique that preserves edges while minimizing ringing artifacts in the restored latent image. The experimental results show that our technique accurately infers the blur kernel while significantly reducing the artifacts of the degraded images.
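    A rough Python sketch of the last two stages under strong simplifying assumptions: a Tikhonov-regularised Fourier-domain kernel estimate from an already matched blurred/sharp pair, followed by plain Wiener deconvolution. Neither is the paper's statistically constrained estimator nor its edge-preserving deconvolution; the sketch only illustrates the data flow.
      import numpy as np

      def estimate_kernel(blurred, sharp, eps=1e-2):
          # Least-squares blur kernel in the Fourier domain:
          # argmin_k ||k * sharp - blurred||^2 + eps * ||k||^2  (circular convolution).
          B, S = np.fft.fft2(blurred), np.fft.fft2(sharp)
          return np.real(np.fft.ifft2(B * np.conj(S) / (np.abs(S) ** 2 + eps)))

      def wiener_deconvolve(blurred, kernel, nsr=1e-2):
          # Plain Wiener deconvolution with a constant noise-to-signal ratio.
          B = np.fft.fft2(blurred)
          K = np.fft.fft2(kernel, s=blurred.shape)
          return np.real(np.fft.ifft2(np.conj(K) * B / (np.abs(K) ** 2 + nsr)))

      rng = np.random.default_rng(3)
      sharp = rng.random((64, 64))
      kernel = np.zeros((64, 64)); kernel[0, :5] = 0.2          # synthetic horizontal blur
      blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * np.fft.fft2(kernel)))
      restored = wiener_deconvolve(blurred, estimate_kernel(blurred, sharp))
      print(np.abs(restored - sharp).mean())                    # residual restoration error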
  • Item
    Interactive Rendering of Translucent Objects
    (Blackwell Publishers, Inc and the Eurographics Association, 2003) Lensch, Hendrik P.A.; Goesele, Michael; Bekaert, Philippe; Kautz, Jan; Magnor, Marcus A. and Lang, Jochen and Seidel, Hans-Peter
    This paper presents a rendering method for translucent objects, in which viewpoint and illumination can be modified at interactive rates. In a preprocessing step, the impulse response to incoming light impinging at each surface point is computed and stored in two different ways: the local effect on close-by surface points is modeled as a per-texel filter kernel that is applied to a texture map representing the incident illumination, and the global response (i.e. light shining through the object) is stored as vertex-to-vertex throughput factors for the triangle mesh of the object. During rendering, the illumination map for the object is computed according to the current lighting situation and then filtered by the precomputed kernels. The illumination map is also used to derive the incident illumination on the vertices, which is distributed via the vertex-to-vertex throughput factors to the other vertices. The final image is obtained by combining the local and global response. We demonstrate the performance of our method for several models. ACM CCS: I.3.7 Computer Graphics - Three-Dimensional Graphics and Realism - Color, Radiosity
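    A schematic of the two response terms in Python; the array shapes, the dense throughput matrix and the fixed 7x7 kernels are illustrative assumptions, standing in for the per-vertex and per-texel data the paper precomputes.
      import numpy as np

      def global_response(throughput, vertex_irradiance):
          # Vertex-to-vertex throughput factors applied to the current per-vertex
          # incident illumination: light shining through the object.
          return throughput @ vertex_irradiance

      def local_response(illum_map, kernels):
          # Each texel of the illumination map is filtered with its own precomputed
          # kernel, modelling scattering between close-by surface points.
          h, w = illum_map.shape
          k = kernels.shape[-1] // 2
          padded = np.pad(illum_map, k, mode="edge")
          out = np.empty_like(illum_map)
          for y in range(h):
              for x in range(w):
                  out[y, x] = np.sum(padded[y:y + 2 * k + 1, x:x + 2 * k + 1] * kernels[y, x])
          return out

      rng = np.random.default_rng(4)
      print(global_response(rng.random((100, 100)) * 1e-3, np.ones(100)).shape)               # (100,)
      print(local_response(np.ones((32, 32)), np.full((32, 32, 7, 7), 1.0 / 49.0)).mean())    # ~1.0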
  • Item
    Radiosity with Well Distributed Ray Sets
    (Blackwell Publishers Ltd and the Eurographics Association, 1997) Neumann, Laszlo; Neumann, Attila; Bekaert, Philippe
    In this paper we present a new radiosity algorithm, based on the notion of a well distributed ray set (WDRS). A WDRS is a set of rays, connecting mutually visible points and patches, that forms an approximate representation of the radiosity operator and the radiosity distribution. We propose an algorithm that constructs an optimal WDRS for a given accuracy and mesh. The construction is based on discrete importance sampling, as in previously proposed stochastic radiosity algorithms, and on quasi-Monte Carlo sampling. Quasi-Monte Carlo sampling leads to faster convergence rates, and the fact that the sampling is deterministic makes it possible to represent the well distributed ray set very efficiently in computer memory. Like previously proposed stochastic radiosity algorithms, the new algorithm is well suited for computing the radiance distribution in very complex diffuse scenes, when it is not feasible to explicitly compute and store form factors as in classical radiosity algorithms. Experiments show that the new algorithm is often more efficient than previously proposed Monte Carlo radiosity algorithms, by half an order of magnitude or more.
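    The deterministic quasi-Monte Carlo ingredient can be illustrated with a standard Halton-sequence generator in Python (the abstract does not say which low-discrepancy construction the algorithm actually uses):
      def radical_inverse(i, base):
          # Van der Corput radical inverse: mirror the base-b digits of i around the point.
          inv, f = 0.0, 1.0 / base
          while i > 0:
              inv += (i % base) * f
              i //= base
              f /= base
          return inv

      def halton(n, bases=(2, 3)):
          # First n points of the Halton sequence: deterministic, well distributed,
          # and fully reproducible from the sample index alone (no sample storage needed).
          return [tuple(radical_inverse(i, b) for b in bases) for i in range(1, n + 1)]

      print(halton(4))   # [(0.5, 1/3), (0.25, 2/3), (0.75, 1/9), (0.125, 4/9)]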
  • Item
    An Information Theory Framework for the Analysis of Scene Complexity
    (Blackwell Publishers Ltd and the Eurographics Association, 1999) Feixas, Miquel; Del Acebo, Esteve; Bekaert, Philippe; Sbert, Mateu
    In this paper we present a new framework for the analysis of scene visibility and radiosity complexity. We introduce a number of complexity measures from information theory that quantify how difficult it is to compute the visibility and radiosity in a scene accurately. We define continuous mutual information as a complexity measure of a scene, independent of any discretisation, and discrete mutual information as the complexity of a discretised scene. Mutual information can be understood as the degree of correlation or dependence between all the points or patches of a scene; thus, low complexity corresponds to low correlation and vice versa. Experiments illustrating that the best mesh for a given scene, among a number of alternatives, corresponds to the one with the highest discrete mutual information indicate the feasibility of the approach. Unlike continuous mutual information, which is very cheap to compute, discrete mutual information can be quite demanding to compute. We will develop cheap complexity-measure estimates and derive practical algorithms from this framework in future work.
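    A minimal Python sketch of discrete mutual information for a patch discretisation, assuming a closed diffuse scene where the joint distribution p(i, j) = a_i F_ij (relative patch area times form factor) has marginals a_i and a_j; this follows the textbook definition of mutual information rather than reproducing the paper's derivation.
      import numpy as np

      def discrete_scene_mutual_information(F, areas):
          # I = sum_ij p(i, j) * log(p(i, j) / (p(i) * p(j)))  with  p(i, j) = a_i * F_ij,
          # which simplifies to sum_ij a_i * F_ij * log(F_ij / a_j) in a closed scene.
          a = areas / areas.sum()
          p = a[:, None] * F
          with np.errstate(divide="ignore", invalid="ignore"):
              terms = p * np.log(F / a[None, :])
              terms[p == 0] = 0.0            # convention: 0 * log 0 = 0
          return terms.sum()

      # Two fully facing patches of equal area: F = [[0, 1], [1, 0]], maximal correlation.
      F = np.array([[0.0, 1.0], [1.0, 0.0]])
      print(discrete_scene_mutual_information(F, np.array([1.0, 1.0])))   # log 2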