Search Results

Now showing 1 - 6 of 6
  • Real-Time Bump Map Synthesis
    (The Eurographics Association, 2001) Kautz, Jan; Heidrich, Wolfgang; Seidel, Hans-Peter; Kurt Akeley and Ulrich Neumann
    In this paper we present a method that automatically synthesizes bump maps at arbitrary levels of detail in real time. The only input data we require is a normal density function; the bump map is generated according to that function, which is also used to shade the generated bump map. The technique allows infinite zooming into the surface, because more (consistent) detail can be created on the fly. The shading of such a surface is consistent when displayed at different distances to the viewer (assuming that the surface structure is self-similar). The bump map generation and the shading algorithm can also be used separately. (An illustrative code sketch appears after this result list.)
  • Image-Based Reconstruction of Spatially Varying Materials
    (The Eurographics Association, 2001) Lensch, Hendrik P. A.; Kautz, Jan; Goesele, Michael; Heidrich, Wolfgang; Seidel, Hans-Peter; S. J. Gortler and K. Myszkowski
    The measurement of accurate material properties is an important step towards photorealistic rendering. Many real-world objects are composed of a number of materials that often show subtle changes even within a single material. Thus, for photorealistic rendering both the general surface properties and the spatially varying effects of the object are needed. We present an image-based measuring method that robustly detects the different materials of real objects and fits an average bidirectional reflectance distribution function (BRDF) to each of them. In order to model the local changes as well, we project the measured data for each surface point into a basis formed by the recovered BRDFs, leading to a truly spatially varying BRDF representation. A high-quality model of a real object can be generated from relatively little input data. The generated model allows for rendering under arbitrary viewing and lighting conditions and realistically reproduces the appearance of the original object. (A code sketch of the basis projection appears after this result list.)
  • Thrifty Final Gather for Radiosity
    (The Eurographics Association, 2001) Scheel, Annette; Stamminger, Marc; Seidel, Hans-Peter; S. J. Gortler and K. Myszkowski
    Finite element methods are well suited to the computation of the light distribution in mostly diffuse scenes, but the resulting mesh is often far from optimal for accurately representing illumination. Shadow boundaries are hard to capture in the mesh, and the illumination may contain artifacts due to light transport at different mesh hierarchy levels. To render a high-quality image, a costly final gather reconstruction step is usually performed, which re-evaluates the illumination integral for each pixel. In this paper an algorithm is presented that significantly speeds up the final gather by exploiting spatial and directional coherence information taken from the radiosity solution. Senders are classified so that their contribution to a pixel is either interpolated from the radiosity solution or recomputed with an appropriate number of new samples. By interpolating this sampling pattern over the radiosity mesh, continuous solutions are obtained. (A code sketch of the sender classification appears after this result list.)
  • Perceptually Guided Corrective Splatting
    (Blackwell Publishers Ltd and the Eurographics Association, 2001) Haber, Jörg; Myszkowski, Karol; Yamauchi, Hitoshi; Seidel, Hans-Peter
    One of the basic difficulties with interactive walkthroughs is the high-quality rendering of object surfaces with non-diffuse light scattering characteristics. Since full ray tracing at interactive rates is usually impossible, we render a precomputed global illumination solution using graphics hardware and use the remaining computational power to correct the appearance of non-diffuse objects on the fly. The question arises of how to obtain the best image quality, as perceived by a human observer, within the limited amount of time available for each frame. We address this problem by enforcing corrective computation for those non-diffuse objects that are selected using a computational model of visual attention. We consider both saliency- and task-driven selection of those objects and benefit from the fact that shading artifacts of "unattended" objects are likely to remain unnoticed. We use a hierarchical image-space sampling scheme to control ray tracing and splat the generated point samples. The resulting image converges progressively to a ray-traced solution if the viewing parameters remain unchanged. Moreover, we use a sample cache to enhance visual appearance if the time budget for correction has been too low for some frame. We check the validity of the cached samples using a novel criterion suited for non-diffuse surfaces and reproject valid samples into the current view. (A code sketch of the attention-driven ray-budget idea appears after this result list.)
  • Efficient Cloth Modeling and Rendering
    (The Eurographics Association, 2001) Daubert, Katja; Lensch, Hendrik P. A.; Heidrich, Wolfgang; Seidel, Hans-Peter; S. J. Gortler and K. Myszkowski
    Realistic modeling and high-performance rendering of cloth and clothing is a challenging problem. Often these materials are seen at distances where individual stitches and knits can be made out and need to be accounted for. Modeling of the geometry at this level of detail fails due to sheer complexity, while simple texture mapping techniques do not produce the desired quality. In this paper, we describe an efficient and realistic approach that takes into account view-dependent effects such as small displacements causing occlusion and shadows, as well as illumination effects. The method is efficient in terms of memory consumption, and uses a combination of hardware and software rendering to achieve high performance. It is conceivable that future graphics hardware will be flexible enough for full hardware rendering of the proposed method.
  • On-the-Fly Processing of Generalized Lumigraphs
    (Blackwell Publishers Ltd and the Eurographics Association, 2001) Schirmacher, Hartmut; Ming, Li; Seidel, Hans-Peter
    We introduce a flexible and powerful concept for reconstructing arbitrary views from multiple source images on the fly. Our approach is based on a Lumigraph structure with per-pixel depth values and generalizes the classical two-plane parameterized light fields and Lumigraphs. With our technique it is possible to render arbitrary views of time-varying, non-diffuse scenes at interactive frame rates, and any kind of sensor that yields images with dense depth information can be used. We demonstrate the flexibility and efficiency of our approach through various examples. (A code sketch of depth-based reprojection appears below.)
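
The following is a minimal, illustrative Python sketch of the idea behind "Real-Time Bump Map Synthesis": per-texel normals are drawn from a normal (slope) distribution and then shaded. The Gaussian slope model, the function names, and the Lambertian shading are assumptions made for this sketch, not the authors' implementation.

    import numpy as np

    def synthesize_normal_map(res, sigma, rng):
        # Sample per-texel surface slopes from an isotropic Gaussian slope
        # distribution (a stand-in for the paper's normal density function)
        # and convert them to unit normals.
        sx = rng.normal(0.0, sigma, size=(res, res))
        sy = rng.normal(0.0, sigma, size=(res, res))
        n = np.stack([-sx, -sy, np.ones_like(sx)], axis=-1)
        return n / np.linalg.norm(n, axis=-1, keepdims=True)

    def shade_lambert(normals, light_dir):
        # Shade the synthesized normals with a simple Lambertian term.
        l = np.asarray(light_dir, dtype=float)
        l = l / np.linalg.norm(l)
        return np.clip(normals @ l, 0.0, None)

    rng = np.random.default_rng(0)
    normals = synthesize_normal_map(256, sigma=0.3, rng=rng)
    image = shade_lambert(normals, light_dir=[0.3, 0.3, 1.0])

Because new texels can be drawn from the same distribution at any resolution, additional detail can be generated on the fly when zooming in, which is the consistency property the abstract refers to.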
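
For "Image-Based Reconstruction of Spatially Varying Materials", the projection of per-point measurements into the basis of recovered BRDFs can be illustrated with a least-squares fit. This is a hedged sketch under assumed array shapes; the authors' actual fitting procedure and data layout are not reproduced here.

    import numpy as np

    def project_into_brdf_basis(measurements, basis_values):
        # measurements : (k,)   measured reflectance of one surface point
        #                       for k view/light configurations
        # basis_values : (k, m) values of the m recovered basis BRDFs for
        #                       the same configurations
        # Returns m non-negative per-point blending weights.
        w, *_ = np.linalg.lstsq(basis_values, measurements, rcond=None)
        return np.clip(w, 0.0, None)

    def evaluate_svbrdf(weights, basis_values_new):
        # Reconstruct reflectance for new view/light configurations as a
        # weighted combination of the basis BRDFs.
        return basis_values_new @ weights

Storing one weight vector per surface point yields the spatially varying representation: the few basis BRDFs capture the general materials, while the per-point weights capture the local changes.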
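
For "Thrifty Final Gather for Radiosity", the sender classification can be sketched as follows. The data layout (senders carrying a precomputed variation estimate, an interpolated contribution, and a sampling callback) is an assumption made only for this illustration.

    import numpy as np

    def final_gather_pixel(senders, interp_threshold, n_samples):
        # Senders whose contribution varies little across the receiver are
        # interpolated from the stored radiosity solution; the remaining
        # senders are re-sampled with new rays.
        radiance = 0.0
        for s in senders:
            if s['variation'] < interp_threshold:
                radiance += s['interpolated']            # cheap: reuse the solution
            else:
                samples = [s['sample']() for _ in range(n_samples)]
                radiance += float(np.mean(samples))      # costly: new samples
        return radiance

    # e.g. one smooth sender and one sender behind a sharp shadow boundary
    senders = [
        {'variation': 0.01, 'interpolated': 0.40, 'sample': lambda: 0.40},
        {'variation': 0.30, 'interpolated': 0.10, 'sample': lambda: np.random.uniform(0.0, 0.3)},
    ]
    print(final_gather_pixel(senders, interp_threshold=0.05, n_samples=16))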
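
For "Perceptually Guided Corrective Splatting", the attention-driven selection can be illustrated by splitting a per-frame ray budget over the non-diffuse objects in proportion to their attention weights. The proportional split is an assumption for this sketch; the paper's attention model and hierarchical sampling scheme are not reproduced.

    import numpy as np

    def allocate_corrective_rays(saliency, total_budget):
        # Split a per-frame corrective-ray budget over non-diffuse objects in
        # proportion to their (saliency- or task-driven) attention weights.
        s = np.asarray(saliency, dtype=float)
        if s.sum() == 0.0:
            return np.zeros(len(s), dtype=int)
        return np.floor(s / s.sum() * total_budget).astype(int)

    # e.g. three glossy objects, the second one currently attended
    print(allocate_corrective_rays([0.2, 0.7, 0.1], total_budget=5000))

Objects with a low attention weight receive few or no corrective rays, which is acceptable because shading artifacts of "unattended" objects are likely to remain unnoticed.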
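
For "On-the-Fly Processing of Generalized Lumigraphs", the role of per-pixel depth can be illustrated by forward reprojection of a depth-augmented source view into a target view. The matrix conventions (camera-to-world poses, pinhole intrinsics) are assumptions for this sketch, not the paper's reconstruction algorithm.

    import numpy as np

    def reproject(depth, K_src, pose_src, K_dst, pose_dst):
        # depth  : (h, w) per-pixel depth of the source view
        # K_*    : (3, 3) pinhole intrinsics
        # pose_* : (4, 4) camera-to-world transforms
        # Returns the (h, w, 2) target-image coordinates of every source pixel.
        h, w = depth.shape
        ys, xs = np.mgrid[0:h, 0:w]
        pix = np.stack([xs, ys, np.ones_like(xs)], axis=-1).astype(float)
        rays = pix @ np.linalg.inv(K_src).T              # source-camera ray directions
        pts = rays * depth[..., None]                    # 3D points in the source camera frame
        pts_h = np.concatenate([pts, np.ones((h, w, 1))], axis=-1)
        world = pts_h @ pose_src.T                       # to world space
        cam_dst = world @ np.linalg.inv(pose_dst).T      # to the target camera frame
        proj = cam_dst[..., :3] @ K_dst.T
        return proj[..., :2] / proj[..., 2:3]            # perspective divide

Warping several depth-augmented source images this way and blending them per pixel is one way to reconstruct a novel view; the paper generalizes this to time-varying input from arbitrary sensors that provide dense depth.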