Search Results
Now showing 1 - 10 of 10
Item: Using temporal and spatial coherence for accelerating the calculation of animation sequences (Eurographics Association, 1991)
Gröller, Eduard; Purgathofer, Werner
Ray tracing is a well-known technique for generating realistic images. One of the major drawbacks of this approach is the extensive computational cost of image calculation. When generating animation sequences frame by frame, the computational cost can easily become intolerable. In recent years several methods have been devised for accelerating the computation by using spatial and temporal coherence. While these techniques work only under certain restrictions, a new approach is presented in this paper which leads to a considerable speed-up of the calculation process without putting any limitations on camera or object movement. In principle, the method extends /ArKi87/, where rays are considered points in 5D space, by the time dimension. CSG is used for object description and has been modified correspondingly to allow easy use of coherence properties. The paper describes the theoretical background and the main concepts of a practical implementation.

Item: Open Issues in Photo-realistic Rendering (Eurographics Association, 2003)
Purgathofer, Werner
For more than two decades computer graphics researchers have tried to achieve photo-realism in their images as reliably as possible, mainly by simulating the physical laws of light and adding one effect after the other. Recent years have brought a shift of effort towards real-time methods, easy-to-use systems, integration with vision, modelling tools and the like. The quality of images is mostly accepted as sufficient for real-world applications, but where are we really? There are still numerous problems to be solved, and there is notable progress in these areas. No question, the plug-in philosophy of some commercial products has enabled several of these new techniques to be distributed quite quickly. But unfortunately, many other of these developments happen in isolated systems for the pure purpose of publication, and never make it into commercial software. This presentation aims to make people more aware of such activities, and to evaluate the steps we still have to take towards perfect photo-realism. The talk will start with a brief overview of rendering history, highlighting the main research directions at different times. It will explain the driving forces of these developments, which are complexity, speed, and accuracy, and perhaps also expression in recent years. Solved and unsolved areas are examined, and compared to practically solved but theoretically incomplete topics such as translucency, tone mapping, light source and BTF descriptions, and error metrics for image quality evaluation. The distinction lies mainly in the difference between believable, correct, and predictive images. Also, for really realistic images, modelling complexity is still an issue. Finally, some recent work on polarization and fluorescence is presented.

Item: A Statistical Method for Adaptive Stochastic Sampling (The Eurographics Association, 1986)
Purgathofer, Werner; A.A.G. Requicha
Stochastic sampling is a good alternative to pure oversampling in terms of image quality. A method is presented for adaptively matching the number of required samples to the complexity of the picture. The quality of the resulting picture is controlled by two easily understood parameters: an error interval size and the probability that a pixel lies within it. The usefulness of the method is demonstrated by applying it to distributed ray tracing.
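As a rough illustration of the kind of adaptive, statistically controlled sampling described in the previous entry, the sketch below keeps drawing stochastic samples for a pixel until a confidence interval on the pixel mean becomes narrower than a chosen error interval size. This is a generic sketch, not the paper's algorithm; the function names, the normal-approximation z-values and the default parameters are assumptions made for illustration only.

```python
import math
import random

# Two-sided z-values for a few confidence levels (normal approximation) - illustrative only.
Z_VALUES = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}

def sample_pixel_adaptively(shade, tolerance=0.01, confidence=0.95,
                            min_samples=8, max_samples=256):
    """Draw stochastic sub-pixel samples until the confidence interval on the
    pixel mean is narrower than `tolerance`, or `max_samples` is reached.
    `shade(u, v)` returns the radiance for a sub-pixel position in [0, 1)^2."""
    z = Z_VALUES[confidence]
    total = total_sq = 0.0
    n = 0
    while n < max_samples:
        value = shade(random.random(), random.random())
        total += value
        total_sq += value * value
        n += 1
        if n >= min_samples:
            mean = total / n
            variance = max(total_sq / n - mean * mean, 0.0)
            half_width = z * math.sqrt(variance / n)
            if half_width <= tolerance:  # estimated error interval is small enough
                break
    return total / n

# Usage with a hypothetical, noisy shading function.
if __name__ == "__main__":
    noisy_shade = lambda u, v: 0.5 + 0.1 * (random.random() - 0.5)
    print(sample_pixel_adaptively(noisy_shade))
```

Smooth pixels terminate after the minimum sample count, while noisy pixels keep sampling up to the cap, which is the intended behaviour of such an adaptive scheme.
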
Item: A Median Cut Algorithm for Efficient Sampling of Radiosity Functions (Blackwell Science Ltd and the Eurographics Association, 1994)
Feda, Martin; Purgathofer, Werner
This paper presents an efficient method for sampling the illumination functions in higher-order radiosity algorithms. In such algorithms, the illumination function is not assumed to be constant across each patch, but is approximated by a function which is at least C1 continuous. Our median cut sampling algorithm is inspired by the observation that many form factors are computed at higher precision than is necessary. While a high sampling rate is necessary in regions of high illumination, dark areas can be sampled at a much lower rate to compute the received radiosity within a given precision. We adaptively subdivide the emitter into regions of approximately equal influence on the result. Form factors are evaluated by the disk approximation and a ray-tracing-based test for occlusion detection. The implementation of a higher-order radiosity system using B-splines as the radiosity function is described. The median cut algorithm can also be used for radiosity algorithms based on the constant radiosity assumption. (An illustrative sketch of the median-cut idea appears after the next two entries.)

Item: Verification of Physically Based Rendering Algorithms (The Eurographics Association, 2005)
Ulbricht, Christiane; Wilkie, Alexander; Purgathofer, Werner; Yiorgos Chrysanthou and Marcus Magnor
Within computer graphics, the field of predictive rendering is concerned with those methods of image synthesis which yield results that not only look real, but are also radiometrically correct renditions of nature, i.e. accurate predictions of what a real scene would look like under given lighting conditions. In order to guarantee the correctness of the results obtained by such techniques, three stages of such a rendering system have to be verified with particular care: the light reflection models, the light transport simulation, and the perceptually based calculations used at display time. In this report, we concentrate on the state of the art with respect to the second step in this chain. Various approaches for experimental verification of the implementation of a physically based rendering system have been proposed so far. However, the problem of proving that the results are correct is not fully solved yet, and no standardized methodology is available. We give an overview of the existing literature, discuss the strengths and weaknesses of the described methods and illustrate the unsolved problems. We also briefly discuss the related issue of image quality metrics.

Item: Image-based Representations for Accelerated Rendering of Complex Scenes (The Eurographics Association, 2005)
Jeschke, Stefan; Wimmer, Michael; Purgathofer, Werner; Yiorgos Chrysanthou and Marcus Magnor
This paper gives an overview of image-based representations commonly used to reduce the geometric complexity of a scene description in order to accelerate the rendering process. Several different types of representations and ways of using them have been presented; these are classified and discussed here. Furthermore, the overview includes techniques for accelerating the rendering of static scenes as well as scenes with animations and/or dynamic lighting effects. The advantages and drawbacks of the different approaches are discussed, and unsolved problems and directions for further research are outlined.
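Referring back to the median cut entry above: the core idea of subdividing an emitter into regions of approximately equal influence can be sketched generically as a recursive median cut over a gridded emission map. This is an illustrative sketch only; the grid representation, the function names and the fixed subdivision depth are assumptions, and the paper's form-factor-driven algorithm is not reproduced here.

```python
import numpy as np

def median_cut_regions(energy, depth):
    """Split a gridded emitter into roughly 2**depth regions of approximately
    equal total energy, median-cut style.  `energy` is a 2D array of per-cell
    emission; the result is a list of (row_slice, col_slice) regions."""
    regions = [(slice(0, energy.shape[0]), slice(0, energy.shape[1]))]
    for _ in range(depth):
        next_regions = []
        for rs, cs in regions:
            sub = energy[rs, cs]
            h, w = sub.shape
            if h * w <= 1:                      # cannot split a single cell further
                next_regions.append((rs, cs))
                continue
            # Split along the longer axis at the point where cumulative energy halves.
            axis = 0 if h >= w else 1
            profile = sub.sum(axis=1 - axis)
            cut = int(np.searchsorted(np.cumsum(profile), profile.sum() / 2.0)) + 1
            cut = min(max(cut, 1), profile.size - 1)   # keep both halves non-empty
            if axis == 0:
                next_regions.append((slice(rs.start, rs.start + cut), cs))
                next_regions.append((slice(rs.start + cut, rs.stop), cs))
            else:
                next_regions.append((rs, slice(cs.start, cs.start + cut)))
                next_regions.append((rs, slice(cs.start + cut, cs.stop)))
        regions = next_regions
    return regions

# Usage: a bright corner is split more finely than the dark remainder of the emitter.
if __name__ == "__main__":
    grid = np.ones((16, 16))
    grid[:4, :4] = 20.0
    for rs, cs in median_cut_regions(grid, depth=3):
        print(rs, cs, float(grid[rs, cs].sum()))
```

Each resulting region then receives one sample (e.g. at its centroid), so bright parts of the emitter are sampled densely and dark parts sparsely, in keeping with the observation quoted in the abstract.
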
Item: Deformation of Solids with Trivariate B-Splines (Eurographics Association, 1989)
Griessmair, Josef; Purgathofer, Werner
Solid geometric models can be deformed into free-form solids by the use of trivariate B-splines. This paper describes the problems of implementing such transformations for shaded rendering. The surfaces are subdivided into triangles adaptively so that the error in image space is limited. This adaptive triangulation ensures a smooth appearance of the resulting pictures. (An illustrative sketch of trivariate B-spline deformation appears at the end of this listing.)

Item: Occlusion Culling Methods (Eurographics Association, 2001)
Hey, Heinrich; Purgathofer, Werner

Item: Radiosity for Large Vegetation Scenes (Eurographics Association, 1999)
Mastal, Helmut; Tobler, Robert F.; Purgathofer, Werner
Calculating radiosity solutions for large scenes containing multiple plants is all but impossible using the radiosity method in its original form. With the introduction of sophisticated hierarchical and clustering algorithms, radiosity for vegetation scenes becomes a solvable challenge. The precomputation of the diffuse light distribution in leaf canopies of forests and other plants can be used not only to calculate realistic images, but also for agricultural planning purposes. This state-of-the-art report gives an overview of the methods that can be, and have been, used to calculate global illumination in vegetation scenes, including hierarchical methods, statistical methods based on simplifications, and specialized methods that have been optimized to handle scenes with a dense, non-isotropic distribution of objects such as canopies.

Item: Tone Reproduction and Physically Based Spectral Rendering (Eurographics Association, 2002)
Devlin, Kate; Chalmers, Alan; Wilkie, Alexander; Purgathofer, Werner
The ultimate aim of realistic graphics is the creation of images that provoke the same responses that a viewer would have to a real scene. This STAR addresses two related key problem areas in this effort, located at opposite ends of the rendering pipeline: the data structures used to describe light during the actual rendering process, and the issue of displaying such radiant intensities in a meaningful way. The interest in the first of these subproblems stems from the fact that it is common industry practice to use RGB colour values to describe light intensity and surface reflectance. While viable in the context of methods that do not strive for true realism, this approach has to be replaced by more physically accurate techniques if a prediction of nature is intended. The second subproblem is that while research into ways of rendering images provides us with better and faster methods, we do not necessarily see their full effect due to limitations of the display hardware. The low dynamic range of a standard computer monitor requires some form of mapping to produce images that are perceptually accurate. Tone reproduction operators attempt to replicate the effect of real-world luminance intensities. This STAR report will review the work to date on spectral rendering and tone reproduction techniques. It will include an investigation into the need for spectral image synthesis methods and accurate tone reproduction, and a discussion of major approaches to physically correct rendering and key tone mapping algorithms.
The future of both spectral rendering and tone reproduction techniques will be considered, together with the implications of advances in display hardware.
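To make the tone-mapping problem described in the previous entry concrete, here is a minimal sketch of a simple global operator in the spirit of the photographic operator of Reinhard et al. It is only one of many operators such a report surveys; the function name, the default parameters and the luminance-only treatment are assumptions of this sketch.

```python
import numpy as np

def reinhard_global(luminance, key=0.18, white=None, eps=1e-6):
    """Map HDR luminance to [0, 1] with a simple global photographic operator.
    `luminance` is a 2D array of scene luminances, `key` sets the overall
    brightness, and `white` is the smallest luminance mapped to pure white
    (defaults to the maximum scaled luminance)."""
    log_avg = np.exp(np.mean(np.log(luminance + eps)))   # log-average luminance
    scaled = key / log_avg * luminance                    # scale to the chosen key
    if white is None:
        white = scaled.max()
    # Compress: bright values roll off smoothly, `white` and above map to 1.
    return scaled * (1.0 + scaled / (white * white)) / (1.0 + scaled)

# Usage: a synthetic HDR gradient spanning five orders of magnitude.
if __name__ == "__main__":
    hdr = np.logspace(-2, 3, 256).reshape(16, 16)
    ldr = reinhard_global(hdr)
    print(ldr.min(), ldr.max())   # the result fits the displayable range [0, 1]
```

The point of the example is the mismatch the abstract describes: scene luminances span many orders of magnitude, while the display accepts only a narrow range, so some such mapping is always required before the image can be shown.
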
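Finally, returning to the "Deformation of Solids with Trivariate B-Splines" entry above (as noted there): the sketch below shows how a single point can be mapped through a trivariate B-spline volume, using the Cox-de Boor recursion and a clamped knot vector. The lattice layout, function names and parameters are assumptions made for illustration; the paper's adaptive triangulation for shaded rendering is not reproduced.

```python
import numpy as np

def bspline_basis(i, k, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,k}(u)."""
    if k == 0:
        if knots[i] == knots[i + 1]:
            return 0.0                         # empty span contributes nothing
        last_span = (knots[i + 1] == knots[-1])
        inside = knots[i] <= u < knots[i + 1] or (last_span and u == knots[i + 1])
        return 1.0 if inside else 0.0
    left = right = 0.0
    d1 = knots[i + k] - knots[i]
    d2 = knots[i + k + 1] - knots[i + 1]
    if d1 > 0.0:
        left = (u - knots[i]) / d1 * bspline_basis(i, k - 1, u, knots)
    if d2 > 0.0:
        right = (knots[i + k + 1] - u) / d2 * bspline_basis(i + 1, k - 1, u, knots)
    return left + right

def clamped_knots(n_ctrl, degree):
    """Open-uniform knot vector on [0, 1] for n_ctrl control points."""
    inner = np.linspace(0.0, 1.0, n_ctrl - degree + 1)
    return np.concatenate([[0.0] * degree, inner, [1.0] * degree])

def deform(point_uvw, lattice, degree=3):
    """Map a parameter-space point (u, v, w) in [0, 1]^3 through a trivariate
    B-spline volume.  `lattice` has shape (nu, nv, nw, 3): the control points."""
    nu, nv, nw, _ = lattice.shape
    ku, kv, kw = (clamped_knots(n, degree) for n in (nu, nv, nw))
    u, v, w = point_uvw
    bu = [bspline_basis(i, degree, u, ku) for i in range(nu)]
    bv = [bspline_basis(j, degree, v, kv) for j in range(nv)]
    bw = [bspline_basis(k, degree, w, kw) for k in range(nw)]
    result = np.zeros(3)
    for i in range(nu):
        for j in range(nv):
            for k in range(nw):
                result += bu[i] * bv[j] * bw[k] * lattice[i, j, k]
    return result

# Usage: an undistorted 4x4x4 lattice reproduces the input point (identity mapping);
# moving control points would drag the embedded geometry with them.
if __name__ == "__main__":
    axes = np.linspace(0.0, 1.0, 4)
    lattice = np.array([[[(x, y, z) for z in axes] for y in axes] for x in axes])
    print(deform((0.5, 0.25, 0.8), lattice))
```

In a renderer, every surface vertex would be pushed through such a mapping, and the adaptive triangulation mentioned in the abstract decides how finely the deformed surfaces must be tessellated so the screen-space error stays bounded.
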