Search Results
Now showing 1 - 10 of 36
Item: A High-Dimensional Data Quality Metric using Pareto Optimality (The Eurographics Association, 2017)
Post, Tobias; Wischgoll, Thomas; Hamann, Bernd; Hagen, Hans. Eds.: Anna Puig Puig and Tobias Isenberg
The representation of data quality within established high-dimensional data visualization techniques such as scatterplots and parallel coordinates is still an open problem. This work offers a scale-invariant measure based on Pareto optimality that indicates the quality of data points with respect to the Pareto front. In cases where datasets contain noise or parameters that cannot easily be expressed or evaluated mathematically, the presented measure provides a visual encoding of the environment of a Pareto front to enable enhanced visual inspection.

Item: Towards Closing the Gap of Medical Visualization Research and Clinical Daily Routine (The Eurographics Association, 2020)
Maack, Robin Georg Claus; Saur, Dorothee; Hagen, Hans; Scheuermann, Gerik; Gillmann, Christina. Eds.: Gillmann, Christina and Krone, Michael and Reina, Guido and Wischgoll, Thomas
Medical visualization papers have been published steadily over the last years, but many never make their way into clinical daily routine. In this manuscript we examine the gap between visualization research and clinical daily routine and suggest a mechanism that can help close this gap. We first identify the actors involved in developing new medical visualization approaches and their different views on this process. We then develop a software development process unifying all actors and their needs. In addition, we collect further barriers in the medical software development process.

Item: VafusQ: A Visual Analytics Application with Data Quality Features to Support the Urban Planning Process (The Eurographics Association, 2015)
Triana, John A.; Zeckzer, Dirk; Hernandez, Jose T.; Hagen, Hans. Eds.: A. Middel and K. Rink and G. H. Weber
Fast-changing urban systems pose huge challenges for planners and governments. One major challenge is to provide optimized facility systems fulfilling all basic citizen needs such as food, education, security, and health. To provide these, the deficit of the complete system needs to be analyzed and quantified. An additional, important problem is the quality of the underlying data influencing the analysis. Often, the data is incomplete, inaccurate, or unreliable. The goal of this paper is to support the analysis of the deficit of Bogotá's facility system while taking data quality issues into account. Our contributions are: the inclusion of data quality in the urban planning process, the design of a novel visualization technique to represent data quality, the implementation of an application to support the analysis of the facility system, and a case study with experts assessing the usability and usefulness of the application. In conclusion, the experts found the application useful for the analysis tasks and the inclusion of data quality features important and comprehensible.

Item: Subdivision Surfaces for Scattered-data Approximation (The Eurographics Association, 2001)
Bertram, Martin; Hagen, Hans. Eds.: David S. Ebert and Jean M. Favre and Ronald Peikert
We propose a modified Loop subdivision surface scheme for the approximation of scattered data in the plane. Starting with a triangulated set of scattered data with associated function values, our scheme applies linear, stationary subdivision rules resulting in a hierarchy of triangulations that converge rapidly to a smooth limit surface. The novelty of our scheme is that it applies subdivision only to the ordinates of control points, whereas the triangulated mesh in the plane is fixed. Our subdivision scheme defines locally supported, bivariate basis functions and provides multiple levels of approximation with triangles. We use our subdivision scheme for terrain modeling.

Item: In Situ and Post Processing Workflows for Asteroid Ablation Studies (The Eurographics Association, 2017)
Patchett, John M.; Nouanesengsy, Boonthanome; Gisler, Galen; Ahrens, James; Hagen, Hans. Eds.: Barbora Kozlikova and Tobias Schreck and Thomas Wischgoll
Simulation scientists need to make decisions about what and how much output to produce. They must balance how readily the saved output can be consumed against how much further analysis it allows. We study this balance as a tradeoff between the flexibility and the accessibility of saved data products. At one end of the spectrum is raw data coming directly from the simulation, which is highly flexible but inaccessible due to its size and format. At the other end is highly processed and comparatively small data, often in the form of imagery or single scalar values; such data is typically highly accessible, needing no special equipment or software, but lacks flexibility for analysis deeper than what is presented. We lay out a user-driven model that considers scientists' output needs with regard to flexibility and accessibility. This model allows us to analyze a real-world example of a large simulation lasting months of wall-clock time on thousands of processing cores. Though the original intent of the simulation ensemble was to study asteroid-generated tsunamis, the simulations are now being used beyond that scope to study asteroid ablation as the body moves through the atmosphere. With increasingly large supercomputers, designing workflows that support an intentional and well-understood balance of flexibility and accessibility is necessary. In this paper, we present a strategy, developed from a user-driven perspective, that supports collaboration among simulation developers, designers, users, and analysts to effectively support science by using both computer and human time wisely.

Item: A Survey of Topology-based Methods in Visualization (The Eurographics Association and John Wiley & Sons Ltd., 2016)
Heine, Christian; Leitte, Heike; Hlawitschka, Mario; Iuricich, Federico; Floriani, Leila De; Scheuermann, Gerik; Hagen, Hans; Garth, Christoph. Eds.: Ross Maciejewski and Timo Ropinski and Anna Vilanova
This paper presents the state of the art in topology-based visualization. It describes the process and results of an extensive annotation effort for generating a definition and terminology for the field. The terminology enabled a typology of topological models, which is used to organize research results and the state of the art. Our report discusses relations among topological models and, for each model, describes research results on computation, simplification, visualization, and application. The paper identifies themes common to subfields, current frontiers, and unexplored territory in this research area.

Item: Volume Deformations in Grid-Less Flow Simulations (The Eurographics Association and Blackwell Publishing Ltd., 2009)
Obermaier, Harald; Hering-Bertram, Martin; Kuhnert, Jörg; Hagen, Hans. Eds.: H.-C. Hege, I. Hotz, and T. Munzner
This paper presents a novel method for the extraction and visualization of volume deformations in grid-less, point-based flow simulations. Our primary goals are the segmentation of different paths through a mixing device and the visualization of ellipsoidal particle deformations. The main challenges are the numerically efficient processing of deformation tensors and the robust integration of stream- and streaklines at the boundaries of the dataset such that closed segments are obtained. Our results show two- and three-dimensional particle deformations as well as the segmentation of volumes in stationary fields and of areas in time-dependent datasets that take consistent paths through a mixing device.

Item: Extraction of Crack-free Isosurfaces from Adaptive Mesh Refinement Data (The Eurographics Association, 2001)
Weber, Gunther H.; Kreylos, Oliver; Ligocki, Terry J.; Shalf, John M.; Hagen, Hans; Hamann, Bernd; Joy, Kenneth I. Eds.: David S. Ebert and Jean M. Favre and Ronald Peikert
Adaptive mesh refinement (AMR) is a numerical simulation technique used in computational fluid dynamics (CFD). It permits the efficient simulation of phenomena characterized by substantially varying scales in the complexity of local behavior of certain variables. By using a set of nested grids at different resolutions, AMR combines the simplicity of structured rectilinear grids with the ability to adapt to local changes in complexity and spatial resolution. Hierarchical representations of scientific data pose challenges when isosurfaces are extracted: cracks can arise at the boundaries between regions represented at different resolutions. We present a method for the extraction of isosurfaces from AMR data that avoids cracks at the boundaries between levels of different resolution.

Item: Panning for Insight: Amplifying Insight through Tight Integration of Machine Learning, Data Mining, and Visualization (The Eurographics Association, 2018)
Karer, Benjamin; Scheler, Inga; Hagen, Hans. Eds.: Ian Nabney and Jaakko Peltonen and Daniel Archambault
With the rapid progress made in Data Mining, Visualization, and Machine Learning in recent years, combinations of these methods have gained increasing interest. This paper summarizes the ideas behind ongoing work on combining methods from these three domains into an insight-driven interactive data analysis workflow. Based on their interpretation of data visualizations, users generate metadata that is fed back into the analysis. The resulting resonance effect improves the performance of subsequent analyses. The paper outlines the ideas behind the workflow, indicates its benefits, and discusses how to avoid potential pitfalls.

Item: Towards High-dimensional Data Analysis in Air Quality Research (The Eurographics Association and Blackwell Publishing Ltd., 2013)
Engel, Daniel; Hummel, Mathias; Hoepel, Florian; Bein, Keith; Wexler, Anthony; Garth, Christoph; Hamann, Bernd; Hagen, Hans. Eds.: B. Preim, P. Rheingans, and H. Theisel
Analysis of chemical constituents from mass spectrometry of aerosols involves non-negative matrix factorization, an approximation of high-dimensional data in a lower-dimensional space. The associated optimization problem is non-convex, resulting in crude approximation errors that are not accessible to scientists. To address this shortcoming, we introduce a new methodology for user-guided, error-aware data factorization that entails an assessment of the amount of information contributed by each dimension of the approximation, an effective combination of visualization techniques to highlight, filter, and analyze error features, and a novel means to interactively refine factorizations. A case study and the domain-expert feedback provided by the collaborating atmospheric scientists illustrate that our method effectively communicates the errors of such numerical optimization results and facilitates the computation of high-quality data factorizations in a simple and intuitive manner.
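The first result above ranks data points by their relation to the Pareto front. As a minimal, hypothetical sketch of the underlying front extraction (plain non-dominance under a minimization convention; this is not the paper's scale-invariant quality measure):

```python
import numpy as np

def pareto_front(points):
    """Return a boolean mask marking points that are Pareto-optimal,
    i.e. not dominated by any other point (minimization in every dimension)."""
    pts = np.asarray(points, dtype=float)
    mask = np.ones(len(pts), dtype=bool)
    for i in range(len(pts)):
        # j dominates i if j is <= i in every dimension and < i in at least one.
        dominators = np.all(pts <= pts[i], axis=1) & np.any(pts < pts[i], axis=1)
        if dominators.any():
            mask[i] = False
    return mask

front = pareto_front([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
# front -> array([ True,  True, False, False])
```

A quality measure in the spirit of the abstract could then score each dominated point by its distance to the front; the paper's actual encoding is more involved.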
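The last result above builds on non-negative matrix factorization. A minimal sketch of plain NMF via Lee-Seung multiplicative updates, for orientation only (the paper's user-guided, error-aware refinement is not reproduced here; `k`, `iters`, and the random initialization are illustrative assumptions):

```python
import numpy as np

def nmf(V, k, iters=200, eps=1e-9, seed=0):
    """Factor a non-negative matrix V (m x n) into W (m x k) @ H (k x n)
    using Lee-Seung multiplicative updates, minimizing Frobenius error."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(iters):
        # Multiplicative updates keep every entry non-negative by construction.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.random.default_rng(1).random((20, 10))  # toy non-negative data
W, H = nmf(V, k=3)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because the objective is non-convex, different initializations yield different local minima and residual errors, which is exactly the approximation-error problem the paper makes visible to scientists.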