Eurographics Digital Library

This is the DSpace 7 platform of the Eurographics Digital Library.
  • The contents of the Eurographics Digital Library Archive are freely accessible. Only access to the full-text documents of the journal Computer Graphics Forum (joint property of Wiley and Eurographics) is restricted to Eurographics members, members of institutions holding an Institutional Membership with Eurographics, and users of the TIB Hannover. On the item pages you will find purchase links to the TIB Hannover.
  • As a Eurographics member, you can log in with your email address and password from https://services.eg.org. If you belong to an institutional member and are using a computer within an IP range registered with Eurographics, you can proceed immediately.
  • From 2022 onward, all new publications by Eurographics are licensed under Creative Commons. Publishing with Eurographics is Plan-S compliant. Please see the Eurographics Licensing and Open Access Policy for more details.
 

Recent Submissions

Item
Correction to 'Antarstick: Extracting Snow Height From Time-Lapse Photography'
(The Eurographics Association and John Wiley & Sons Ltd., 2025) Lang, M.; Mráz, R.; Trtík, M.; Stoppel, S.; Byška, J.; Kozlíková, B.; Wimmer, Michael; Alliez, Pierre; Westermann, Rüdiger
Correction note to the article "Antarstick: Extracting Snow Height From Time-Lapse Photography".
Item
Self-Supervised Image Harmonization via Region-Aware Harmony Classification
(The Eurographics Association and John Wiley & Sons Ltd., 2025) Tian, Chenyang; Wang, Xinbo; Zhang, Qing; Wimmer, Michael; Alliez, Pierre; Westermann, Rüdiger
Image harmonization is a widely used technique in image composition, which aims to adjust the appearance of the composited foreground object according to the style of the background image so that the resulting composited image is visually natural and appears to be photographed. Previous methods are mostly trained in a fully supervised manner; while they demonstrate promising results, they do not generalize well to complex unseen cases involving significant style and semantic differences between the composited foreground object and the background image. In this paper, we present a self-supervised image harmonization framework that enables superior performance on complex cases. To do so, we first synthesize a large amount of data with wide diversity for training. We then develop an attentive harmonization module to adaptively adjust the foreground appearance by querying relevant background features. To allow more effective image harmonization, we develop a region-aware harmony classifier to explicitly judge whether an image is harmonious or not. Experiments on several datasets show that our method performs favourably against previous methods. Our code will be made publicly available.
Item
GeoDEN: A Visual Exploration Tool for Analyzing the Geographic Spread of Dengue Serotypes
(The Eurographics Association and John Wiley & Sons Ltd., 2025) Marler, Aidan; Roell, Yannik; Knoblauch, Steffen; Messina Jane, P.; Jaenisch, Thomas; Karimzadeh, Mohammad; Wimmer, Michael; Alliez, Pierre; Westermann, Rüdiger
Static maps and animations remain popular in the spatial epidemiology of dengue, limiting the analytical depth and scope of visualizations. Over half of the global population lives in dengue-endemic regions. Understanding the spatiotemporal dynamics of the four closely related dengue serotypes, and their immunological interactions, remains a challenge at a global scale. To facilitate this understanding, we worked with dengue epidemiologists in a user-centred design framework to create GeoDEN, an exploratory visualization tool that empowers experts to investigate spatiotemporal patterns in dengue serotype reports. The tool has several linked visualizations and filtering mechanisms, enabling analysis at a range of spatial and temporal scales. To identify successes and failures, we present both insight-based and value-driven evaluations. Our domain experts found GeoDEN valuable, verifying existing hypotheses and uncovering novel insights that warrant further investigation by the epidemiology community. The developed visual exploration approach can be adapted for exploring other epidemiology and disease incidence datasets.
Item
Herds From Video: Learning a Microscopic Herd Model From Macroscopic Motion Data
(The Eurographics Association and John Wiley & Sons Ltd., 2025) Gong, Xianjin; Gain, James; Rohmer, Damien; Lyonnet, Sixtine; Pettré, Julien; Cani, Marie-Paule; Wimmer, Michael; Alliez, Pierre; Westermann, Rüdiger
We present a method for animating herds that automatically tunes a microscopic herd model based on a short video clip of real animals. Our method handles videos with dense herds, where individual animal motion cannot be separated out. Our contribution is a novel framework for extracting macroscopic herd behaviour from such video clips, and then deriving the microscopic agent parameters that best match this behaviour. To support this learning process, we extend standard agent models to provide a separation between leaders and followers, better match the occlusion and field-of-view limitations of real animals, support differentiable parameter optimization and improve authoring control. We validate the method by showing that once optimized, the social force and perception parameters of the resulting herd model are accurate enough to predict subsequent frames in the video, even for macroscopic properties not directly incorporated in the optimization process. Furthermore, the extracted herding characteristics can be applied to any terrain with a palette and region-painting approach that generalizes to different herd sizes and leader trajectories. This enables the authoring of herd animations in new environments while preserving learned behaviour.
Item
Real-Time and Controllable Reactive Motion Synthesis via Intention Guidance
(The Eurographics Association and John Wiley & Sons Ltd., 2025) Zhang, Xiaotang; Chang, Ziyi; Men, Qianhui; Shum, Hubert P. H.; Wimmer, Michael; Alliez, Pierre; Westermann, Rüdiger
We propose a real-time method for reactive motion synthesis based on the known trajectory of an input character, predicting instant reactions using only historical, user-controlled motions. Our method handles the uncertainty of future movements by introducing an intention predictor, which forecasts key joint intentions from the historical interaction to make pose prediction more deterministic. The intention is later encoded into the latent space of its reactive motion and matched with a codebook that represents mappings between input and output. The model samples from the categorical distribution for pose generation and strengthens its robustness through adversarial training. Unlike previous offline approaches, the system can recursively generate intentions and reactive motions using feedback from earlier steps, enabling real-time, long-term realistic interactive synthesis. Both quantitative and qualitative experiments show our approach outperforms other matching-based motion synthesis approaches, delivering superior stability and generalisability. In our method, the user can also actively influence the outcome by controlling the moving directions, creating a personalised interaction path that deviates from predefined trajectories.