Search Results (showing 7 of 7 items)

Item: Reactive Gaze during Locomotion in Natural Environments (The Eurographics Association and John Wiley & Sons Ltd., 2024)
Melgaré, Julia K.; Rohmer, Damien; Musse, Soraia R.; Cani, Marie-Paule; Skouras, Melina; Wang, He
Animating gaze behavior is crucial for creating believable virtual characters, providing insights into their perception of and interaction with the environment. In this paper, we present an efficient yet natural-looking gaze animation model applicable to real-time walking characters exploring natural environments. We address the challenge of dynamic gaze adaptation by combining findings from neuroscience with a data-driven saliency model. Specifically, our model determines gaze focus by considering the character's locomotion, environmental stimuli, and terrain conditions. Our model is compatible with both automatic navigation along pre-defined character trajectories and user-guided interactive locomotion, and can be configured according to the desired degree of visual exploration of the environment. Our perceptual evaluation shows that our solution significantly improves on state-of-the-art saliency-based gaze animation with respect to the character's apparent awareness of the environment, the naturalness of its motion, and the elements to which it pays attention.

Item: Unerosion: Simulating Terrain Evolution Back in Time (The Eurographics Association and John Wiley & Sons Ltd., 2024)
Yang, Zhanyu; Cordonnier, Guillaume; Cani, Marie-Paule; Perrenoud, Christian; Benes, Bedrich; Skouras, Melina; Wang, He
While the past of a terrain cannot be known precisely, since a given effect can result from many different causes, exploring these possible pasts opens the way to numerous applications ranging from movies and games to paleogeography. We introduce unerosion, an attempt to recover plausible past topographies from an input terrain represented as a height field. Our solution relies on novel algorithms for the backward simulation of different processes: fluvial erosion, sedimentation, and thermal erosion. This is achieved by re-formulating the equations of erosion and sedimentation so that they can be simulated back in time. These algorithms can be combined to account for a succession of climate changes backward in time, while the possible ambiguities provide editing options to the user. Results show that our solution can approximately reverse different types of erosion while enabling users to explore a variety of alternative pasts. Using a chronology of climatic periods to inform us about the main erosion phenomena, we also went back in time starting from real, measured terrain data, and checked the consistency of the results with geological findings, namely the height of river beds hundreds of thousands of years ago.
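
As a rough illustration of the backward-simulation idea described in the Unerosion entry above, the following toy sketch reverses a single thermal-erosion step on a 1D height field. This is an assumed simplification for illustration only, not the paper's formulation: the parameters (TALUS, RATE, DX), the function names, and the naive sign-flipped backward step are all hypothetical.

# Toy sketch (not the authors' formulation): reversing a simple thermal-erosion
# step on a 1D height field. Forward erosion moves material downslope wherever
# the slope exceeds a talus threshold; the naive "unerosion" step applies the
# same transfer with the opposite sign, restoring material upslope.
import numpy as np

TALUS = 0.5   # assumed critical slope (tangent)
RATE = 0.25   # assumed fraction of the excess moved per step
DX = 1.0      # cell size

def thermal_transfer(h):
    """Material moved from each cell i to its right neighbor i+1 (can be negative)."""
    slope = (h[:-1] - h[1:]) / DX                              # positive when cell i is higher
    excess = np.sign(slope) * np.maximum(np.abs(slope) - TALUS, 0.0) * DX
    return RATE * excess

def erode(h):
    t = thermal_transfer(h)
    h = h.copy()
    h[:-1] -= t
    h[1:] += t
    return h

def unerode(h):
    # Naive backward step: undo the transfer that a forward step would produce
    # from the current state. Only approximate, since the true forward step was
    # computed from the unknown past state; this ambiguity is what leaves room
    # for user-guided exploration of alternative pasts.
    t = thermal_transfer(h)
    h = h.copy()
    h[:-1] += t
    h[1:] -= t
    return h

if __name__ == "__main__":
    past = np.array([5.0, 4.0, 2.5, 2.0, 1.0])
    present = erode(past)
    recovered = unerode(present)
    print(present, recovered)   # recovered is close to, but not exactly, the past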

Item: Generating Upper-Body Motion for Real-Time Characters Making their Way through Dynamic Environments (The Eurographics Association and John Wiley & Sons Ltd., 2022)
Alvarado, Eduardo; Rohmer, Damien; Cani, Marie-Paule; Dominik L. Michels; Soeren Pirk
Real-time character animation in dynamic environments requires the generation of plausible upper-body movements regardless of the nature of the environment, including non-rigid obstacles such as vegetation. We propose a flexible model for upper-body interactions, based on the anticipation of the character's surroundings and on antagonistic controllers that adapt the amount of muscular stiffness and the response time to better deal with obstacles. Our solution relies on a hybrid method for character animation that couples a keyframe sequence with kinematic constraints and lightweight physics. The dynamic response of the character's upper limbs leverages antagonistic controllers, allowing us to tune tension and relaxation in the upper body without diverging from the reference keyframe motion. A new sight model, controlled by procedural rules, enables high-level authoring of the way the character generates interactions by adapting its stiffness and reaction time. As our results show, our real-time method offers precise and explicit control over the character's behavior and style while seamlessly adapting to new situations. Our model is therefore well suited for gaming applications.
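
A minimal sketch of the antagonistic-control idea mentioned in the entry above, assuming a single 1-DoF joint: two opposing spring-like gains pull the joint toward its lower and upper limits, their ratio sets the equilibrium angle, and their sum sets the effective stiffness, so both gains can be raised together to stiffen the limb without changing its rest pose. The gain values, joint limits, obstacle-anticipation signal, and function names are hypothetical, not taken from the paper.

# Toy sketch (hypothetical gains and names, not the paper's controller): an
# antagonistic controller for a single 1-DoF joint.
def antagonistic_torque(theta, theta_dot, k_low, k_high,
                        theta_low=-1.0, theta_high=1.0, damping=0.5):
    """Joint torque from two antagonistic gains plus a damping term."""
    return (k_low * (theta_low - theta)
            + k_high * (theta_high - theta)
            - damping * theta_dot)

def gains_for(target_angle, stiffness, theta_low=-1.0, theta_high=1.0):
    """Solve for (k_low, k_high) so the equilibrium sits at target_angle
    with the requested total stiffness (k_low + k_high = stiffness)."""
    # Equilibrium condition: k_low*(theta_low - t) + k_high*(theta_high - t) = 0
    k_high = stiffness * (target_angle - theta_low) / (theta_high - theta_low)
    return stiffness - k_high, k_high

if __name__ == "__main__":
    # Stiffen the joint when an obstacle is anticipated, relax otherwise.
    for anticipated_obstacle, stiffness in [(False, 2.0), (True, 10.0)]:
        k_low, k_high = gains_for(target_angle=0.3, stiffness=stiffness)
        tau = antagonistic_torque(theta=0.0, theta_dot=0.0,
                                  k_low=k_low, k_high=k_high)
        print(anticipated_obstacle, round(tau, 3))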

Item: Volcanic Skies: Coupling Explosive Eruptions with Atmospheric Simulation to Create Consistent Skyscapes (The Eurographics Association and John Wiley & Sons Ltd., 2024)
Pretorius, Pieter C.; Gain, James; Lastic, Maud; Cordonnier, Guillaume; Chen, Jiong; Rohmer, Damien; Cani, Marie-Paule; Bermano, Amit H.; Kalogerakis, Evangelos
Explosive volcanic eruptions rank among the most terrifying natural phenomena and are thus frequently depicted in films, games, and other media, usually via a bespoke, one-off solution. In this paper, we introduce the first general-purpose model for bi-directional interaction between the atmosphere and a volcano plume. In line with recent interactive volcano models, we approximate the plume dynamics with Lagrangian disks and spheres and the atmosphere with sparse layers of 2D Eulerian grids, enabling us to focus on the transfer of physical quantities such as temperature, ash, moisture, and wind velocity between these sub-models. We then produce volumetric animations by noise-based procedural upsampling keyed to aspects of advection, convection, moisture, and ash content, yielding a fully realized volcanic skyscape. Our model captures most of the visually salient features emerging from volcano-sky interaction, such as windswept plumes, enmeshed cap, bell, and skirt clouds, shockwave effects, ash rain, and sheathes of lightning visible in the dark.

Item: Generating Flight Summaries Conforming to Cinematographic Principles (The Eurographics Association and John Wiley & Sons Ltd., 2024)
Lino, Christophe; Cani, Marie-Paule; Skouras, Melina; Wang, He
We propose an automatic method for generating flight summaries of prescribed duration, given any planned 3D trajectory of a flying object. The challenge is to select relevant time ellipses while keeping, and adequately framing, the most interesting parts of the trajectory, and enforcing cinematographic rules between the selected shots. Our solution optimizes the visual quality of the output video in terms of both camera views and film editing choices, thanks to a new optimization technique designed to jointly optimize the selection of the interesting parts of the flight and the camera animation parameters over time. To the best of our knowledge, this is the first solution to address camera control, film editing, and trajectory summarization at once. Ablation studies demonstrate the visual quality of the flight summaries we generate compared to alternative methods.

Item: VRSurf: Surface Creation from Sparse, Unoriented 3D Strokes (The Eurographics Association and John Wiley & Sons Ltd., 2025)
Sureshkumar, Anandhu; Parakkat, Amal Dev; Bonneau, Georges-Pierre; Hahmann, Stefanie; Cani, Marie-Paule; Bousseau, Adrien; Day, Angela
Although intuitive, sketching a closed 3D shape directly in an immersive environment results in an unordered set of arbitrary strokes, which can be difficult to assemble into a closed surface. We tackle this challenge by introducing VRSurf, a surfacing method inspired by a balloon-inflation metaphor: seeded in the sparse scaffold formed by the strokes, a smooth, closed surface is inflated to progressively interpolate the input strokes, which are sampled into lists of points. These are treated in a divide-and-conquer manner, which allows additional balloon inflations, followed by fusion, to be triggered automatically when the current inflation stops due to a detected concavity. While the input strokes are intended to belong to the same smooth 3D shape, our method is robust to coarse VR input and does not require strokes to be aligned; we simply avoid intersecting strokes, which might give inconsistent surface positions due to the roughness of the VR drawing. Moreover, no additional topological information is required, and all the user needs to do is specify the initial seeding location for the first balloon. The results show that VRSurf can efficiently generate smooth surfaces that interpolate sparse sets of unoriented strokes. Validation includes a side-by-side comparison with other reconstruction methods on the same input VR sketch. We also check that our solution matches the user's intent by applying it to strokes sketched on an existing 3D shape and comparing the result to the original.

Item: ReConForM: Real-time Contact-aware Motion Retargeting for more Diverse Character Morphologies (The Eurographics Association and John Wiley & Sons Ltd., 2025)
Cheynel, Théo; Rossi, Thomas; Bellot-Gurlet, Baptiste; Rohmer, Damien; Cani, Marie-Paule; Bousseau, Adrien; Day, Angela
Preserving semantics, in particular in terms of contacts, is a key challenge when retargeting motion between characters of different morphologies. Our solution relies on a low-dimensional embedding of the character's mesh, based on rigged key vertices that are automatically transferred from the source to the target. Motion descriptors are extracted from the trajectories of these key vertices, providing an embedding that carries combined semantic information about both shape and pose. A novel, adaptive algorithm is then used to automatically select and weight the most relevant features over time, enabling us to efficiently optimize the target motion until it conforms to these constraints and thus preserves the semantics of the source motion. Our solution allows extensions to several novel use cases where morphology and mesh contacts were previously overlooked, such as multi-character retargeting and motion transfer on uneven terrain. As our results show, our method achieves real-time retargeting onto a wide variety of characters. Extensive experiments and comparisons with state-of-the-art methods using several relevant metrics demonstrate improved results, both in terms of motion smoothness and contact accuracy.
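
To make the adaptive feature-weighting idea in the ReConForM entry above concrete, here is a minimal sketch under one assumed choice of weighting: per-frame weights that grow as a key vertex approaches the ground, so contact-related descriptors dominate the objective near contacts. The array layout, the exponential weighting, and the function names are illustrative assumptions, not the published algorithm.

# Toy sketch (an assumed simplification, not the paper's method): weighting
# motion descriptors extracted from key-vertex trajectories so that features
# close to a contact dominate the retargeting objective at that frame.
import numpy as np

def contact_weights(heights, softness=0.05):
    """Per-frame, per-vertex weights that grow as a key vertex nears the ground.

    `heights` is an (n_frames, n_vertices) array of key-vertex heights above
    the ground plane (hypothetical input); weights decay exponentially with height.
    """
    return np.exp(-np.maximum(heights, 0.0) / softness)

def weighted_descriptor_loss(src_desc, tgt_desc, weights):
    """Weighted squared error between source and target descriptors, per frame."""
    return np.sum(weights * (src_desc - tgt_desc) ** 2, axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    heights = rng.uniform(0.0, 0.3, size=(4, 6))   # 4 frames, 6 key vertices
    w = contact_weights(heights)
    src = rng.normal(size=(4, 6))
    tgt = src + 0.1 * rng.normal(size=(4, 6))
    print(weighted_descriptor_loss(src, tgt, w))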