Title: Reactive Gaze during Locomotion in Natural Environments
Authors: Melgaré, Julia K.; Rohmer, Damien; Musse, Soraia R.; Cani, Marie-Paule
Editors: Skouras, Melina; Wang, He
Date: 2024-08-20
Year: 2024
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.15168
Handle: https://diglib.eg.org/handle/10.1111/cgf15168
Pages: 12

Abstract: Animating gaze behavior is crucial for creating believable virtual characters, as it provides insight into how they perceive and interact with their environment. In this paper, we present an efficient yet natural-looking gaze animation model applicable to real-time walking characters exploring natural environments. We address the challenge of dynamic gaze adaptation by combining findings from neuroscience with a data-driven saliency model. Specifically, our model determines gaze focus by considering the character's locomotion, environmental stimuli, and terrain conditions. The model is compatible with both automatic navigation along pre-defined character trajectories and user-guided interactive locomotion, and it can be configured according to the desired degree of visual exploration of the environment. Our perceptual evaluation shows that our solution significantly improves on state-of-the-art saliency-based gaze animation with respect to the character's apparent awareness of the environment, the naturalness of its motion, and the elements to which it pays attention.

CCS Concepts: Computing methodologies → Computer graphics; Animation; Procedural animation