Search Results
Now showing 1 - 4 of 4
Item A Survey on Realistic Virtual Human Animations: Definitions, Features and Evaluations (The Eurographics Association and John Wiley & Sons Ltd., 2024) Rekik, Rim; Wuhrer, Stefanie; Hoyet, Ludovic; Zibrek, Katja; Olivier, Anne-Hélène; Aristidou, Andreas; Macdonnell, Rachel
Generating realistic animated virtual humans is a problem that has been extensively studied, with many applications in different types of virtual environments. However, the creation process for such realistic animations is challenging, especially because of the number and variety of influencing factors that must be identified and evaluated. In this paper, we attempt to provide a clearer understanding of how the multiple factors studied in the literature impact the level of realism of animated virtual humans, by surveying studies that assess this realism. This includes a review of features that have been manipulated to increase the realism of virtual humans, as well as evaluation approaches that have been developed. As evaluating animated virtual humans in a way that agrees with human perception remains an active research problem, this survey also identifies important open problems and directions for future research.

Item FACTS: Facial Animation Creation using the Transfer of Styles (The Eurographics Association, 2024) Saunders, Jack R.; Namboodiri, Vinay P.; Hu, Ruizhen; Charalambous, Panayiotis
The ability to accurately capture and express emotions is a critical aspect of creating believable characters in video games and other forms of entertainment. Traditionally, this animation has been achieved through artistic effort or performance capture, both of which require substantial time and labor. More recently, audio-driven models have seen success; however, they often lack expressiveness in areas not correlated with the audio signal. In this paper, we present a novel approach to facial animation that takes existing animations and allows their style characteristics to be modified. The method maintains the lip-sync of the animations thanks to a novel viseme-preserving loss. We perform quantitative and qualitative experiments to demonstrate the effectiveness of our work.
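The abstract above does not spell out the form of the viseme-preserving loss, so the following is only a minimal sketch of one plausible formulation: penalize positional deviation of lip-region vertices between the stylized and source animations, leaving the rest of the face free to change style. The function name, array shapes, and lip_vertex_ids are illustrative assumptions, not taken from the paper.

```python
# Hypothetical viseme-preserving loss sketch (illustrative, not the FACTS formulation).
import numpy as np

def viseme_preserving_loss(stylized, source, lip_vertex_ids, weight=1.0):
    """Penalize deviation of lip-region vertices so lip-sync survives style transfer.

    stylized, source : per-frame vertex positions, shape (frames, vertices, 3)
    lip_vertex_ids   : indices of the vertices that shape visible visemes
    """
    lip_stylized = stylized[:, lip_vertex_ids, :]
    lip_source = source[:, lip_vertex_ids, :]
    # Mean squared positional error over the lip region only; vertices outside
    # this region are unconstrained and may adopt the target style.
    return weight * float(np.mean((lip_stylized - lip_source) ** 2))

# Stand-in data: 2 frames of a 5-vertex mesh, with small random "style" offsets.
source = np.random.rand(2, 5, 3)
stylized = source + 0.01 * np.random.randn(2, 5, 3)
print(viseme_preserving_loss(stylized, source, lip_vertex_ids=[0, 2, 4]))
```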
Item Virtual Instrument Performances (VIP): A Comprehensive Review (The Eurographics Association and John Wiley & Sons Ltd., 2024) Kyriakou, Theodoros; Alvarez de la Campa Crespo, Merce; Panayiotou, Andreas; Chrysanthou, Yiorgos; Charalambous, Panayiotis; Aristidou, Andreas; Macdonnell, Rachel
Driven by recent advancements in Extended Reality (XR), the hype around the Metaverse, and real-time computer graphics, the transformation of the performing arts, particularly the digitization and visualization of musical experiences, is an ever-evolving landscape. This transformation offers significant potential for promoting inclusivity, fostering creativity, and enabling live performances in diverse settings. However, despite this potential, the field of Virtual Instrument Performances (VIP) has remained relatively unexplored due to numerous challenges. These challenges arise from the complex and multi-modal nature of musical instrument performances: the need for high-precision motion capture under occlusion, including the intricate interactions of a musician's body and fingers with the instrument; the precise synchronization and seamless integration of multiple sensory modalities; the accommodation of variations in musicians' playing styles and facial expressions; and the handling of instrument-specific nuances. This comprehensive survey delves into the intersection of technology, innovation, and artistic expression in the domain of virtual instrument performances. It explores multi-modal musical performance databases and investigates a wide range of data acquisition methods, encompassing diverse motion capture techniques, facial expression recording, and various approaches for capturing audio and MIDI (Musical Instrument Digital Interface) data. The survey also covers Music Information Retrieval (MIR) tasks, with particular emphasis on Musical Performance Analysis (MPA), and offers an overview of work on Musical Instrument Performance Synthesis (MIPS), including recent advancements in generative models. The ultimate aim of this survey is to unveil technological limitations, initiate a dialogue about current challenges, and propose promising avenues for future research at the intersection of technology and the arts.

Item Skeleton-Aware Skin Weight Transfer for Helper Joint Rigs (The Eurographics Association, 2024) Cao, Ziyuan; Mukai, Tomohiko; Hu, Ruizhen; Charalambous, Panayiotis
We propose a method to transfer skin weights and helper joints from a reference model to other targets. Our approach uses two types of spatial proximity to find correspondences between target vertices and reference mesh regions. The method first generates a guide weight map that relates skin vertices to skeletal joints using a standard skinning technique. The correspondence between the reference and target skins is then established through vertex-to-bone projection and bone-to-skin ray-casting guided by these weights. This enables fully automated, smooth transfer of skin weights between human-like characters bound to helper joint rigs.
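As an illustration of the kind of correspondence search described above, here is a minimal, simplified sketch that is not the authors' algorithm: it replaces the paper's vertex-to-bone projection and bone-to-skin ray-casting with a nearest-neighbour match restricted to reference vertices whose dominant joint (from a guide weight map) matches that of the target vertex, then copies the matched weight row. All names and array shapes are assumptions for illustration.

```python
# Simplified skeleton-aware weight transfer sketch (illustrative only).
import numpy as np

def dominant_joint(weights):
    """Most influential joint per vertex for an (N, J) skin weight matrix."""
    return np.argmax(weights, axis=1)

def transfer_weights(ref_verts, ref_weights, tgt_verts, tgt_guide_weights):
    """Copy skin weights from a reference mesh to target vertices.

    ref_verts         : (N, 3) reference vertex positions
    ref_weights       : (N, J) reference weights, helper joints included
    tgt_verts         : (M, 3) target vertex positions
    tgt_guide_weights : (M, J) guide weights from a standard skinning pass
    """
    ref_dom = dominant_joint(ref_weights)
    tgt_dom = dominant_joint(tgt_guide_weights)
    out = np.zeros((len(tgt_verts), ref_weights.shape[1]))
    for i, (v, j) in enumerate(zip(tgt_verts, tgt_dom)):
        # Match only against the reference region driven by the same joint;
        # fall back to the whole mesh if that region happens to be empty.
        mask = ref_dom == j
        candidates = np.where(mask)[0] if mask.any() else np.arange(len(ref_verts))
        nearest = candidates[np.argmin(np.linalg.norm(ref_verts[candidates] - v, axis=1))]
        out[i] = ref_weights[nearest]
    return out
```

Restricting the search to the region driven by the same joint is what makes such a transfer "skeleton-aware": spatially close but skeletally unrelated parts (for example, two touching fingers) cannot exchange weights.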