Title: A framework to manage multimodal fusion of events for advanced interactions within Virtual Environments

Authors: Touraine, Damien; Bourdot, Patrick; Bellik, Yacine; Bolot, Laurence
Editors: S. Mueller and W. Stuerzlinger
Published: 2002
Made available in repository: 2014-01-27
ISBN: 1-58113-535-1
ISSN: 1727-530X
DOI: https://doi.org/10.2312/EGVE/EGVE02/159-168

Abstract: This paper describes the EVI3d framework, a distributed architecture developed to enhance interactions within Virtual Environments (VE). This framework manages many multi-sensorial devices, such as trackers, data gloves, and speech or gesture recognition systems, as well as haptic devices. The structure of this architecture allows a complete dispatching of device services and their clients across as many machines as required. With the dated events provided by its time synchronization system, it becomes possible to design a specific module to manage multimodal fusion processes. To this end, we describe how the EVI3d framework manages not only low-level events but also abstract modalities. Moreover, the data flow service of the EVI3d framework solves the problem of sharing the virtual scene between modality modules.
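The fusion principle mentioned in the abstract, matching time-stamped events that arrive from different modality services, can be illustrated with a minimal, purely hypothetical sketch. The types and names below (DatedEvent, fuseSpeechWithGesture, the microsecond window) are illustrative assumptions and are not the actual EVI3d API: a dated event carries a timestamp from the synchronized clock, and a fusion module pairs a spoken deictic command with the pointing gesture closest in time.

```cpp
#include <cstdint>
#include <optional>
#include <string>
#include <utility>
#include <vector>

// Hypothetical illustration only; not the EVI3d API.
// A low-level event carries a timestamp taken on a clock synchronized
// across machines, which is what makes cross-host temporal fusion possible.
struct DatedEvent {
    std::int64_t timestampUs;  // microseconds on the synchronized clock
    std::string  modality;     // e.g. "speech", "gesture"
    std::string  payload;      // e.g. recognized word or pointed object id
};

// Pair a spoken deictic command (e.g. "put that there") with the pointing
// gesture whose timestamp is closest, within a tolerance window.
std::optional<std::pair<DatedEvent, DatedEvent>>
fuseSpeechWithGesture(const DatedEvent& speech,
                      const std::vector<DatedEvent>& gestures,
                      std::int64_t windowUs)
{
    const DatedEvent* best = nullptr;
    std::int64_t bestDelta = windowUs + 1;
    for (const auto& g : gestures) {
        if (g.modality != "gesture") continue;
        std::int64_t delta = g.timestampUs - speech.timestampUs;
        if (delta < 0) delta = -delta;
        if (delta <= windowUs && delta < bestDelta) {
            bestDelta = delta;
            best = &g;
        }
    }
    if (!best) return std::nullopt;  // no gesture close enough in time
    return std::make_pair(speech, *best);
}
```

Such a sketch only covers the temporal pairing step; in the paper's terms, turning the paired low-level events into an abstract modality and sharing the resulting scene updates would be handled by the fusion module and the data flow service, respectively.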