A framework to manage multimodal fusion of events for advanced interactions within Virtual Environments

dc.contributor.author: TOURAINE, Damien
dc.contributor.author: BOURDOT, Patrick
dc.contributor.author: BELLIK, Yacine
dc.contributor.author: BOLOT, Laurence
dc.contributor.editor: S. Mueller and W. Stuerzlinger
dc.date.accessioned: 2014-01-27T10:15:27Z
dc.date.available: 2014-01-27T10:15:27Z
dc.date.issued: 2002
dc.description.abstract: This paper describes the EVI3d framework, a distributed architecture developed to enhance interactions within Virtual Environments (VE). The framework manages many multi-sensorial devices, such as trackers, data gloves, speech and gesture recognition systems, and haptic devices. Its structure allows device services and their clients to be fully distributed across as many machines as required. With the dated events provided by its time synchronization system, it becomes possible to design a dedicated module to manage multimodal fusion processes. To this end, we describe how the EVI3d framework manages not only low-level events but also abstract modalities. Moreover, the data flow service of the EVI3d framework solves the problem of sharing the virtual scene between modality modules.
dc.description.seriesinformation: Eurographics Workshop on Virtual Environments
dc.identifier.isbn: 1-58113-535-1
dc.identifier.issn: 1727-530X
dc.identifier.uri: https://doi.org/10.2312/EGVE/EGVE02/159-168
dc.publisher: The Eurographics Association
dc.title: A framework to manage multimodal fusion of events for advanced interactions within Virtual Environments
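
The abstract above mentions fusing time-synchronized ("dated") events from several modalities. The following is a minimal illustrative sketch of that general idea, not the paper's actual API: the DatedEvent structure, modality names, and temporal window are all assumptions made for the example.

```cpp
#include <chrono>
#include <iostream>
#include <optional>
#include <string>
#include <utility>
#include <vector>

// Hypothetical time-stamped ("dated") event, loosely following the abstract's
// description of events carrying synchronized dates.
struct DatedEvent {
    std::string modality;            // e.g. "speech", "tracker", "glove"
    std::string payload;             // recognized word, pointed position, etc.
    std::chrono::milliseconds date;  // synchronized timestamp
};

// Naive temporal fusion: pair a speech command with a tracker pointing event
// when their dates fall within a small temporal window.
std::optional<std::pair<DatedEvent, DatedEvent>>
fuse(const std::vector<DatedEvent>& events, std::chrono::milliseconds window) {
    for (const auto& speech : events) {
        if (speech.modality != "speech") continue;
        for (const auto& point : events) {
            if (point.modality != "tracker") continue;
            auto delta = speech.date > point.date ? speech.date - point.date
                                                  : point.date - speech.date;
            if (delta <= window) return std::make_pair(speech, point);
        }
    }
    return std::nullopt;
}

int main() {
    using std::chrono::milliseconds;
    std::vector<DatedEvent> events = {
        {"tracker", "point(1.2, 0.4, 3.1)", milliseconds(1000)},
        {"speech",  "put that there",       milliseconds(1080)},
    };
    if (auto fused = fuse(events, milliseconds(200))) {
        std::cout << "Fused \"" << fused->first.payload << "\" with "
                  << fused->second.payload << "\n";
    }
}
```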
Files
Original bundle
Name: 159-168.pdf
Size: 470.96 KB
Format: Adobe Portable Document Format