Search Results

Now showing 1 - 2 of 2
  • Item
    Augmented Reality Aided Maintenance for Industrial Applications
    (The Eurographics Association, 2011) Narducci, Fabio; Ricciardi, Stefano; Andrea F. Abate and Michele Nappi and Genny Tortora
    This paper presents an augmented reality system suited for maintenance training and support in industrial environments. We describe how augmented reality can help reduce the effort of performing sequences of maintenance tasks in complex systems. The proposed approach aims to facilitate item localization in technical equipment by means of context-based instructions, virtual labels, 2D and 3D graphics, and animated virtual tools. These virtual contents are transformed to be visualized, co-registered with the real world, through a see-through Head-Mounted Display (HMD). To enhance tracking reliability, we report some effective techniques for addressing drift errors in head tracking. The interaction paradigm is designed to reduce keyboard usage, making the system less intrusive during the user's activity. A representation equivalent to a deterministic finite automaton (DFA) is exploited to regulate maintenance assistance operations, providing a versatile means of defining both simple and complex procedures in an easy, verifiable and readable way. We also propose a collection of XML tags that enables the conversion of the DFA into XML files, ensuring high extensibility and ease of understanding. First experimental results, targeting the augmentation of industrial racks, show a measurable improvement in performing maintenance operations with regard to the time needed to complete the task.
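The DFA-based procedure model described in the abstract above lends itself to a small illustration. The Python sketch below shows one plausible way a maintenance procedure could be serialized as a DFA in XML and stepped through at run time; the tag names, states, and events are hypothetical and do not reproduce the paper's actual schema.

```python
# Hypothetical sketch: a maintenance procedure encoded as a DFA in XML and
# executed step by step. Tag and attribute names are illustrative only.
import xml.etree.ElementTree as ET

PROCEDURE_XML = """
<procedure name="replace-rack-fan">
  <state id="locate_fan" instruction="Follow the highlighted label to rack slot 4."/>
  <state id="remove_screws" instruction="Remove the four screws shown by the animated tool."/>
  <state id="swap_fan" instruction="Pull out the faulty fan and insert the spare."/>
  <state id="done" instruction="Procedure complete." final="true"/>
  <transition from="locate_fan" on="item_found" to="remove_screws"/>
  <transition from="remove_screws" on="screws_removed" to="swap_fan"/>
  <transition from="swap_fan" on="fan_replaced" to="done"/>
</procedure>
"""

class ProcedureDFA:
    """Deterministic finite automaton driving a sequence of maintenance steps."""

    def __init__(self, xml_text):
        root = ET.fromstring(xml_text)
        self.instructions = {s.get("id"): s.get("instruction") for s in root.iter("state")}
        self.finals = {s.get("id") for s in root.iter("state") if s.get("final") == "true"}
        self.transitions = {(t.get("from"), t.get("on")): t.get("to") for t in root.iter("transition")}
        self.current = root.find("state").get("id")  # first listed state is the start state

    def step(self, event):
        """Advance the procedure on a recognized event (e.g. a confirmed user action)."""
        key = (self.current, event)
        if key not in self.transitions:
            raise ValueError(f"event '{event}' not allowed in state '{self.current}'")
        self.current = self.transitions[key]
        return self.instructions[self.current]

    @property
    def finished(self):
        return self.current in self.finals

dfa = ProcedureDFA(PROCEDURE_XML)
print(dfa.instructions[dfa.current])        # instruction overlaid on the HMD
for event in ("item_found", "screws_removed", "fan_replaced"):
    print(dfa.step(event))
print("finished:", dfa.finished)
```

Keeping the procedure in a data file rather than in code is what would give the extensibility the abstract mentions: a new or modified procedure only needs a new XML description, not a rebuild of the assistance application.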
  • Item
    Mixed Reality and Gesture Based Interaction for Medical Imaging Applications
    (The Eurographics Association, 2010) Abate, Andrea F.; Nappi, Michele; Ricciardi, Stefano; Tortora, Genoveffa; Enrico Puppo and Andrea Brogni and Leila De Floriani
    This paper presents a framework providing a collection of techniques to enhance the reliability, accuracy and overall effectiveness of gesture-based interaction applied to the manipulation of virtual objects within a Mixed Reality context. We propose an approach characterized by a floating interface, operated by two-hand gestures, for enhanced manipulation of 3D objects. Our interaction paradigm exploits one-hand, two-hand and time-dependent gesture patterns to allow the user to perform inherently 3D tasks, such as arbitrary object rotation or the measurement of relevant features, in a more intuitive yet accurate way. Real-time adaptation to the user's needs is performed by monitoring hand and finger motions, allowing both direct manipulation of virtual objects and conventional keyboard-like operations. The interface layout, whose details depend on the particular application at hand, is visualized via a stereoscopic see-through Head-Mounted Display (HMD). It projects virtual interface elements, as well as application-related virtual objects, into the central region of the user's field of view, floating in a close-at-hand volume. The application presented here targets the interactive 3D visualization of human anatomy resulting from diagnostic imaging or from virtual models aimed at training activities. The testing conducted so far shows a measurable and user-perceptible improvement in performing interactive 3D tasks, such as selecting a particular spot on a complex 3D surface or measuring the distance between two 3D landmarks. This study includes both qualitative and quantitative reports on system usability.
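As a rough illustration of two of the interaction tasks mentioned in the abstract above (two-hand object rotation and landmark-to-landmark measurement), the Python sketch below derives an axis-angle rotation from the change in the vector between two tracked hand positions and computes the Euclidean distance between two picked 3D points. The tracking data format and coordinate conventions are assumptions; the paper's own gesture recognizer is not reproduced here.

```python
# Hypothetical sketch: interpreting two-hand motion as an object rotation and
# measuring the distance between two selected 3D landmarks.
import math

def vec_sub(a, b):
    return tuple(ai - bi for ai, bi in zip(a, b))

def norm(v):
    return math.sqrt(sum(c * c for c in v))

def normalize(v):
    n = norm(v)
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def two_hand_rotation(left_prev, right_prev, left_now, right_now):
    """Axis-angle rotation implied by the change of the left-to-right hand vector."""
    before = normalize(vec_sub(right_prev, left_prev))
    after = normalize(vec_sub(right_now, left_now))
    dot = max(-1.0, min(1.0, sum(b * a for b, a in zip(before, after))))
    angle = math.acos(dot)                      # rotation angle in radians
    axis = cross(before, after)                 # rotation axis (zero if the hands did not rotate)
    return (normalize(axis) if norm(axis) > 1e-9 else (0.0, 0.0, 1.0)), angle

def landmark_distance(p, q):
    """Euclidean distance between two picked 3D landmarks (same unit as the tracker)."""
    return norm(vec_sub(q, p))

# Example: the hands rotate 90 degrees around the vertical axis; landmarks 5 cm apart.
axis, angle = two_hand_rotation((0, 0, 0), (1, 0, 0), (0, 0, 0), (0, 0, -1))
print(axis, math.degrees(angle))                # -> (0.0, 1.0, 0.0) 90.0
print(landmark_distance((0.0, 0.0, 0.0), (0.05, 0.0, 0.0)))   # -> 0.05
```

In an application like the one described, the resulting axis-angle pair would be applied incrementally to the grasped virtual object each frame, while the measured distance would be rendered as a floating label next to the two selected landmarks.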