VE: Eurographics Workshop on Virtual Environments - Short Papers
Browsing VE: Eurographics Workshop on Virtual Environments - Short Papers by Subject "Categories and Subject Descriptors (according to ACM CCS): H.5.1 [Information Interfaces and Presentation]: Artificial, augmented, and virtual realities"
Item: A Lightweight ID-Based Extension for Marker Tracking Systems (The Eurographics Association, 2007) Flohr, Daniel; Fischer, Jan; Bernd Froehlich, Roland Blach, and Robert van Liere (eds.)
Estimating the position and orientation of the digital video camera is a central challenge in video see-through augmented reality. Many augmented reality applications solve this problem with marker-based methods, which analyze artificial fiducials in images of the real environment, e.g., using the widespread ARToolKit library. Among the drawbacks of conventional marker tracking is the need to manually define marker patterns; badly chosen patterns degrade tracking performance. Although improved methods for automatic marker generation have been described, manually controlled marker tracking is still widely used in many applications for practical reasons. In this paper, we describe a lightweight drop-in extension for ID-based marker tracking. Our system makes it possible to automatically generate a large number of tracking fiducials identified by unique numerical IDs. The created marker patterns consist of large monochrome patches, which improves the recognition rate and tracking performance compared to typical manually defined fiducials. Due to the design of our extension, only minimal adaptations are required to add ID-based tracking to existing augmented reality software. We discuss experimental results demonstrating the improved pattern recognition and describe an example application.

Item: Visible Portion Estimation of Moving Target Objects for Networked Wearable Augmented Reality (The Eurographics Association, 2008) Makita, Koji; Kanbara, M.; Yokoya, N.; Robert van Liere and Betty Mohler (eds.)
This paper describes a new method for estimating the visible portions of moving target objects in a networked wearable augmented reality (AR) system. In annotation overlay applications using AR systems, it is important to improve the readability and intelligibility of annotations in a user's view. View management makes it possible to generate annotation overlay images that users can understand intuitively; for instance, overlaps between annotations and other objects can be prevented with a view management technique. View management requires the visible portions of 3D target objects in the 2D view plane. This paper proposes a visible portion estimation method for moving target objects based on their positions and shapes. The wearable AR system obtains the positions of target objects over a wireless network to estimate their visible portions in the user's view. Annotations can then be overlaid using view management techniques together with the proposed visible portion estimation.
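The ID-based fiducials in the first item above pair each marker with a unique numerical ID. The papers' actual encodings are not given here; a minimal sketch of the general idea, assuming a hypothetical scheme where an integer ID is written into an n×n grid of monochrome cells with a single parity cell, might look like:

```python
# Hypothetical sketch of ID-based marker generation (NOT the authors' actual
# encoding): an integer ID fills an n x n binary grid of large monochrome
# patches; the last cell stores an even-parity bit to catch one flipped cell.

def encode_id(marker_id, n=4):
    """Encode marker_id into an n*n grid of 0/1 cells; last cell is parity."""
    bits = [(marker_id >> i) & 1 for i in range(n * n - 1)]
    bits.append(sum(bits) % 2)  # even-parity cell
    return [bits[r * n:(r + 1) * n] for r in range(n)]

def decode_id(grid):
    """Recover the numeric ID from a grid, or None if parity fails."""
    bits = [b for row in grid for b in row]
    if sum(bits[:-1]) % 2 != bits[-1]:
        return None  # a cell was misread
    return sum(b << i for i, b in enumerate(bits[:-1]))
```

A 4×4 grid with one parity cell leaves 15 data cells, i.e., 32768 distinct IDs, which is consistent with generating "a large number" of fiducials automatically rather than drawing each by hand.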
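The second item's visible portion estimation can be illustrated with a simplified sketch, under assumptions not stated in the abstract: targets and occluders are reduced to axis-aligned 2D bounding boxes of their projected 3D points (pinhole camera, camera-space coordinates), and occluder areas are summed without handling mutual overlap. The function names and camera parameters are illustrative, not from the paper.

```python
# Hypothetical sketch: estimate the visible portion of a target's projected
# bounding box in the 2D view plane. Assumes a pinhole camera with focal
# length f and principal point (cx, cy); all values are illustrative.

def project(point, f=500.0, cx=320.0, cy=240.0):
    """Project a 3D camera-space point (x, y, z), z > 0, to the image plane."""
    x, y, z = point
    return (f * x / z + cx, f * y / z + cy)

def bbox_2d(points_3d):
    """Axis-aligned 2D bounding box of a set of projected 3D points."""
    pts = [project(p) for p in points_3d]
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return (min(xs), min(ys), max(xs), max(ys))

def visible_portion(target_box, occluder_boxes, width=640, height=480):
    """Fraction of the target's 2D bbox that lies inside the view plane and
    is not covered by an occluder bbox (occluded areas simply summed)."""
    x0, y0, x1, y1 = target_box
    total = max(0.0, x1 - x0) * max(0.0, y1 - y0)
    if total == 0.0:
        return 0.0
    # clip the target box to the view plane
    cx0, cy0 = max(x0, 0.0), max(y0, 0.0)
    cx1, cy1 = min(x1, float(width)), min(y1, float(height))
    visible = max(0.0, cx1 - cx0) * max(0.0, cy1 - cy0)
    # subtract each occluder's intersection with the clipped target box
    for ox0, oy0, ox1, oy1 in occluder_boxes:
        ix0, iy0 = max(cx0, ox0), max(cy0, oy0)
        ix1, iy1 = min(cx1, ox1), min(cy1, oy1)
        visible -= max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
    return max(0.0, visible) / total
```

In the networked setting the abstract describes, the target positions fed into such an estimate would arrive over the wireless network, and the resulting visible fractions would drive the view management step that places annotations.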