Study of Augmented Reality Methods for Real Time Recognition and Tracking of Untextured 3D Models in Monocular Images
Alvarez Ponga, Hugo
The main challenge of an augmented reality system is to obtain perfect alignment between real and virtual objects in order to create the illusion that both worlds coexist. To that end, the position and orientation of the observer has to be determined in order to configure a virtual camera that displays the virtual objects in their corresponding position. This problem is known as tracking, and although there are many alternatives to address it using different sensors, tracking based on optical sensors is the most popular solution. However, optical tracking is not a solved problem.

This thesis presents a study of existing optical tracking methods and provides improvements for some of them, particularly for those that run in real time. More precisely, monocular optical marker tracking and model-based monocular optical markerless tracking are discussed in detail. The proposed improvements are focused on industrial environments, which pose a difficult challenge due to the lack of texture in these scenes.

Monocular optical marker tracking methods do not support occlusions, so this thesis proposes two alternatives: (1) a new tracking method based on temporal coherence, and (2) a new marker design. Both solutions are robust against occlusions and do not require further adaptation of the environment.

Similarly, the performance of model-based monocular optical markerless tracking methods is jeopardized in untextured scenes, so this thesis proposes a 3D object recognition method that uses geometric properties instead of texture to initialize the tracking, as well as a markerless tracking method that uses multiple visual cues to update the tracking.

Additionally, the details of the augmented reality system that has been developed to assist in disassembly operations are given throughout the thesis. This system serves as a tool to validate the proposed methods and also demonstrates their real-world applicability.
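The alignment problem described above can be illustrated with a minimal sketch of the standard pinhole camera model: once tracking has estimated the observer's pose (a rotation R and translation t), a virtual 3D point is projected through the camera intrinsics K so that it lands on the correct pixel. The values of K, R, t, and the point X below are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

# Camera intrinsics K (focal lengths fx, fy and principal point cx, cy).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Estimated pose of the observer: rotation R and translation t.
R = np.eye(3)                      # identity: camera axes aligned with world
t = np.array([0.0, 0.0, 5.0])      # world origin lies 5 units ahead of camera

# A virtual 3D point, expressed in world coordinates.
X = np.array([0.1, -0.2, 0.0])

# Project into the image: homogeneous coordinates, then perspective division.
x_h = K @ (R @ X + t)
u, v = x_h[:2] / x_h[2]

print(round(u, 1), round(v, 1))    # pixel position of the virtual point
```

If the estimated pose is accurate, drawing the virtual object at (u, v) makes it appear attached to the real scene; any pose error shows up directly as misalignment between real and virtual content, which is why robust tracking is the central problem of the thesis.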