Real-time Vision-based Lateral Drift Correction

Author: Hübner, Thomas
Author: Pajarola, Renato
Editor: P. Alliez and M. Magnor
Date accessioned: 2015-07-09T11:07:29Z
Date available: 2015-07-09T11:07:29Z
Date issued: 2009
Abstract: A major drawback in many robotics projects is the dependence on a specific environment and the otherwise uncertain behavior of the hardware. Simple navigation tasks like driving in a straight line can lead to a strong lateral drift over time in an unknown environment. In this paper we propose a fast and simple solution to the lateral drift problem for vision-guided robots based on real-time scene analysis. Without an environment-specific calibration of the robot's drive system, we balance the differential drive speed on the fly. To this end, a feature detector is applied to consecutive images. The detected feature points determine the focus of expansion (FOE), which is used for locating and correcting the robot's lateral drift. Results are presented for an unmodified real-world indoor environment that demonstrate that our method corrects most lateral drift, based solely on real-time vision processing.
Section headers: Imaging, Perception, Display
Series information: Eurographics 2009 - Short Papers
DOI: 10.2312/egs.20091037
Pages: 13-16
URI: https://doi.org/10.2312/egs.20091037
Publisher: The Eurographics Association
Title: Real-time Vision-based Lateral Drift Correction
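The pipeline the abstract outlines, estimating the focus of expansion from feature-point flow and steering against its lateral offset, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the least-squares FOE fit (intersecting the flow lines) and the proportional `gain` parameter are assumptions added for illustration.

```python
import numpy as np

def estimate_foe(points, flows):
    """Estimate the focus of expansion as the least-squares
    intersection of the feature-flow lines.

    points: (N, 2) image positions of tracked features
    flows:  (N, 2) corresponding (nonzero) flow vectors
    """
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(points, flows):
        d = d / np.linalg.norm(d)          # unit flow direction
        P = np.eye(2) - np.outer(d, d)     # projector orthogonal to the flow line
        A += P                             # accumulate normal equations
        b += P @ p
    return np.linalg.solve(A, b)           # point closest to all flow lines

def drive_correction(foe_x, image_cx, gain=0.002):
    """Differential-drive speed offset proportional to the FOE's
    horizontal displacement from the image center (gain is a
    hypothetical tuning parameter)."""
    return gain * (foe_x - image_cx)
```

Under pure forward motion, all flow vectors radiate from the FOE, so the intersection of the flow lines recovers it; a lateral drift shifts the FOE horizontally, and the sign of the offset tells the controller which wheel to speed up.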
Files
- 013-016.pdf (773.71 KB, Adobe Portable Document Format)
- rtvbldc.mov (2.22 MB, QuickTime video)