Tracking Hands in Action for Gesture-based Computer Input

dc.contributor.author: Srinath, Sridhar
dc.date.accessioned: 2017-03-31T14:46:32Z
dc.date.available: 2017-03-31T14:46:32Z
dc.date.issued: 2016-12-16
dc.description.abstract: This thesis introduces new methods for markerless tracking of the full articulated motion of hands and for informing the design of gesture-based computer input. Emerging devices such as smartwatches or virtual/augmented reality glasses need new input methods for interaction on the move. The highly dexterous human hands could provide an always-on input capability without the need to carry a physical device. First, we present novel methods that address the hard computer vision-based hand tracking problem under varying numbers of cameras, viewpoints, and run-time requirements. Second, we contribute to the design of gesture-based interaction techniques by presenting heuristic and computational approaches. The contributions of this thesis allow users to interact effectively with computers through markerless tracking of hands and objects in desktop, mobile, and egocentric scenarios.
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/2631226
dc.language.iso: en
dc.title: Tracking Hands in Action for Gesture-based Computer Input
dc.type: Thesis
Files (Original bundle):
- Sridhar.pdf (45.52 MB, Adobe Portable Document Format)