Tracking Hands in Action for Gesture-based Computer Input

Eurographics DL Repository

dc.contributor.author Sridhar, Srinath
dc.date.accessioned 2017-03-31T14:46:32Z
dc.date.available 2017-03-31T14:46:32Z
dc.date.issued 2016-12-16
dc.identifier.uri https://diglib.eg.org:443/handle/10.2312/2631226
dc.description.abstract This thesis introduces new methods for markerless tracking of the full articulated motion of hands and for informing the design of gesture-based computer input. Emerging devices such as smartwatches and virtual/augmented reality glasses call for new input methods for interaction on the move. The highly dexterous human hands could provide an always-on input capability without the need to carry a physical device. First, we present novel methods that address the hard problem of computer vision-based hand tracking under varying numbers of cameras, viewpoints, and run-time requirements. Second, we contribute to the design of gesture-based interaction techniques through both heuristic and computational approaches. Together, the contributions of this thesis allow users to interact effectively with computers through markerless tracking of hands and objects in desktop, mobile, and egocentric scenarios. en_US
dc.language.iso en en_US
dc.title Tracking Hands in Action for Gesture-based Computer Input en_US
dc.type Thesis en_US


Files in this item

Item/paper (currently) not available via TIB Hannover.
