Interval-Based Motion Blending for Hand Grasping

dc.contributor.author: Brisbin, Matt
dc.contributor.author: Benes, Bedrich
dc.contributor.editor: Ik Soo Lim and David Duce
dc.date.accessioned: 2014-01-31T19:58:15Z
dc.date.available: 2014-01-31T19:58:15Z
dc.date.issued: 2007
dc.description.abstract: For motion to appear realistic and believable, proper motion blending methods must be used with respect to the goal or task at hand. We present a method that extends the theory of move trees [MBC01] by tagging (attaching) information to each clip within a database at intervals and finding the shortest distance per tag while pruning the tree using convergence priority. Our goal is to retain the physical characteristics of motion capture data while using non-destructive blending in a goal-based scenario. Given the intrinsically high dimensionality of the human hand, our method is also concerned with intelligent pruning of the move tree. By constructing a move tree for hand-grasping scenarios that is sampled per interval within clips and adheres to a convergence priority, we plan to develop a method that will autonomously conform a hand to the object being grasped.
dc.description.seriesinformation: Theory and Practice of Computer Graphics
dc.identifier.isbn: 978-3-905673-63-0
dc.identifier.uri: https://doi.org/10.2312/LocalChapterEvents/TPCG/TPCG07/201-205
dc.publisher: The Eurographics Association
dc.subject: Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Motion Blending
dc.title: Interval-Based Motion Blending for Hand Grasping
Files
201-205.pdf (273.41 KB, Adobe Portable Document Format)
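The selection scheme the abstract describes (interval tags per clip, shortest distance per tag, pruning by convergence priority) can be sketched roughly as below. This is an illustrative reconstruction, not the paper's implementation: the function names, the Euclidean pose distance, the fixed sampling interval, and the priority cutoff are all assumptions made for the sketch.

```python
import math

def tag_clip(clip, interval):
    """Sample (frame_index, pose) tags at fixed intervals within a clip.
    Assumed stand-in for the paper's per-interval tagging."""
    return [(i, clip[i]) for i in range(0, len(clip), interval)]

def distance(a, b):
    """Euclidean distance between two pose vectors (assumed metric)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_clip(clips, target, interval=2, priority_cutoff=0.5):
    """Pick the clip whose nearest interval tag lies closest to `target`,
    skipping (pruning) clips whose convergence priority is too low."""
    best, best_d = None, float("inf")
    for name, (clip, priority) in clips.items():
        if priority < priority_cutoff:  # prune low-priority branches
            continue
        for _, pose in tag_clip(clip, interval):
            d = distance(pose, target)
            if d < best_d:
                best, best_d = name, d
    return best, best_d

# Tiny example with one-dimensional "poses"; data is illustrative only.
clips = {
    "open":  ([[0.0], [0.2], [0.4], [0.6]], 0.9),
    "pinch": ([[1.0], [0.8], [0.6], [0.4]], 0.3),  # pruned by priority
}
name, d = best_clip(clips, target=[0.55])
print(name, round(d, 2))  # → open 0.15
```

A real hand model would use high-dimensional joint-angle vectors rather than scalars, and convergence priority would be computed from how quickly a branch approaches the grasp target rather than stored as a constant.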