Show simple item record

dc.contributor.authorAristidou, Andreasen_US
dc.contributor.authorZeng, Qiongen_US
dc.contributor.authorStavrakis, Efstathiosen_US
dc.contributor.authorYin, KangKangen_US
dc.contributor.authorCohen-Or, Danielen_US
dc.contributor.authorChrysanthou, Yiorgosen_US
dc.contributor.authorChen, Baoquanen_US
dc.contributor.editorBernhard Thomaszewski and KangKang Yin and Rahul Narainen_US
dc.date.accessioned2017-12-31T10:44:59Z
dc.date.available2017-12-31T10:44:59Z
dc.date.issued2017
dc.identifier.isbn978-1-4503-5091-4
dc.identifier.issn1727-5288
dc.identifier.urihttp://dx.doi.org/10.1145/3099564.3099566
dc.identifier.urihttps://diglib.eg.org:443/handle/10.1145/3099564-3099566
dc.description.abstractMotion capture technology has enabled the acquisition of high quality human motions for animating digital characters with extremely high fidelity. However, despite all the advances in motion editing and synthesis, it remains an open problem to modify pre-captured motions that are highly expressive, such as contemporary dances, for stylization and emotionalization. In this work, we present a novel approach for stylizing such motions by using emotion coordinates defined by Russell's Circumplex Model (RCM). We extract and analyze a large set of body and motion features, based on the Laban Movement Analysis (LMA), and choose the effective and consistent features for characterizing emotions of motions. These features provide a mechanism not only for deriving the emotion coordinates of a newly input motion, but also for stylizing the motion to express a different emotion without having to reference the training data. Such decoupling of the training data and new input motions eliminates the necessity of manual processing and motion registration. We implement the two-way mapping between the motion features and emotion coordinates through Radial Basis Function (RBF) regression and interpolation, which can stylize freestyle highly dynamic dance movements at interactive rates. Our results and user studies demonstrate the effectiveness of the stylization framework with a variety of dance movements exhibiting a diverse set of emotions.en_US
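The paper's pipeline is not reproduced here, but the mapping technique the abstract names (RBF interpolation between motion features and emotion coordinates) can be sketched minimally. The feature and emotion values below are hypothetical toy data, not from the paper; `epsilon` is an assumed kernel width:

```python
import numpy as np

def rbf_fit(centers, values, epsilon=1.0):
    """Solve for Gaussian-RBF weights that interpolate centers -> values."""
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    phi = np.exp(-(epsilon * d) ** 2)          # kernel matrix
    return np.linalg.solve(phi, values)        # one weight row per center

def rbf_eval(x, centers, weights, epsilon=1.0):
    """Evaluate the fitted RBF mapping at a query point x."""
    d = np.linalg.norm(x[None, :] - centers, axis=-1)
    return np.exp(-(epsilon * d) ** 2) @ weights

# toy 2-D "motion features" mapped to 2-D "emotion coordinates" (valence, arousal)
features = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
emotions = np.array([[0.2, 0.1], [0.9, 0.3], [0.1, 0.8], [0.7, 0.9]])

w = rbf_fit(features, emotions)
print(rbf_eval(np.array([0.5, 0.5]), features, w))  # interpolated emotion coords
```

By construction the interpolant reproduces the training pairs exactly and blends smoothly between them; the inverse (emotion-to-feature) direction can be fitted the same way with the roles of `features` and `emotions` swapped.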
dc.publisherACMen_US
dc.subjectComputing methodologies → Motion processing
dc.subjectMotion capture
dc.subjectInformation systems → Sentiment analysis
dc.subjectComputer Graphics
dc.subjectcharacter animation
dc.subjectdata-driven motion style transfer
dc.subjectmotion editing
dc.subjectmotion synthesis
dc.titleEmotion Control of Unstructured Dance Movementsen_US
dc.description.seriesinformationEurographics/ACM SIGGRAPH Symposium on Computer Animation
dc.description.sectionheadersPapers III: Kinematic Characters
dc.identifier.doi10.1145/3099564.3099566

