On the Beat: Analysing and Evaluating Synchronicity in Dance Performances

Authors: Menzel, Malte; Tauscher, Jan-Philipp; Magnor, Marcus
Editors: Guthe, Michael; Grosch, Thorsten
Issued: 2023-09-25 (2023)
ISBN: 978-3-03868-232-5
DOI: https://doi.org/10.2312/vmv.20231230
URI: https://diglib.eg.org:443/handle/10.2312/vmv20231230
Pages: 89-96 (8 pages)
License: Attribution 4.0 International License (CC BY 4.0)
CCS Concepts: Human-centered computing → Information visualization; Applied computing → Performing arts

Abstract: This paper presents a method to automatically analyse and evaluate synchronicity in dance performances. Synchronisation of a dancer's movement with the accompanying music is a vital characteristic of dance performances. We propose a method that fuses computer vision-based extraction of dancers' body pose information with audio beat tracking to examine the alignment of dance motions with the background music. Specifically, the motion of the dancer is analysed for rhythmic dance movements, which are subsequently correlated with the musical beats of the soundtrack played during the performance. Because it requires only a single mobile-phone video recording of a dance performance, our system is easily usable in dance rehearsal contexts. Our method evaluates accuracy for every motion beat of the performance on a timeline, giving users detailed insight into their performance. We evaluated the accuracy of our method using a dataset containing 17 video recordings of real-world dance performances. Our results closely match assessments by professional dancers, indicating correct analysis by our method.
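To make the described pipeline concrete, the following is a minimal sketch of the general idea (not the authors' implementation): it assumes a per-frame 2D pose array produced beforehand by any off-the-shelf pose estimator, uses librosa's beat tracker in place of the paper's audio beat tracking, and approximates rhythmic motion beats as local minima of mean joint speed. The file names `poses.npy` and `performance.wav`, the 30 fps frame rate, and the speed-minima heuristic are all assumptions for illustration only.

```python
# Minimal sketch, not the authors' method: align motion beats from a pose
# time series with musical beats estimated from the performance audio.
import numpy as np
import librosa
from scipy.signal import find_peaks


def musical_beats(audio_path):
    """Return musical beat times (seconds) from librosa's beat tracker."""
    y, sr = librosa.load(audio_path)
    _, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
    return librosa.frames_to_time(beat_frames, sr=sr)


def motion_beats(poses, fps):
    """poses: (num_frames, num_joints, 2) array of 2D joint positions.
    Here motion beats are approximated as local minima of the mean joint
    speed, i.e. moments where the dancer briefly holds a pose (a simplifying
    assumption, not the paper's motion analysis)."""
    speed = np.linalg.norm(np.diff(poses, axis=0), axis=2).mean(axis=1)
    minima, _ = find_peaks(-speed, distance=int(0.25 * fps))
    return minima / fps  # frame indices -> seconds


def beat_offsets(motion_times, beat_times):
    """Offset (seconds) of each motion beat to its nearest musical beat."""
    return np.array([t - beat_times[np.argmin(np.abs(beat_times - t))]
                     for t in motion_times])


if __name__ == "__main__":
    # Hypothetical inputs: a pose array saved earlier and the soundtrack audio.
    poses = np.load("poses.npy")              # shape (T, J, 2)
    beats = musical_beats("performance.wav")
    motions = motion_beats(poses, fps=30.0)
    offsets = beat_offsets(motions, beats)
    print(f"mean |offset| to nearest beat: {np.abs(offsets).mean():.3f} s")
```

The per-motion-beat offsets computed this way could be plotted on a timeline to give the kind of per-beat feedback the abstract describes; how the paper actually detects and scores motion beats may differ.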