Real-Time Music-Driven Movie Design Framework

Authors: Hofmann, Sarah; Seeger, Maximilian; Rogge-Pott, Henning; Mammen, Sebastian von
Editors: Ronfard, Rémi; Wu, Hui-Yin
Date issued: 2022-04-20
ISBN: 978-3-03868-173-1
ISSN: 2411-9733
DOI: https://doi.org/10.2312/wiced.20221052
Handle: https://diglib.eg.org:443/handle/10.2312/wiced20221052
Pages: 53-60 (8 pages)

Abstract: Cutting to music is a widely used stylistic device in filmmaking. The usual process involves an editor manually adjusting a movie's sequences according to the beat or other musical features. But as today's movie productions increasingly leverage real-time systems, this manual effort can be reduced: automatic cameras can make decisions on their own according to pre-defined rules, even in real time. In this paper, we present an approach to automatically create a music video. We have realised it as a coding framework that integrates the FMOD API with Unreal Engine 4. The framework provides the means to analyse a music stream at runtime and to translate the extracted features into an animated storyline, supported by cinematic cutting. We demonstrate its workings by means of an artistic, music-driven movie.

CCS Concepts: Computer systems organization --> Real-time operating systems; Applied computing --> Sound and music computing; Media arts
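
A minimal C++ sketch of the runtime analysis step the abstract describes: an FMOD FFT DSP taps the playing music stream, and a simple energy-spike heuristic fires a cut event. The file name, window size, thresholds, and the OnBeatCut() hook are illustrative assumptions, not the paper's actual framework code; error handling is omitted for brevity.

```cpp
#include <fmod.hpp>
#include <chrono>
#include <cstdio>
#include <deque>
#include <numeric>
#include <thread>

// Hypothetical stand-in for the camera-cut hook such a framework might expose.
static void OnBeatCut() { std::printf("cut!\n"); }

int main() {
    FMOD::System* system = nullptr;
    FMOD::System_Create(&system);
    system->init(32, FMOD_INIT_NORMAL, nullptr);

    FMOD::Sound* sound = nullptr;
    system->createSound("track.ogg", FMOD_CREATESTREAM, nullptr, &sound);  // assumed asset

    FMOD::Channel* channel = nullptr;
    system->playSound(sound, nullptr, false, &channel);

    // Attach an FFT DSP to the channel so the spectrum can be read each frame.
    FMOD::DSP* fft = nullptr;
    system->createDSPByType(FMOD_DSP_TYPE_FFT, &fft);
    fft->setParameterInt(FMOD_DSP_FFT_WINDOWSIZE, 1024);
    channel->addDSP(0, fft);

    std::deque<float> history;  // recent low-band energy values
    bool playing = true;
    while (playing) {
        system->update();
        channel->isPlaying(&playing);

        FMOD_DSP_PARAMETER_FFT* data = nullptr;
        fft->getParameterData(FMOD_DSP_FFT_SPECTRUMDATA,
                              (void**)&data, nullptr, nullptr, 0);
        if (data && data->numchannels > 0) {
            // Sum the lowest bins (roughly the kick/bass region).
            float energy = 0.0f;
            for (int bin = 0; bin < 8 && bin < data->length; ++bin)
                energy += data->spectrum[0][bin];

            // Fire a cut when energy spikes well above its recent average
            // (the 1.5x factor is an arbitrary illustrative threshold).
            float avg = history.empty() ? 0.0f
                : std::accumulate(history.begin(), history.end(), 0.0f) / history.size();
            if (avg > 0.0f && energy > 1.5f * avg) OnBeatCut();

            history.push_back(energy);
            if (history.size() > 50) history.pop_front();  // ~1 s window at 50 polls/s
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(20));
    }

    sound->release();
    system->release();
    return 0;
}
```

In an Unreal Engine 4 integration, the polling loop would live in an actor's Tick rather than main(), and OnBeatCut() would drive the cinematic camera switch; the detection logic stays the same.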