Title: Personalized Visual-Interactive Music Classification
Authors: Ritter, Christian; Altenhofen, Christian; Zeppelzauer, Matthias; Kuijper, Arjan; Schreck, Tobias; Bernard, Jürgen
Editors: Christian Tominski and Tatiana von Landesberger
Date: 2018-06-02
Year: 2018
ISBN: 978-3-03868-064-2
URL: https://diglib.eg.org:443/handle/10.2312/eurova20181109
DOI: https://doi.org/10.2312/eurova.20181109
Pages: 31-35
Keywords: Human-centered computing; Visualization application domains; Computing methodologies; Machine learning

Abstract: We present an interactive visual music classification tool that allows users to automatically structure music collections in a personalized way. With our approach, users play an active role in an iterative process of building classification models, using different interactive interfaces for labeling songs. The tool combines interfaces for detailed analysis at different granularities, i.e., audio features and individual songs, as well as classification results at a glance. Interactive labeling is provided through three complementary interfaces, combining model-centered and human-centered labeling-support principles. A clean visual design of the individual interfaces depicts complex model characteristics for experts, and reflects our work in progress towards supporting non-experts. The result of a preliminary usage scenario shows that, with our system, hardly any knowledge about machine learning is needed to create classification models of high accuracy with fewer than 50 labels.