Title: Towards Multimodal Exploratory Data Analysis: SoniScope as a Prototypical Implementation
Authors: Enge, Kajetan; Rind, Alexander; Iber, Michael; Höldrich, Robert; Aigner, Wolfgang
Editors: Agus, Marco; Aigner, Wolfgang; Hoellt, Thomas
Date: 2022-06-02
ISBN: 978-3-03868-184-7
DOI: https://doi.org/10.2312/evs.20221095
Handle: https://diglib.eg.org:443/handle/10.2312/evs20221095
Pages: 67-71 (5 pages)
License: Attribution 4.0 International License
CCS Concepts: Human-centered computing --> Visualization systems and tools; Auditory feedback; Sound-based input / output
Keywords: Human centered computing; Visualization systems and tools; Auditory feedback; Sound-based input / output

Abstract: The metaphor of auscultating with a stethoscope can be an inspiration to combine visualization and sonification for exploratory data analysis. This paper presents SoniScope, a multimodal approach and its prototypical implementation based on this metaphor. It combines a scatterplot with an interactive parameter mapping sonification, thereby conveying additional information about items that were selected with a visual lens. SoniScope explores several design options for the shape of its lens and the sorting of the selected items for subsequent sonification. Furthermore, the open-source prototype serves as a blueprint framework for how to combine D3.js visualization and SuperCollider sonification in the Jupyter notebook environment.
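The abstract describes items selected with a visual lens being sorted and then rendered via parameter mapping sonification. As a hedged illustration of that idea (a minimal sketch, not the SoniScope implementation; all names, the circular lens shape, and the value ranges are assumptions for this example), the following Python selects the points under a lens, sorts them left-to-right for playback order, and linearly maps each point's y-value to a MIDI pitch:

```python
# Hypothetical sketch of lens selection + parameter-mapping sonification,
# loosely following the pipeline the abstract describes. The circular lens,
# the sort key, and the pitch range are illustrative assumptions only.

def in_lens(point, center, radius):
    """Circular lens: True if the point falls inside the lens."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    return dx * dx + dy * dy <= radius * radius

def map_to_midi(value, lo, hi, note_lo=48, note_hi=84):
    """Linearly map a data value in [lo, hi] to a MIDI note number."""
    t = (value - lo) / (hi - lo) if hi > lo else 0.0
    return round(note_lo + t * (note_hi - note_lo))

def sonify_selection(points, center, radius):
    """Select points under the lens, sort by x (playback order),
    and map each y-value to a pitch for subsequent sonification."""
    selected = sorted((p for p in points if in_lens(p, center, radius)),
                      key=lambda p: p[0])
    ys = [p[1] for p in points]  # scale pitches over the full data range
    lo, hi = min(ys), max(ys)
    return [map_to_midi(p[1], lo, hi) for p in selected]
```

In the actual prototype these pitches would be sent on to a SuperCollider synth for playback rather than returned as a list; this sketch only shows the selection, sorting, and mapping stages.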