Putting Annotations to the Test

The Eurographics Association
Interactive visualization systems give users accessible representations of raw data and let them interact with these, e.g., by filtering the data or modifying visualization parameters such as color. Internal representations, such as hunches about trends, outliers, data points of interest, or relationships, are usually not visualized and integrated into such systems, i.e., they are not externalized. Moreover, how externalizations in visualization systems affect users in terms of memory, post-analysis recall, speed, or analysis quality is not yet completely understood. We present a visualization-agnostic externalization framework that lets users annotate visualizations, automatically connects annotations to related data, and stores them for later retrieval. In addition, we conducted a pilot study to test the framework's usability and users' recall of exploratory analysis results. In two tasks, one without and one with annotation features available, we asked participants to answer a question with the help of visualizations and to report their findings with concrete examples afterwards. Qualitative analysis of the summaries showed only minor differences in detail and completeness, which we suspect is due to the short task time and the consequently shallower analyses made by participants. We discuss how to improve our framework's usability and modify our study design in future research to gain more insight into externalization effects on post-analysis recall.
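The abstract describes annotations that are linked to related data and stored for later retrieval. As a rough, hypothetical sketch of that idea (the paper does not specify an implementation; all names here are illustrative), an annotation record tying a free-text note to a visualization and its data points might look like:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Annotation:
    """A user-created externalization attached to a visualization."""
    text: str                  # the user's note (hunch, outlier, trend, ...)
    view_id: str               # identifier of the visualization it was made on
    data_ids: list[int] = field(default_factory=list)   # linked data points
    created: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class AnnotationStore:
    """Stores annotations and retrieves them by linked data point."""
    def __init__(self) -> None:
        self._annotations: list[Annotation] = []

    def add(self, annotation: Annotation) -> None:
        self._annotations.append(annotation)

    def for_data_point(self, data_id: int) -> list[Annotation]:
        # Retrieval step: find every note linked to a given data point.
        return [a for a in self._annotations if data_id in a.data_ids]

store = AnnotationStore()
store.add(Annotation(text="possible outlier", view_id="scatter-1", data_ids=[42, 7]))
print(len(store.for_data_point(42)))  # 1
```

Because the store indexes annotations only by linked data points, the same record can resurface in any later view of that data, which is what makes the approach visualization-agnostic.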

CCS Concepts: Human-centered computing → Empirical studies in visualization; Visualization systems and tools; Visual analytics

, booktitle = {EuroVis 2023 - Posters},
  editor = {Gillmann, Christina and Krone, Michael and Lenti, Simone},
  title = {{Putting Annotations to the Test}},
  author = {Becker, Franziska and Ertl, Thomas},
  year = {},
  publisher = {The Eurographics Association},
  ISBN = {},
  DOI = {}
}