Putting Annotations to the Test

dc.contributor.author: Becker, Franziska
dc.contributor.author: Ertl, Thomas
dc.contributor.editor: Gillmann, Christina
dc.contributor.editor: Krone, Michael
dc.contributor.editor: Lenti, Simone
dc.date.accessioned: 2023-06-10T06:31:35Z
dc.date.available: 2023-06-10T06:31:35Z
dc.date.issued: 2023
dc.description.abstract: When users work with interactive visualization systems, they see more accessible representations of raw data and interact with them, e.g., by filtering the data or modifying visualization parameters such as color. Internal representations, such as hunches about trends, outliers or data points of interest, relationships, and more, are usually not visualized and integrated into these systems, i.e., they are not externalized. Moreover, how externalizations in visualization systems affect users in terms of memory, post-analysis recall, speed, or analysis quality is not yet completely understood. We present a visualization-agnostic externalization framework that lets users annotate visualizations, automatically connects annotations to related data, and stores them for later retrieval. In addition, we conducted a pilot study to test the framework's usability and users' recall of exploratory analysis results. In two tasks, one without and one with annotation features available, we asked participants to answer a question with the help of visualizations and report their findings with concrete examples afterwards. Qualitative analysis of the summaries showed only minor differences in detail and completeness, which we suspect is due to the short task time and the consequently shallower analyses performed by participants. We discuss how to improve our framework's usability and modify our study design in future research to gain more insight into the effects of externalization on post-analysis recall.
dc.description.seriesinformation: EuroVis 2023 - Posters
dc.identifier.doi: 10.2312/evp.20231068
dc.identifier.isbn: 978-3-03868-220-2
dc.identifier.pages: 61-63 (3 pages)
dc.identifier.uri: https://doi.org/10.2312/evp.20231068
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/evp20231068
dc.publisher: The Eurographics Association
dc.rights: Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: CCS Concepts: Human-centered computing -> Empirical studies in visualization; Visualization systems and tools; Visual analytics
dc.subject: Human-centered computing
dc.subject: Empirical studies in visualization
dc.subject: Visualization systems and tools
dc.subject: Visual analytics
dc.title: Putting Annotations to the Test
Files
Original bundle (3 files)
- 061-063.pdf (513.38 KB, Adobe Portable Document Format)
- 2221-file-i17.pdf (574.31 KB, Adobe Portable Document Format)
- 2221-file-i7.pdf (3.24 MB, Adobe Portable Document Format)