Show simple item record

dc.contributor.author	Yen, Chi-Hsien Eric	en_US
dc.contributor.author	Parameswaran, Aditya	en_US
dc.contributor.author	Fu, Wai-Tat	en_US
dc.contributor.editor	Gleicher, Michael and Viola, Ivan and Leitte, Heike	en_US
dc.description.abstract	Interactive visualization tools are being used by an increasing number of members of the general public; however, little is known about how, and how well, people use visualizations to infer causality. We designed an analytic framework, adapted from the mediation causal model, to systematically evaluate human performance, strategies, and pitfalls in a visual causal reasoning task. We recruited 24 participants and asked them to identify the mediators in a fictitious dataset using bar charts and scatter plots within our visualization interface. The results showed that the accuracy of their responses as to whether a variable is a mediator significantly decreased when a confounding variable directly influenced the variable being analyzed. Further analysis demonstrated how individual visualization exploration strategies and interfaces might influence reasoning performance. We also identified common strategies and pitfalls in their causal reasoning processes. We discuss design implications for future visual analytics tools to better support causal inference.	en_US
dc.publisher	The Eurographics Association and John Wiley & Sons Ltd.	en_US
dc.subject	Human-centered computing
dc.subject	Empirical studies in visualization
dc.subject	Visualization design and evaluation methods
dc.title	An Exploratory User Study of Visual Causality Analysis	en_US
dc.description.seriesinformation	Computer Graphics Forum
dc.description.sectionheaders	Analysis and Decision Making

This item appears in the following Collection(s)

  • 38-Issue 3
    EuroVis 2019 - Conference Proceedings