Authors: Yen, Chi-Hsien Eric; Parameswaran, Aditya; Fu, Wai-Tat
Editors: Gleicher, Michael; Viola, Ivan; Leitte, Heike
Date issued: 2019-06-02
Date available: 2019-06-02
Year: 2019
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.13680
URI: https://diglib.eg.org:443/handle/10.1111/cgf13680

Abstract: Interactive visualization tools are being used by an increasing number of members of the general public; however, little is known about how, and how well, people use visualizations to infer causality. Adapting the mediation causal model, we designed an analytic framework to systematically evaluate human performance, strategies, and pitfalls in a visual causal reasoning task. We recruited 24 participants and asked them to identify the mediators in a fictitious dataset using bar charts and scatter plots within our visualization interface. The results showed that the accuracy of their responses as to whether a variable is a mediator significantly decreased when a confounding variable directly influenced the variable being analyzed. Further analysis demonstrated how individual visualization exploration strategies and interfaces might influence reasoning performance. We also identified common strategies and pitfalls in the participants' causal reasoning processes. We conclude by discussing design implications for how future visual analytics tools can better support causal inference.

Keywords: Human-centered computing; Empirical studies in visualization; Visualization design and evaluation methods
Title: An Exploratory User Study of Visual Causality Analysis
DOI: 10.1111/cgf.13680
Pages: 173-184