Exploring EEG-Annotated Affective Animations in Virtual Reality: Suggestions for Improvement

Authors: Krogmeier, Claudia; Mousas, Christos
Editors: Hideaki Uchiyama; Jean-Marie Normand
Date: 2022-11-29
ISBN: 978-3-03868-179-3
ISSN: 1727-530X
DOI: https://doi.org/10.2312/egve.20221283
Handle: https://diglib.eg.org:443/handle/10.2312/egve20221283
Pages: 121-130 (10 pages)
License: Attribution 4.0 International License
CCS Concepts: Human-centered computing -> Virtual reality; User studies; Computing methodologies -> Perception; Animation
Keywords: Human centered computing; Virtual reality; User studies; Computing methodologies; Perception; Animation

Abstract: In this work, we recorded brain activity data from participants who viewed 12 affective character animations in virtual reality. Frontal alpha asymmetry (FAA) scores were calculated from electroencephalography (EEG) data to understand objective affective responses to these animations. A subset of these animations was then annotated as either low FAA (meaning they elicited lower FAA responses) or high FAA (meaning they elicited higher FAA responses). These annotated animations were then used in a primary 2×2 study in which we (a) examined whether we could replicate FAA responses to low-FAA and high-FAA animations in a subsequent study, and (b) investigated how the number of characters in the VR environment would influence FAA responses. Additionally, we compared FAA to self-reported affective responses across the four conditions (one character, low FAA; one character, high FAA; four characters, low FAA; four characters, high FAA). In this way, our research seeks to better understand objective and subjective emotional responses in VR. Results suggest that annotated FAA may not predict FAA responses to affective animations in a subsequent study when more characters are present. However, self-reported affective responses to the four conditions are in line with the FAA-annotated responses. We offer suggestions for the development of specific affective experiences in VR based on preliminary brain activity data.
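
Note: The record does not state how the FAA scores were computed; the electrode pair and band limits below are assumptions. A conventional formulation takes the difference of natural-log-transformed alpha-band power at homologous right and left frontal sites (e.g., F4 and F3), so higher values indicate relatively greater right-frontal alpha power:

FAA = \ln\!\big(P_{\alpha}(\mathrm{F4})\big) - \ln\!\big(P_{\alpha}(\mathrm{F3})\big)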