Measuring Visual Attention Capacity Across xReality

K. Biermeier, I. Scharlau, E. Yigitbas, in: Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024), ACM, 2024.

Conference Paper | English
Abstract
Augmented Reality (AR) and Virtual Reality (VR) technologies have been applied broadly in recent years. While prior work emphasizes the potential of these technologies in various application domains, the process of visual attention in and across AR/VR environments has not yet been explored exhaustively. So far, visual attention in AR/VR environments has mainly been studied by means of overt attention (i.e., saccadic eye movements), self-report, and process-related proxies of visual attention (such as reaction time). In this work, we analyze covert visual attention based on the (psychological) Theory of Visual Attention (TVA), which allows us to quantify interpretable, theory-based properties of the visual attention process; for example, TVA lets us measure overall processing speed. We instantiate this TVA-based framework in an explorative within-subjects study with 30 participants. The results show a decisive difference in visual attention between Reality (i.e., the neutral condition) and Virtual Reality, and a weak difference between Reality and Augmented Reality. We discuss the consequences of our findings and provide ideas for future studies.
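
As context for the TVA parameters mentioned in the abstract, the following is a brief sketch of the standard TVA equations (Bundesen, 1990), not of the paper's specific instantiation. In TVA, the rate at which object x in the visual field S is encoded as a member of category i is

    v(x,i) = \eta(x,i)\,\beta_i\,\frac{w_x}{\sum_{z \in S} w_z}

where \eta(x,i) is the sensory evidence, \beta_i the decision bias, and w_x the attentional weight of x. The overall processing speed (capacity) is C = \sum_{x \in S} \sum_{i} v(x,i), and the probability that x is encoded within an effective exposure duration \tau is 1 - e^{-v_x(\tau - t_0)} for \tau > t_0, with t_0 the perceptual threshold.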
Publishing Year
2024
Proceedings Title
Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024)
Cite this

Biermeier K, Scharlau I, Yigitbas E. Measuring Visual Attention Capacity Across xReality. In: Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024). ACM; 2024. doi:10.1145/3652037.3652050
Biermeier, K., Scharlau, I., & Yigitbas, E. (2024). Measuring Visual Attention Capacity Across xReality. Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024). https://doi.org/10.1145/3652037.3652050
@inproceedings{Biermeier_Scharlau_Yigitbas_2024, title={Measuring Visual Attention Capacity Across xReality}, DOI={10.1145/3652037.3652050}, booktitle={Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024)}, publisher={ACM}, author={Biermeier, Kai and Scharlau, Ingrid and Yigitbas, Enes}, year={2024} }
Biermeier, Kai, Ingrid Scharlau, and Enes Yigitbas. “Measuring Visual Attention Capacity Across xReality.” In Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024). ACM, 2024. https://doi.org/10.1145/3652037.3652050.
K. Biermeier, I. Scharlau, and E. Yigitbas, “Measuring Visual Attention Capacity Across xReality,” in Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024), ACM, 2024, doi: 10.1145/3652037.3652050.
Biermeier, Kai, et al. “Measuring Visual Attention Capacity Across xReality.” Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2024), ACM, 2024, doi:10.1145/3652037.3652050.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]

Link(s) to Main File(s)
Access Level
Restricted Closed Access
