Attention-Aware Visualization: Tracking and Responding to User Perception Over Time

Research output: Contribution to journal › Article › peer-review

We propose the notion of attention-aware visualizations (AAVs) that track the user's perception of a visual representation over time and feed this information back to the visualization. Such context awareness is particularly useful for ubiquitous and immersive analytics where knowing which embedded visualizations the user is looking at can be used to make visualizations react appropriately to the user's attention: for example, by highlighting data the user has not yet seen. We can separate the approach into three components: (1) measuring the user's gaze on a visualization and its parts; (2) tracking the user's attention over time; and (3) reactively modifying the visual representation based on the current attention metric. In this paper, we present two separate implementations of AAV: a 2D data-agnostic method for web-based visualizations that can use an embodied eye tracker to capture the user's gaze, and a 3D data-aware one that uses the stencil buffer to track the visibility of each individual mark in a visualization. Both methods provide similar mechanisms for accumulating attention over time and changing the appearance of marks in response. We also present results from a qualitative evaluation studying visual feedback and triggering mechanisms for capturing and revisualizing attention.
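The paper describes the full implementations; purely as an illustrative sketch (with hypothetical names, bounds checks, and thresholds that are not taken from the paper), the following TypeScript fragment shows the general pattern the abstract outlines for a 2D web-based setting: accumulating per-mark attention from gaze samples over time and mapping that accumulated attention to an appearance change such as opacity.

```typescript
// Hypothetical sketch: accumulate per-mark dwell time from gaze samples
// and derive an appearance change (opacity) from the accumulated attention.

interface GazeSample {
  x: number;          // gaze position in visualization coordinates
  y: number;
  timestampMs: number;
}

interface Mark {
  id: string;
  bounds: { x: number; y: number; width: number; height: number };
  attentionMs: number; // accumulated dwell time on this mark
}

const SATURATION_MS = 3000; // assumed dwell time at which a mark counts as fully "seen"

// Add one gaze sample's dwell time (dtMs since the previous sample)
// to every mark whose bounds contain the gaze point.
function accumulateAttention(marks: Mark[], sample: GazeSample, dtMs: number): void {
  for (const m of marks) {
    const inside =
      sample.x >= m.bounds.x && sample.x <= m.bounds.x + m.bounds.width &&
      sample.y >= m.bounds.y && sample.y <= m.bounds.y + m.bounds.height;
    if (inside) {
      m.attentionMs = Math.min(m.attentionMs + dtMs, SATURATION_MS);
    }
  }
}

// Map accumulated attention to appearance: unseen marks stay fully opaque
// (emphasized), while fully seen marks are gradually de-emphasized.
function opacityFor(mark: Mark): number {
  const seen = mark.attentionMs / SATURATION_MS; // 0 = unseen, 1 = fully seen
  return 1.0 - 0.6 * seen;                       // fade toward 0.4 opacity
}
```

In this sketch, `accumulateAttention` would be called once per incoming gaze sample and `opacityFor` whenever the visualization is redrawn; the 3D data-aware variant described in the paper instead derives per-mark visibility via the stencil buffer rather than 2D bounds tests.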
Original language: English
Pages (from-to): 1-11
Journal: IEEE Transactions on Visualization and Computer Graphics
Early online date: 9 Sept 2024
Publication status: E-pub ahead of print - 9 Sept 2024