Chartist: Task-driven Eye Movement Control for Chart Reading
Danqing Shi, Yao Wang, Yunpeng Bai, Andreas Bulling, Antti Oulasvirta
Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 1–14, 2025.
Abstract
To design data visualizations that are easy to comprehend, we need to understand how people with different interests read them. Computational models of scanpaths on charts could complement empirical studies by offering inexpensive estimates of user performance; however, previous models have been limited to gaze patterns and have overlooked the effects of tasks. Here, we contribute a model that simulates how users move their eyes to extract information from a chart in order to solve analytical tasks, including retrieving a value, filtering, and finding an extreme. Our insight is a two-level hierarchical control structure. At the high level, the model uses an LLM to comprehend the information gained so far and uses this representation to select a goal for the lower-level controllers, which in turn move the eyes according to a sampling policy learned via reinforcement learning. The model can accurately predict task-driven scanpaths and reproduce human-like summary statistics across tasks.
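The two-level control structure described above can be illustrated with a minimal sketch. All names here (select_goal, SamplingPolicy, the chart elements and their locations) are hypothetical placeholders, not the authors' implementation: the real model uses an LLM for goal selection and a reinforcement-learned policy for eye movements, both of which are replaced by simple stubs below.

```python
# Hypothetical sketch of a two-level hierarchical controller for chart reading.
# High level: pick the next goal from information gained so far (stands in
# for the LLM). Low level: move the gaze toward the goal (stands in for the
# RL-learned sampling policy).

def select_goal(task, information_gained):
    """High-level controller stub: choose the next unexamined chart element."""
    candidates = [e for e in ["bar_A", "bar_B", "axis", "legend"]
                  if e not in information_gained]
    return candidates[0] if candidates else None

class SamplingPolicy:
    """Low-level controller stub: one saccade step toward the goal."""
    # Illustrative normalized screen locations for each chart element.
    targets = {"bar_A": (0.2, 0.5), "bar_B": (0.6, 0.5),
               "axis": (0.1, 0.1), "legend": (0.9, 0.9)}

    def fixate(self, goal, gaze):
        tx, ty = self.targets[goal]
        x, y = gaze
        # Move halfway toward the target (a crude stand-in for a saccade).
        return (x + 0.5 * (tx - x), y + 0.5 * (ty - y))

def simulate_scanpath(task, max_fixations=10):
    """Alternate between high-level goal selection and low-level fixation."""
    gaze, info, scanpath = (0.5, 0.5), set(), []
    policy = SamplingPolicy()
    for _ in range(max_fixations):
        goal = select_goal(task, info)    # high level: what to look for next
        if goal is None:                  # task information fully gathered
            break
        gaze = policy.fixate(goal, gaze)  # low level: where to move the eyes
        scanpath.append(gaze)
        info.add(goal)                    # update the information gained so far
    return scanpath

path = simulate_scanpath("retrieve value")
```

The key design point the sketch mirrors is the separation of concerns: the high-level controller reasons about *what* information is still needed for the task, while the low-level controller only decides *how* to move the eyes toward the current goal.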
BibTeX
@inproceedings{shi25_chi,
  title     = {Chartist: Task-driven Eye Movement Control for Chart Reading},
  author    = {Shi, Danqing and Wang, Yao and Bai, Yunpeng and Bulling, Andreas and Oulasvirta, Antti},
  year      = {2025},
  pages     = {1--14},
  booktitle = {Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI)},
  doi       = {}
}