ChartQC: Question Classification from Human Attention Data on Charts
Takumi Nishiyasu, Tobias Kostorz, Yao Wang, Yoichi Sato, Andreas Bulling
ETRA Workshop on Eye Tracking and Visualization (ETVIS), pp. 1–6, 2025.

Abstract
Understanding how humans interact with information visualizations is crucial for improving user experience and designing effective visualization systems. While previous studies have focused on task-agnostic visual attention, the relationship between attention patterns and visual analytical tasks remains underexplored. This paper investigates how attention data on charts can be used to classify question types, providing insights into question-driven gaze behaviors. We propose ChartQC, a question classification model leveraging spatial feature alignment in chart images and visual attention data. By aligning spatial features, our approach strengthens the integration of visual and attentional cues, improving classification accuracy. These findings help deepen the understanding of user perception in charts and provide a basis for future research on interactive visual analysis.

Links
Paper: nishiyasu25_etvis.pdf
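The core step the abstract describes, spatially aligning attention data with chart image features before classification, can be illustrated with a minimal NumPy sketch. All shapes, the nearest-neighbor resizing, and the channel-concatenation fusion below are illustrative assumptions for exposition, not the architecture actually used in the paper.

```python
import numpy as np

def resize_nearest(a, out_h, out_w):
    # Nearest-neighbor resize of a 2D attention map (illustrative helper).
    h, w = a.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return a[rows][:, cols]

def fuse(chart_feat, attention):
    # chart_feat: (H, W, C) feature map extracted from the chart image.
    # attention:  (h, w) fixation heatmap, possibly at a different resolution.
    H, W, C = chart_feat.shape
    att = resize_nearest(attention, H, W)   # align the two spatial grids
    att = att / (att.sum() + 1e-8)          # normalize to a spatial distribution
    # Concatenate the aligned attention map as an extra feature channel.
    return np.concatenate([chart_feat, att[..., None]], axis=-1)  # (H, W, C+1)

# Toy usage: an 8x8x16 chart feature map fused with a 32x32 attention heatmap.
chart_feat = np.random.rand(8, 8, 16)
attention = np.random.rand(32, 32)
fused = fuse(chart_feat, attention)
```

A question-type classifier would then operate on the fused tensor; the point of the sketch is only that resampling the attention map onto the image-feature grid lets the two modalities be combined per spatial location.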
BibTeX
@inproceedings{nishiyasu25_etvis,
title = {ChartQC: Question Classification from Human Attention Data on Charts},
author = {Nishiyasu, Takumi and Kostorz, Tobias and Wang, Yao and Sato, Yoichi and Bulling, Andreas},
year = {2025},
pages = {1--6},
booktitle = {ETRA Workshop on Eye Tracking and Visualization (ETVIS)},
doi = {10.1145/3715669.3725883}
}