Abstract: Data visualization has evolved from a purely human-centric craft into a dual-purpose medium consumed by both humans and machine-driven models. However, most existing evaluations focus primarily on aesthetics and clarity for human users, overlooking machine interpretability. To bridge this gap, this study introduces HyVis (a Hybrid Visualization assessment for balancing human readability and machine comprehension), a framework that evaluates visualization quality by combining human preference criteria with model interpretability. Unlike prior studies centered on human perception, HyVis incorporates model readability, ensuring that visualizations remain interpretable for machine-driven analysis.
Experimental results demonstrate that HyVis improves human preference-based evaluations by up to 16% and achieves 3.14% higher accuracy in machine-readability assessments compared to large-scale models.
Paper Type: Long
Research Area: Multimodality and Language Grounding to Vision, Robotics and Beyond
Research Area Keywords: Multimodal Learning, Explainability and Interpretability, Chart Understanding
Contribution Types: Data analysis
Languages Studied: English
Submission Number: 6495