OpenGLA: Topology and Task Adaptive Foundation Model for Power System Graph-Language Answering

19 Sept 2025 (modified: 11 Feb 2026) — Submitted to ICLR 2026 — CC BY 4.0
Keywords: Multi-modal model, Foundation model for power systems
TL;DR: We introduce OpenGLA, a graph–language foundation model for power systems that aligns graph encoders with LLMs, achieving accurate, efficient, and generalizable performance across diverse tasks and grid topologies.
Abstract: Foundation models have shown impressive cross-modal generation and problem-solving abilities, yet applying them directly to power systems remains challenging due to strict requirements on accuracy, efficiency, and physical interpretability. We propose OpenGLA, a dedicated graph–language foundation model for power systems. OpenGLA encodes grid states with a topology-adaptive GCN and an adaptive nodal feature encoder, projects them into the language embedding space, and fuses them with textual instructions via a Mixture-of-Transformers module. A lightweight transformer detokenizer is designed to enable precise floating-point outputs. Experiments demonstrate that OpenGLA generalizes across diverse tasks and grid topologies while achieving superior accuracy, establishing a scalable foundation-model architecture for critical infrastructure.
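The abstract's pipeline (graph encoding → projection into the language embedding space → fusion with text tokens → a detokenizer that emits a float) can be sketched in miniature with NumPy. This is only an illustrative shape-level sketch under assumed dimensions; the layer names, the toy 4-bus grid, and the mean-pool "detokenizer" are hypothetical stand-ins, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(adj, feats, weight):
    # Symmetrically normalized graph convolution: D^-1/2 (A + I) D^-1/2 X W, then ReLU.
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ a_hat @ d_inv_sqrt @ feats @ weight, 0.0)

# Hypothetical 4-bus grid: ring adjacency and 4 nodal features per bus
# (e.g. voltage magnitude, angle, active power, reactive power).
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
node_feats = rng.normal(size=(4, 4))

hidden = gcn_layer(adj, node_feats, rng.normal(size=(4, 8)))   # graph encoder output
graph_tokens = hidden @ rng.normal(size=(8, 16))               # project into LM embedding dim
text_tokens = rng.normal(size=(6, 16))                         # embedded textual instruction
fused = np.concatenate([graph_tokens, text_tokens], axis=0)    # joint graph+text token sequence

# Toy "detokenizer": pool the fused sequence and regress a single float output,
# standing in for the paper's transformer detokenizer for numeric answers.
value = float(fused.mean(axis=0) @ rng.normal(size=16))
print(fused.shape)
```

In the actual model the fusion step is a Mixture-of-Transformers module rather than plain concatenation, and the detokenizer is a learned transformer; the sketch only shows how graph tokens and text tokens can share one embedding space before decoding a numeric answer.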
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 16032