Abstract: Modeling continuous-time dynamics on graphs is a fundamental challenge, and uncovering correlations among the components of complex systems promises to improve dynamic modeling. Combining graph neural networks with ordinary differential equations has shown strong performance in this domain. However, prevailing methods disregard the signed information intrinsic to graphs, which limits their ability to capture real-world phenomena accurately and leads to suboptimal results. In response, we propose a signed graph ordinary differential equation that addresses this limitation. The proposed approach is both flexible and efficient, and it integrates readily into diverse graph-based dynamic modeling frameworks. To demonstrate its effectiveness, we incorporate our strategies into three prominent paradigms as baselines: graph neural ordinary differential equations, graph neural controlled differential equations, and graph recurrent neural networks. We evaluate on three complex dynamic scenarios from physics and biology, as well as on four real-world traffic datasets. Empirical results show that our approach substantially improves performance over all three baselines.