Enhancing Graph Generation With First-Order Logic Rules

19 Sept 2025 (modified: 16 Dec 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Deep graph generation, First-Order Logic Rules, Neuro-Symbolic AI, Deep Learning
TL;DR: Shows how deep graph generation can be enhanced with domain knowledge represented by first-order logic rules, via a novel semantic loss function
Abstract: Existing graph generative models produce graphs that are often quite realistic, but sometimes miss domain-specific patterns. Enhancing graph learning with domain knowledge is one of the current frontiers for neural models of graph data. In this paper, we propose a new approach to enhancing deep graph generative models with knowledge represented by first-order logic rules. First-order logic provides an expressive formalism for representing interpretable causal knowledge about relational structures. Our conceptual contribution is a new first-order semantic loss function for training a graph generative model on relational data: maximize the model likelihood subject to a rule moment-matching constraint, namely that the expected instance count of each rule matches its observed instance count. Our algorithmic contribution is a novel method for computing the expected instance count of a first-order rule for a Variational Graph Autoencoder model, based on matrix multiplication. Empirical evaluation on five benchmark datasets, both homogeneous and heterogeneous, shows that rule moment matching improves the quality of generated graphs substantially (by orders of magnitude on standard graph quality metrics) and improves predictive accuracy on the downstream task of node classification.
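The abstract describes the method only at a high level, so the following is a minimal illustrative sketch rather than the authors' implementation. It assumes a VGAE-style decoder that outputs an edge-probability matrix P, and uses a hypothetical triangle rule whose expected instance count under independent Bernoulli edges reduces to a matrix-multiplication expression, trace(P³)/6; the moment-matching term then penalizes the gap between this expected count and the count observed in the training graph. The names `expected_triangle_count` and `moment_matching_loss` are illustrative assumptions, not from the paper.

```python
import torch

def expected_triangle_count(P: torch.Tensor) -> torch.Tensor:
    # P: (n, n) symmetric matrix of edge probabilities from a VGAE-style decoder.
    # Under an (assumed) independent-Bernoulli edge model, the expected number of
    # triangles is trace(P @ P @ P) / 6: each triangle is counted once per ordered
    # vertex triple (6 orderings). Zeroing the diagonal removes self-loop terms.
    P = P * (1 - torch.eye(P.size(0), device=P.device))
    return torch.diagonal(P @ P @ P).sum() / 6.0

def moment_matching_loss(P: torch.Tensor, observed_count: float) -> torch.Tensor:
    # Soft version of the rule moment-matching constraint sketched in the abstract:
    # penalize the squared gap between the expected and observed instance counts
    # of the rule (here, the hypothetical triangle rule).
    return (expected_triangle_count(P) - observed_count) ** 2

# Illustrative usage: add the penalty to a standard VGAE objective, e.g.
#   total_loss = vgae_elbo_loss + lam * moment_matching_loss(P, observed_triangles)
```

More general first-order rules would require rule-specific matrix expressions (and, for heterogeneous graphs, per-relation probability matrices); the snippet is meant only to illustrate how an expected instance count can be expressed through matrix multiplication and matched against its observed value.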
Supplementary Material: zip
Primary Area: generative models
Submission Number: 15505