GraphRNN Revisited: An Ablation Study and Extensions for Directed Acyclic Graphs

Published: 28 Oct 2023, Last Modified: 21 Dec 2023
NeurIPS 2023 GLFrontiers Workshop Poster
Keywords: graph neural networks, machine learning, generative AI
TL;DR: Reproducibility experiments for the GraphRNN model, evaluation on expanded metrics, an ablation study of the breadth-first search traversal, and a novel extension for generating directed acyclic graphs.
Abstract: GraphRNN is a deep learning-based architecture proposed by You et al. for learning generative models of graphs. We replicate the results of You et al. with our own implementation of the GraphRNN architecture and evaluate it against baseline models using new metrics. Through an ablation study, we find that the BFS traversal suggested by You et al. to collapse representations of isomorphic graphs contributes significantly to model performance. Additionally, we extend GraphRNN to generate directed acyclic graphs by replacing the BFS traversal with a topological sort. We demonstrate that this method improves significantly over a directed-multiclass variant of GraphRNN on a real-world dataset.
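To make the node-ordering idea in the abstract concrete, the sketch below (not the authors' code; function names and the use of networkx are assumptions) shows how a graph can be serialized into a sequence of adjacency vectors under a BFS ordering, and how that ordering could be swapped for a topological sort when the input is a directed acyclic graph.

```python
# Minimal sketch of the node-ordering step: GraphRNN consumes a graph as a
# sequence of adjacency vectors under some node ordering. The original model
# uses a BFS ordering; the DAG extension described in the abstract replaces
# it with a topological sort. Helper names here are illustrative.
import random
import networkx as nx

def bfs_ordering(G: nx.Graph) -> list:
    """BFS node ordering from a random start node (undirected graphs)."""
    start = random.choice(list(G.nodes))
    return [start] + [v for _, v in nx.bfs_edges(G, start)]

def topo_ordering(G: nx.DiGraph) -> list:
    """Topological node ordering (directed acyclic graphs)."""
    return list(nx.topological_sort(G))

def adjacency_sequence(G, ordering) -> list:
    """For each node after the first, a 0/1 vector of its connections to the
    nodes that precede it in the ordering (the sequence GraphRNN models)."""
    return [[1 if G.has_edge(u, v) else 0 for u in ordering[:i]]
            for i, v in enumerate(ordering) if i > 0]

# Undirected graph with a BFS ordering vs. a DAG with a topological ordering.
G = nx.cycle_graph(5)
print(adjacency_sequence(G, bfs_ordering(G)))

D = nx.DiGraph([(0, 1), (0, 2), (1, 3), (2, 3)])
print(adjacency_sequence(D, topo_ordering(D)))
```

Under a topological ordering every edge of a DAG points from an earlier node to a later one, so checking each node's connections only against its predecessors still captures every edge, mirroring the role the BFS ordering plays for undirected graphs.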
Submission Number: 72