GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation

Sep 25, 2019 · Blind Submission
  • Keywords: Graph Neural Networks
  • TL;DR: new GNN formalism + extensive experiments; showing differences between GGNN/GCN/GAT are smaller than thought
  • Abstract: This paper presents a new Graph Neural Network (GNN) type using feature-wise linear modulation (FiLM). Many standard GNN variants propagate information along the edges of a graph by computing "messages" based only on the representation of the source of each edge. In GNN-FiLM, the representation of the target node of an edge is additionally used to compute a transformation that can be applied to all incoming messages, allowing feature-wise modulation of the passed information. Results of experiments comparing different GNN architectures on three tasks from the literature are presented, based on re-implementations of baseline methods. Hyperparameters for all methods were found using extensive search, yielding somewhat surprising results: differences between baseline models are smaller than reported in the literature. Nonetheless, GNN-FiLM outperforms baseline methods on a regression task on molecular graphs and performs competitively on other tasks. (See the sketch after this list.)
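To make the mechanism described in the abstract concrete, here is a minimal sketch of a single FiLM-style propagation step, assuming one edge type, sum aggregation, and plain PyTorch. The class and layer names (`FiLMMessagePassingSketch`, `msg`, `film`) are illustrative and not taken from the authors' implementation.

```python
# Minimal sketch of feature-wise linear modulation in message passing.
# Assumptions (not from this page): single edge type, sum aggregation, PyTorch.
import torch
import torch.nn as nn


class FiLMMessagePassingSketch(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(dim, dim, bias=False)   # transforms source-node representations
        self.film = nn.Linear(dim, 2 * dim)          # target node -> (gamma, beta) modulation

    def forward(self, h, edge_index):
        # h: [num_nodes, dim] node representations
        # edge_index: [2, num_edges], rows are (source, target) node indices
        src, tgt = edge_index
        messages = self.msg(h[src])                          # message depends on the source node
        gamma, beta = self.film(h[tgt]).chunk(2, dim=-1)     # modulation depends on the target node
        modulated = gamma * messages + beta                  # feature-wise linear modulation
        out = torch.zeros_like(h)
        out.index_add_(0, tgt, modulated)                    # sum incoming messages per target node
        return torch.relu(out)


# Tiny usage example on a random 4-node chain graph.
h = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2], [1, 2, 3]])
layer = FiLMMessagePassingSketch(8)
print(layer(h, edge_index).shape)  # torch.Size([4, 8])
```

The key contrast with standard message passing is that the affine parameters gamma and beta are computed from the edge's target node, so each receiving node can modulate its incoming messages feature by feature.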