Utilizing Edge Features in Graph Neural Networks via Variational Information Maximization

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission · Readers: Everyone
Keywords: Graph Neural Network, Edge Feature, Mutual Information
TL;DR: We use a principled variational approach to preserve edge information in graph neural networks, and demonstrate the importance of edge features and the superiority of our method on extensive benchmarks.
Abstract: Graph Neural Networks (GNNs) broadly follow the scheme that the representation vector of each node is updated recursively using messages from neighboring nodes, where each neighbor's message is usually pre-processed with a parameterized transformation matrix. To make better use of edge features, we propose the Edge Information maximized Graph Neural Network (EIGNN), which maximizes the Mutual Information (MI) between edge features and message passing channels. The MI is reformulated as a differentiable objective via a variational approach. We theoretically show that the newly introduced objective enables the model to preserve edge information, and empirically corroborate the enhanced performance of MI-maximized models across a broad range of learning tasks, including regression on molecular graphs and relation prediction in knowledge graphs.
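The following is a minimal sketch, not the authors' implementation, of the idea described in the abstract: lower-bound the mutual information I(E; M) between edge features E and message-passing channels M with a variational decoder q(E | M) (a Barber-Agakov-style bound), and add the resulting reconstruction term to the task loss. All module and variable names (`MIMessagePassingLayer`, `edge_decoder`, etc.) are hypothetical, and the Gaussian decoder with fixed variance is an assumed modeling choice, under which maximizing the bound reduces to minimizing squared reconstruction error.

```python
# Hedged sketch of an MI-regularized message-passing layer (PyTorch).
import torch
import torch.nn as nn


class MIMessagePassingLayer(nn.Module):
    def __init__(self, node_dim: int, edge_dim: int, msg_dim: int):
        super().__init__()
        # Message function conditioned on source node, target node, and edge features.
        self.msg_mlp = nn.Sequential(
            nn.Linear(2 * node_dim + edge_dim, msg_dim), nn.ReLU()
        )
        # Variational decoder q(edge | message); a unit-variance Gaussian, so
        # -log q reduces to squared error (up to additive constants).
        self.edge_decoder = nn.Linear(msg_dim, edge_dim)
        self.node_update = nn.GRUCell(msg_dim, node_dim)

    def forward(self, x, edge_index, edge_attr):
        src, dst = edge_index                      # edge_index: (2, num_edges)
        msg = self.msg_mlp(torch.cat([x[src], x[dst], edge_attr], dim=-1))

        # Sum-aggregate messages at each destination node.
        agg = torch.zeros(x.size(0), msg.size(-1), device=x.device)
        agg.index_add_(0, dst, msg)
        x_new = self.node_update(agg, x)

        # Variational MI lower bound: E[log q(E | M)] + H(E). H(E) does not depend
        # on model parameters, so maximizing MI amounts to minimizing this term.
        mi_loss = ((self.edge_decoder(msg) - edge_attr) ** 2).mean()
        return x_new, mi_loss


if __name__ == "__main__":
    # Toy usage: 4 nodes, 3 edges, random features.
    layer = MIMessagePassingLayer(node_dim=8, edge_dim=5, msg_dim=16)
    x = torch.randn(4, 8)
    edge_index = torch.tensor([[0, 1, 2], [1, 2, 3]])
    edge_attr = torch.randn(3, 5)
    x_new, mi_loss = layer(x, edge_index, edge_attr)
    total_loss = mi_loss  # in practice: task_loss + lambda * mi_loss
    total_loss.backward()
```

In practice the MI term would be weighted by a coefficient and added to the downstream task loss, so that messages are encouraged to remain predictive of the edge features that produced them.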
Code: https://drive.google.com/file/d/1HtOWRuLBcuggsSIrEjHC-1-Lq8W-8KYb/view?usp=sharing