On Backpropagation-Free Graph Convolutions

23 Jun 2022, 10:47 (modified: 13 Sept 2022, 09:31) · ECMLPKDD 2022 Workshop MLG Submission
Keywords: Graph Convolutional Networks, Graph Neural Network, Deep Learning, Structured Data, Machine Learning on Graphs
Abstract: In this paper, we present neural models for graphs that do not rely on backpropagation for training. This makes learning more biologically plausible and amenable to parallel implementation in hardware. The base component of our architecture is a generalization of Gated Linear Networks that allows the adoption of multiple graph convolutions. Each neuron comprises a \emph{set} of graph convolution filters (weight vectors) and a gating mechanism that selects, based on the node and its topological context, which weight vector to use for processing. We focus on a message-passing aggregation scheme where the gating mechanism is embedded directly into the graph convolution. We compare the effectiveness of different definitions of node contexts (depending on input or hidden features) and of gating functions (based on hyperplanes or on prototypes). We evaluate the proposed convolutions on several node classification benchmark datasets. The experimental results show that our backpropagation-free graph convolutions are competitive with backpropagation-based counterparts. Moreover, we present a theoretical result on the expressiveness of the proposed models.
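To make the abstract's core idea concrete, the following is a minimal sketch of one gated graph neuron in the spirit described above: a hyperplane-based gate on the node's input features selects which weight vector processes the message-passing aggregation, and each neuron is trained by a purely local logistic-loss update, with no gradients propagated between layers. All names (`GatedGraphNeuron`, `local_update`) and design details are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

class GatedGraphNeuron:
    """Illustrative sketch (not the paper's code) of a backpropagation-free
    gated neuron for graphs: random hyperplanes on a node's input features
    define its context, which indexes one of several weight vectors."""

    def __init__(self, in_dim, n_hyperplanes, rng):
        # hyperplane-based gating: each hyperplane contributes one context bit
        self.hyperplanes = rng.standard_normal((n_hyperplanes, in_dim))
        # one weight vector per context (2**n_hyperplanes contexts in total)
        self.weights = np.full((2 ** n_hyperplanes, in_dim), 1.0 / in_dim)

    def context(self, x):
        # binary code from hyperplane sides -> index of the weight vector
        bits = (self.hyperplanes @ x > 0).astype(int)
        return int(bits @ (2 ** np.arange(len(bits))))

    def forward(self, x, neigh):
        # message passing: mean-aggregate the node with its neighbours
        h = (x + neigh.sum(axis=0)) / (1 + len(neigh))
        c = self.context(x)                        # gate on the *input* features
        p = 1.0 / (1.0 + np.exp(-self.weights[c] @ h))
        return p, c, h

    def local_update(self, x, neigh, target, lr=0.1):
        # per-neuron logistic-loss step: no gradients flow between layers,
        # so every neuron can be trained locally and in parallel
        p, c, h = self.forward(x, neigh)
        self.weights[c] -= lr * (p - target) * h
        return p
```

The gate depends only on the node and its features, so the update touches a single weight vector per example; this locality is what removes the need for backpropagation. A prototype-based gate would replace `context` with a nearest-prototype lookup.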
Dual Submission: An extended version of this work is under review at IEEE International Conference on Data Mining (ICDM)