Abstract: Learning graph structures for Graph Neural Networks (GNNs) can improve their performance, but it is challenging to search over the large discrete space of graphs. Prior works often impose fixed structural constraints to promote properties such as sparsity, but these constraints can be misspecified and overly restrictive, potentially degrading performance. Here, we propose a simpler alternative based on marginal likelihood, which naturally favors such properties without requiring any explicit graph constraints. We show that a variational formulation with Laplace's method automatically leads to a marginal-likelihood-based objective over discrete graph structures, which can be optimized efficiently using the Gumbel-Softmax trick. We call this approach the Laplace Approximation-based Graph Structure (LAGS) method, and show empirically that it improves upon recent state-of-the-art GNNs.
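As a rough illustration of the optimization step mentioned in the abstract, the sketch below shows how a discrete adjacency structure can be relaxed and sampled with the Gumbel-Softmax trick so that edge parameters receive gradients. This is a minimal, assumption-laden sketch in PyTorch: the per-edge logit parameterization, temperature, and symmetrization are illustrative choices, not the paper's exact LAGS formulation or objective.

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_adjacency(edge_logits: torch.Tensor, tau: float = 0.5, hard: bool = True):
    """Sample a (relaxed) binary adjacency matrix from per-edge logits.

    edge_logits: (N, N) tensor of unnormalized log-odds for each potential edge.
    tau:         Gumbel-Softmax temperature; lower values yield harder samples.
    hard:        if True, use the straight-through estimator so the forward pass
                 is discrete while gradients flow through the soft sample.
    """
    # Stack logits for the two states of each edge: "absent" vs. "present".
    two_class_logits = torch.stack([torch.zeros_like(edge_logits), edge_logits], dim=-1)
    # Relaxed one-hot sample per edge; index 1 is the "edge present" component.
    sample = F.gumbel_softmax(two_class_logits, tau=tau, hard=hard, dim=-1)
    adjacency = sample[..., 1]
    # Symmetrize for an undirected graph (an illustrative choice, not prescribed by the abstract).
    return torch.maximum(adjacency, adjacency.transpose(0, 1))

# Hypothetical usage: edge_logits would be learned jointly with the GNN weights
# by backpropagating a marginal-likelihood-style objective through the sampled adjacency.
edge_logits = torch.nn.Parameter(torch.zeros(5, 5))
adj = gumbel_softmax_adjacency(edge_logits, tau=0.5)
```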
Submission Type: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Samuel_Vaiter1
Submission Number: 7194