Training Graph Neural Networks Subject to a Tight Lipschitz Constraint

Published: 15 May 2024, Last Modified: 15 May 2024. Accepted by TMLR.
Abstract: We propose a strategy for training a wide range of graph neural networks (GNNs) under tight Lipschitz bound constraints. Specifically, by leveraging graph spectral theory, we derive computationally tractable expressions for a tight Lipschitz constant. This allows us to propose a constrained-optimization approach to control the constant, ensuring robustness to adversarial perturbations. Unlike existing methods for controlling the Lipschitz constant, our approach reduces the size of the matrices involved by a factor equal to the square of the number of nodes in the graph. We employ a stochastic projected subgradient algorithm, which operates in a block-coordinate manner, with the projection step performed via an accelerated iterative proximal algorithm. We focus on defending against attacks that perturb node features while keeping the graph topology fixed. This contrasts with most existing defenses, which tackle perturbations of the graph structure. We report experiments on various datasets in the context of node classification tasks, showing the effectiveness of our constrained GNN model.
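The abstract describes a projected subgradient scheme: take a (sub)gradient step on the training loss, then project the parameters back onto the set where the Lipschitz constant satisfies the bound. The paper's projection relies on its tight, graph-spectral Lipschitz expressions and an accelerated proximal algorithm; as a rough illustration of the general pattern only (not the authors' method), the sketch below uses a much cruder surrogate constraint, clipping each layer's spectral norm so the product of layer norms, a standard loose Lipschitz upper bound, stays controlled. All function names and parameters here are hypothetical.

```python
import numpy as np

def project_spectral_norm(W, bound):
    """Euclidean projection of W onto {M : ||M||_2 <= bound},
    obtained by clipping its singular values at `bound`."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s = np.minimum(s, bound)
    return U @ (s[:, None] * Vt)

def projected_subgradient_step(weights, grads, lr, layer_bound):
    """One iteration: subgradient descent on each layer, then projection.
    If every layer's spectral norm is <= layer_bound, the product
    layer_bound ** len(weights) upper-bounds the network's Lipschitz
    constant (for 1-Lipschitz activations) -- a loose surrogate for
    the tight constant derived in the paper."""
    return [project_spectral_norm(W - lr * G, layer_bound)
            for W, G in zip(weights, grads)]

# Illustrative usage on random layer weights and gradients.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 4)) for _ in range(2)]
grads = [rng.standard_normal((4, 4)) for _ in range(2)]
weights = projected_subgradient_step(weights, grads, lr=0.1, layer_bound=1.0)
norms = [np.linalg.norm(W, 2) for W in weights]
```

After the step, every entry of `norms` is at most `layer_bound` (up to floating-point error), so the surrogate Lipschitz bound holds by construction.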
Submission Length: Long submission (more than 12 pages of main content)
Video: https://youtu.be/_4AI1BmUnLo
Code: https://github.com/simona-juvina/lipschitz-gnn
Assigned Action Editor: ~Andriy_Mnih1
Submission Number: 1994