Supervised graph learning with bilevel optimization

20 Jun 2022 (modified: 05 May 2023), ECMLPKDD 2022 Workshop MLG Submission
Keywords: Graph learning, Bilevel optimization.
Abstract: Graph-based learning has attracted attention recently due to its efficiency in analyzing data that lie on graphs. Unfortunately, real-world graphs are usually not given, noisy, or incomplete. In this work, we design a novel algorithm that addresses this issue by training a $G2G$ (Graph to Graph) model within a bilevel optimization framework to learn a better graph in a supervised manner. The trained model not only operates on the training data but also generalizes to unseen data points. A bilevel problem comprises two optimization problems, referred to as the outer and the inner problem. The inner problem aims to solve the downstream task, e.g., training a $GCN$ (Graph Convolutional Network) model, whereas the outer problem introduces a new objective function that evaluates the performance of the inner model, and the $G2G$ model is trained to minimize this function. To solve this optimization problem, we replace the solution of the inner problem with the output of a gradient-based algorithm, which is known to provide a good surrogate. We then use automatic differentiation to compute the gradient of this output with respect to the $G2G$ weights, which we in turn learn with a gradient-based algorithm. Experiments on semi-supervised learning datasets show that the graph learned by the $G2G$ model outperforms the original graph by a significant margin.
Dual Submission: No.
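
The unrolled bilevel scheme described in the abstract can be illustrated with a short, self-contained sketch. The following is a minimal PyTorch mock-up, not the authors' implementation: the `G2G` and `gcn_forward` definitions, the toy features and labels, the train/validation split, and all hyperparameters are assumptions made purely for illustration. An inner GCN is trained for a few differentiable gradient steps on the graph produced by the G2G model, and the outer (validation) loss is backpropagated through the whole unrolled inner trajectory into the G2G weights via automatic differentiation.

```python
# Hypothetical sketch of supervised graph learning with bilevel optimization.
# All model definitions, data, and hyperparameters below are illustrative assumptions.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Toy data: n nodes, d features, c classes (assumed for illustration).
n, d, c = 30, 16, 3
X = torch.randn(n, d)
y = torch.randint(0, c, (n,))
train_idx, val_idx = torch.arange(0, 20), torch.arange(20, 30)


class G2G(torch.nn.Module):
    """Graph-to-graph model: maps node features to a normalized soft adjacency."""

    def __init__(self, d, h=32):
        super().__init__()
        self.enc = torch.nn.Linear(d, h)

    def forward(self, X):
        Z = torch.tanh(self.enc(X))
        A = torch.sigmoid(Z @ Z.t())           # learned soft adjacency
        A = A + torch.eye(A.size(0))           # add self-loops
        d_inv_sqrt = A.sum(1).pow(-0.5)
        return d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]  # D^{-1/2} A D^{-1/2}


def gcn_forward(params, A_hat, X):
    """Two-layer GCN written functionally so the inner updates stay differentiable."""
    W1, W2 = params
    return A_hat @ torch.relu(A_hat @ X @ W1) @ W2


g2g = G2G(d)
outer_opt = torch.optim.Adam(g2g.parameters(), lr=1e-2)

for outer_step in range(50):
    A_hat = g2g(X)                             # outer variable: the learned graph

    # Inner problem: train the GCN on the learned graph with a few unrolled steps.
    params = [torch.nn.init.xavier_uniform_(torch.empty(d, 32)).requires_grad_(),
              torch.nn.init.xavier_uniform_(torch.empty(32, c)).requires_grad_()]
    inner_lr = 0.1
    for _ in range(10):
        inner_loss = F.cross_entropy(
            gcn_forward(params, A_hat, X)[train_idx], y[train_idx])
        grads = torch.autograd.grad(inner_loss, params, create_graph=True)
        params = [p - inner_lr * g for p, g in zip(params, grads)]

    # Outer problem: evaluate the unrolled GCN on held-out nodes and backpropagate
    # through the whole inner trajectory into the G2G weights.
    outer_loss = F.cross_entropy(
        gcn_forward(params, A_hat, X)[val_idx], y[val_idx])
    outer_opt.zero_grad()
    outer_loss.backward()
    outer_opt.step()
```

Unrolling a fixed number of inner gradient steps is one common way to obtain a differentiable surrogate for the inner solution; implicit differentiation or truncated hypergradient approximations are alternatives, and the abstract leaves the specific gradient-based inner solver open.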