AdaGrid: Adaptive Grid Search for Link Prediction Training Objective

Anonymous

30 Sept 2021 (modified: 12 Mar 2024), NeurIPS 2021 Workshop MetaLearn Blind Submission
Keywords: graph neural networks, link prediction, meta-learning, training objective
Abstract: One of the most important factors contributing to the success of a machine learning model is a good training objective. The training objective crucially influences a model’s performance and generalization capabilities. Automating the design of a good training objective involves optimizing a machine learning process, and can therefore be viewed as a meta-learning problem. In this paper, we focus specifically on graph neural network training objectives for link prediction, which have not been explored in the existing literature. Here, the training objective includes, among others, the training mode, the negative sampling strategy, and various hyperparameters, such as the edge message ratio. Commonly, these hyperparameters are fine-tuned by complete grid search, which is very time-consuming and model-dependent. To mitigate these limitations, we propose Adaptive Grid Search (AdaGrid), which dynamically adjusts the edge message ratio during training. It is model-agnostic and highly scalable, with a fully customizable computational budget. Through extensive experiments, we show that AdaGrid can boost model performance by up to $1.9\%$ while being nine times more efficient than a complete grid search. Overall, AdaGrid represents an effective automated algorithm for designing machine learning training objectives.
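The abstract only names the mechanism (periodically re-selecting the edge message ratio during training under a customizable budget), so below is a minimal, hypothetical Python sketch of such a schedule rather than the paper's actual algorithm. The probe/commit split, the candidate ratio set, and the `train_for`/`validate` callables are all assumptions introduced for illustration.

```python
import copy

def adagrid_train(model, train_for, validate,
                  ratios=(0.2, 0.4, 0.6, 0.8),
                  total_epochs=100, period=10, probe_epochs=2):
    """Periodically re-select the edge message ratio during training.

    Instead of one full training run per candidate (complete grid
    search), each candidate ratio is probed briefly from the current
    weights, and the best-scoring one is used until the next probe.
    The extra cost is bounded by `period` and `probe_epochs`.
    """
    epochs_done = 0
    while epochs_done < total_epochs:
        # Probe phase: one short trial per candidate edge message ratio,
        # each branching from the current model weights.
        scores = {}
        for r in ratios:
            trial = copy.deepcopy(model)
            train_for(trial, ratio=r, epochs=probe_epochs)
            scores[r] = validate(trial)  # e.g. validation link-prediction AUC
        best_ratio = max(scores, key=scores.get)

        # Commit phase: continue training with the selected ratio.
        train_for(model, ratio=best_ratio, epochs=period)
        epochs_done += period
    return model
```

Branching the trials from the current weights is what would keep this schedule model-agnostic and far cheaper than complete grid search, which must train a separate model to convergence for every candidate ratio.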
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2203.16162/code)