Classification ensemble learning via automated graph contrastive learning

Wei Xu, Biao Zhou, Lixiang Xu, Qingzhe Cui, Bing Ai, Shengwei Ji, Yuanyan Tang

Published: 01 Sept 2025, Last Modified: 06 Nov 2025. International Journal of Wavelets, Multiresolution and Information Processing. License: CC BY-SA 4.0
Abstract: In recent years, graph contrastive learning, as a self-supervised learning method, has shown superior performance on unlabeled graph classification tasks and has received extensive attention. However, existing graph contrastive learning methods rely heavily on manually selecting augmentation methods for each dataset, or on empirical heuristics for data augmentation. To avoid over-reliance on human selection of data augmentation methods, while allowing the model to automatically select negative samples during the training phase, we propose a novel approach called Classification Ensemble Learning via Automated Graph Contrastive Learning (AuCo-CEL). In this paper, the sampling distribution over predefined augmentations is optimized by a Bayesian method to select augmentation schemes automatically. Then, we use a scoring function to evaluate negative-sample difficulty and automatically select negative samples from easy to hard during training. Finally, several random forests are combined by the AdaBoost algorithm to complete graph classification tasks. Experiments on seven graph datasets verify that our method outperforms other benchmark models.