Abstract: Neighborhood aggregation is a key operation in most graph neural network-based embedding solutions. Each type of aggregator typically performs best in a particular application domain, so the single aggregator adopted by most existing embedding solutions inevitably risks information loss. To preserve the diversity of information during aggregation, the most appropriate aggregators must be used for specific graphs or subgraphs. However, when and which aggregators to use remains largely unsolved. To tackle this problem, we introduce Cooker, a general contrastive learning framework that supports self-supervised adaptive aggregator learning. Specifically, we design three pretext tasks for self-supervised learning and apply multiple aggregators in our model. In this way, our algorithm preserves the distinctive features of different aggregators in the node embeddings and minimizes information loss. Experimental results on node classification and link prediction tasks show that Cooker outperforms state-of-the-art baselines on all three datasets compared. A set of ablation experiments further demonstrates that integrating more types of aggregators generally improves the algorithm's performance and stability.
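To make the multi-aggregator idea concrete, the following is a minimal sketch of neighborhood aggregation with several aggregators whose outputs are concatenated, so each aggregator's distinctive view of the neighborhood survives in the embedding. This is an illustration under our own assumptions, not the authors' Cooker implementation; the function name `multi_aggregate` and the choice of mean/max/sum aggregators are hypothetical.

```python
import numpy as np

def multi_aggregate(features, adjacency):
    """Aggregate each node's neighborhood with mean, max, and sum
    aggregators and concatenate the results.

    Illustrative sketch only; Cooker's actual aggregator set and
    combination scheme are described in the paper, not here.
    """
    outputs = []
    for node in range(len(features)):
        neighbors = np.flatnonzero(adjacency[node])
        if neighbors.size == 0:
            # Isolated node: fall back to its own features.
            neigh_feats = features[node][None, :]
        else:
            neigh_feats = features[neighbors]
        # Each aggregator preserves a different property of the neighborhood.
        outputs.append(np.concatenate([
            neigh_feats.mean(axis=0),  # smooth, degree-normalized summary
            neigh_feats.max(axis=0),   # salient per-dimension features
            neigh_feats.sum(axis=0),   # degree-sensitive summary
        ]))
    return np.stack(outputs)

# Toy example: 4 nodes with 3-dimensional features on a chain graph.
X = np.random.rand(4, 3)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(multi_aggregate(X, A).shape)  # (4, 9): three aggregators concatenated
```

Concatenation is only one way to combine aggregators; a learned, adaptive weighting over aggregators (as the abstract's "adaptive aggregator learning" suggests) would replace the fixed concatenation above.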