Abstract: Although graph neural networks (GNNs) can extract latent relationship-level knowledge among graph nodes and have achieved excellent performance in unsupervised scenarios, they are weaker than convolutional neural networks (CNNs) at learning instance-level knowledge. Moreover, the lack of an explicit graph structure limits the extension of GNNs to non-graph datasets. To address these problems, we propose a novel unsupervised multi-level knowledge fusion network. It unifies instance-level and relationship-level knowledge on non-graph data by distilling from a pre-trained CNN teacher to a GNN student. Meanwhile, a sparse weighted strategy is designed to adaptively extract a sparse graph topology, extending the GNN to non-graph datasets. By optimizing the distillation loss, the "boosted" GNN student learns multi-level knowledge and extracts more discriminative deep embeddings for clustering. Finally, extensive experiments show that it achieves excellent performance compared with current methods.
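The pipeline the abstract describes (build a sparse weighted graph from non-graph features, propagate through a GNN student, and distill from a CNN teacher's soft predictions) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual method: the k-NN exponential weighting, the single GCN layer, and all function names (`knn_sparse_adjacency`, `gcn_layer`, `kl_distill_loss`) are assumptions introduced here for clarity.

```python
import numpy as np

def knn_sparse_adjacency(X, k=3):
    # Hypothetical sparse weighted strategy: keep only the k nearest
    # neighbors of each sample, weighted by exp(-distance).
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)                # no self-loops at this stage
    A = np.zeros_like(d)
    idx = np.argsort(d, axis=1)[:, :k]
    for i, nbrs in enumerate(idx):
        A[i, nbrs] = np.exp(-d[i, nbrs])       # closer neighbors get larger weights
    return np.maximum(A, A.T)                  # symmetrize the adjacency

def gcn_layer(A, X, W):
    # One GCN-style propagation with symmetric normalization and ReLU.
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def kl_distill_loss(teacher_logits, student_logits, T=2.0):
    # Temperature-softened KL divergence between teacher and student
    # predictions, a standard choice for a distillation loss.
    p = softmax(teacher_logits / T)
    q = softmax(student_logits / T)
    return float(np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=1)))

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))                   # non-graph data: plain feature vectors
A = knn_sparse_adjacency(X, k=3)               # adaptively extracted sparse topology
W = rng.normal(size=(4, 3))
student_logits = gcn_layer(A, X, W)            # GNN student embeddings
teacher_logits = rng.normal(size=(10, 3))      # stand-in for pre-trained CNN outputs
loss = kl_distill_loss(teacher_logits, student_logits)
```

In a real training loop the loss would be backpropagated through the student's weights `W`; the sketch only shows the forward computation of the graph, the propagation, and the distillation objective.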