Rethinking multi-level information fusion in temporal graphs: Pre-training then distilling for better embedding

Published: 01 Jan 2025 · Last Modified: 22 Aug 2025 · Information Fusion 2025 · CC BY-SA 4.0
Abstract:
Highlights
• Multi-level information is important in temporal graphs, yet it is difficult to obtain efficiently with existing methods.
• We point out that such information can be flexibly captured by a "pre-training then distilling" pattern.
• We realize knowledge distillation across different levels of information by introducing a pre-training module.
• Experimental results demonstrate that our method improves performance while avoiding excessive computation.
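To make the "pre-training then distilling" pattern concrete, below is a minimal PyTorch sketch of multi-level knowledge distillation: a frozen, pre-trained teacher supervises a lightweight student at both the node level and the graph level. All names, dimensions, the mean-pooling readout, and the MSE matching objective are illustrative assumptions; the paper's actual architecture and losses are not described in these highlights.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical dimensions; the highlights do not specify these.
NODE_DIM, HIDDEN_DIM, EMB_DIM = 64, 128, 32

class TeacherEncoder(nn.Module):
    """Pre-trained teacher producing multi-level embeddings
    (node-level and graph-level) from one temporal snapshot."""
    def __init__(self):
        super().__init__()
        self.node_proj = nn.Sequential(
            nn.Linear(NODE_DIM, HIDDEN_DIM), nn.ReLU(),
            nn.Linear(HIDDEN_DIM, EMB_DIM),
        )

    def forward(self, x):
        node_emb = self.node_proj(x)       # node-level embeddings
        graph_emb = node_emb.mean(dim=0)   # graph-level summary via mean pooling
        return node_emb, graph_emb

class StudentEncoder(nn.Module):
    """Lightweight student trained to match the teacher at each level."""
    def __init__(self):
        super().__init__()
        self.node_proj = nn.Linear(NODE_DIM, EMB_DIM)

    def forward(self, x):
        node_emb = self.node_proj(x)
        graph_emb = node_emb.mean(dim=0)
        return node_emb, graph_emb

def distill_loss(student_out, teacher_out, level_weights=(1.0, 1.0)):
    """Distillation objective: match the student to the frozen teacher
    at the node level and the graph level (weights are assumptions)."""
    s_node, s_graph = student_out
    t_node, t_graph = teacher_out
    node_loss = F.mse_loss(s_node, t_node)
    graph_loss = F.mse_loss(s_graph, t_graph)
    return level_weights[0] * node_loss + level_weights[1] * graph_loss

# Toy training step on random snapshot features (100 nodes).
teacher, student = TeacherEncoder(), StudentEncoder()
teacher.eval()  # the pre-trained teacher stays frozen during distillation
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(100, NODE_DIM)
with torch.no_grad():
    t_out = teacher(x)
loss = distill_loss(student(x), t_out)
opt.zero_grad()
loss.backward()
opt.step()
print(f"distillation loss: {loss.item():.4f}")
```

In this sketch the compute-heavy encoding is paid once during pre-training; the student only needs a single linear projection per node, which is one plausible way a distillation stage could "improve performance while avoiding excessive computation" as the highlights claim.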