Learning continuous dynamic network representation with transformer-based temporal graph neural network

Published: 01 Jan 2023, Last Modified: 10 Feb 2025 · Inf. Sci. 2023 · CC BY-SA 4.0
Abstract: Highlights
• Continuous DGNN methods can capture fine-granularity temporal information.
• Continuous DGNN methods ignore the importance of global information in dynamic networks.
• A neural ODE (NODE) is utilized to model the time-trajectory change process of dynamic networks.
• Transformer mechanisms can aggregate temporal and structural information.
• Global topology information is extracted by stacking transformers in multiple layers.
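The highlights combine two ingredients: a neural ODE that evolves node embeddings along a continuous time trajectory, and transformer-style attention that aggregates temporal-neighbor and structural information. The sketch below is a minimal illustration of that combination under assumed design choices (fixed-step Euler integration, a sampled temporal-neighbor set per node); it is not the paper's implementation, and the names ODEFunc, TemporalGraphBlock, n_euler_steps, delta_t, and neigh are hypothetical.

```python
# Minimal sketch: ODE-evolved node states + attention over temporal neighbors.
# Assumed, hypothetical components; not the authors' code.
import torch
import torch.nn as nn


class ODEFunc(nn.Module):
    """Parameterizes dh/dt for the continuous embedding trajectory."""

    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return self.net(h)


class TemporalGraphBlock(nn.Module):
    """Evolves node states with a neural ODE, then attends over temporal neighbors."""

    def __init__(self, dim: int, n_heads: int = 4, n_euler_steps: int = 8):
        super().__init__()
        self.ode_func = ODEFunc(dim)
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)
        self.n_euler_steps = n_euler_steps

    def evolve(self, h: torch.Tensor, delta_t: torch.Tensor) -> torch.Tensor:
        # Fixed-step Euler integration of dh/dt = f(h) over the elapsed time
        # since each node was last updated; delta_t has shape [N, 1].
        step = delta_t / self.n_euler_steps
        for _ in range(self.n_euler_steps):
            h = h + step * self.ode_func(h)
        return h

    def forward(self, h: torch.Tensor, delta_t: torch.Tensor,
                neigh: torch.Tensor) -> torch.Tensor:
        # h:     [N, D]    current node embeddings
        # neigh: [N, K, D] embeddings of each node's K sampled temporal neighbors
        h = self.evolve(h, delta_t)
        query = h.unsqueeze(1)                   # [N, 1, D]
        agg, _ = self.attn(query, neigh, neigh)  # attend over temporal neighbors
        return self.norm(h + agg.squeeze(1))     # residual + layer norm


# Toy usage: 5 nodes, 16-dim embeddings, 3 temporal neighbors each.
if __name__ == "__main__":
    block = TemporalGraphBlock(dim=16)
    h = torch.randn(5, 16)
    delta_t = torch.rand(5, 1)
    neigh = torch.randn(5, 3, 16)
    print(block(h, delta_t, neigh).shape)  # torch.Size([5, 16])
```

Stacking several such blocks would correspond to the last highlight: each additional attention layer widens a node's receptive field beyond its immediate temporal neighbors, letting the model pick up more global topology information.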