Abstract: Network topology inference, which reconstructs connectivity structures from observational data, is pivotal for optimizing 5G/6G network resource allocation, enabling real-time coordination in autonomous drone swarms, and supporting large-scale IoT deployments. However, topology inference in dynamic communication networks faces two critical challenges: frequent topology shifts caused by link instability, and the loss of fine-grained transmission information caused by the discrete handling of network snapshots. In this paper, we propose an end-to-end deep learning model, GAE-LSTMs, for dynamic topology inference, which combines Graph Attention Networks (GAT), Ordinary Differential Equations (ODEs), and stacked Long Short-Term Memory (LSTM) networks. First, to capture the structural semantics of the network topology, we use node2vec to generate low-dimensional node embeddings. Then, we leverage GAT and an ODE to extract fine-grained spatial evolution features by modeling continuous interactions between nodes. In addition, we design a stacked LSTM module to learn short- and long-term temporal dependencies across network snapshots. Evaluated on four real-world dynamic networks (Contact, Hypertext09, Radoslaw, and Fbforum), GAE-LSTMs achieves state-of-the-art performance, with an average AUC of 95.70%, an average GMAUC of 95.69%, and an average error rate of 4.63%. The results also demonstrate robustness to varying network change frequencies, with performance fluctuations under 7% across time windows.
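The GAT + ODE combination described above can be illustrated with a minimal sketch: one single-head graph-attention aggregation step, whose output drives an Euler-integrated feature ODE so node features evolve continuously rather than jumping between discrete snapshots. This is a hypothetical NumPy illustration of the general technique, not the authors' implementation; all function and variable names (`gat_layer`, `ode_evolve`, `W`, `a`) are assumptions, and the adjacency matrix is assumed to include self-loops so every node has a non-empty neighbourhood.

```python
import numpy as np

def gat_layer(H, A, W, a):
    """Single-head graph attention (sketch).

    H: (N, d) node features; A: (N, N) adjacency with self-loops;
    W: (d, d') projection; a: (2*d',) attention vector.
    """
    Z = H @ W
    N = Z.shape[0]
    # Pairwise scores e_ij = LeakyReLU(a . [z_i || z_j]) for edges only.
    scores = np.full((N, N), -np.inf)
    for i in range(N):
        for j in range(N):
            if A[i, j]:
                s = a @ np.concatenate([Z[i], Z[j]])
                scores[i, j] = s if s > 0 else 0.2 * s  # LeakyReLU
    # Softmax over each node's neighbourhood (non-edges get weight 0).
    alpha = np.exp(scores - scores.max(axis=1, keepdims=True))
    alpha[~A.astype(bool)] = 0.0
    alpha /= alpha.sum(axis=1, keepdims=True) + 1e-12
    return alpha @ Z  # attention-weighted neighbour aggregation

def ode_evolve(H, A, W, a, steps=10, dt=0.1):
    """Euler-integrate dH/dt = GAT(H) - H to model continuous evolution."""
    for _ in range(steps):
        H = H + dt * (gat_layer(H, A, W, a) - H)
    return H
```

In a full pipeline along the lines of the abstract, `H` would be initialized from node2vec embeddings, `ode_evolve` would be applied per snapshot to produce spatial features, and the resulting per-snapshot feature sequence would be fed to stacked LSTM layers for temporal modeling.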
External IDs: dblp:journals/tnse/WuWDLLL26