Flow Divergence: Comparing Hierarchical Network Partitions based on Relative Entropy

Published: 23 Oct 2025, Last Modified: 08 Nov 2025, LOG 2025 Poster, License: CC BY 4.0
Keywords: Partition Similarity, Relative Entropy, Random Walk
TL;DR: We propose a link-aware partition-similarity measure based on random walks and relative entropy. Our measure can distinguish between partitions where traditional measures fail.
Abstract: Networks model how the entities in complex systems are connected and can be partitioned into communities in different ways. Common approaches for comparing network partitions compute agreement between partitions in terms of set overlaps; however, they ignore link patterns, which are essential for the organisation of networks. We propose \emph{flow divergence}, an information-theoretic divergence measure for comparing network partitions, inspired by the Kullback-Leibler (KL) divergence and based on describing random walks on networks. Like the KL divergence, flow divergence adopts a coding perspective and compares two network partitions $\mathsf{A}$ and $\mathsf{B}$ by considering the expected number of extra bits required to describe a random walk on a network using "estimate" $\mathsf{B}$ instead of the network's assumed "true" partition $\mathsf{A}$. Because flow divergence is based on random walks, it can compare hierarchical and non-hierarchical partitions of arbitrary depth. Applied to synthetic and empirical networks, we show that flow divergence distinguishes partitions where traditional measures fail.
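The coding perspective the abstract refers to is the standard KL-divergence interpretation: the expected number of extra bits incurred when encoding samples from a "true" distribution with a code optimized for an "estimate". The following minimal sketch illustrates only that background idea; the function name `kl_extra_bits` and the toy visit-rate distributions are assumptions for illustration, not the paper's flow-divergence implementation (see the software link below for that).

```python
import numpy as np

# Illustrative sketch of the KL coding interpretation that flow divergence builds on.
# Names and toy numbers are assumptions, not the paper's method.

def kl_extra_bits(p, q):
    """Expected extra bits per symbol when encoding samples drawn from p
    with a code optimized for q instead of one optimized for p."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log2(p / q)))

# "True" node-visit rates of a random walk vs. a uniform "estimate".
p_true = np.array([0.4, 0.3, 0.2, 0.1])
q_estimate = np.array([0.25, 0.25, 0.25, 0.25])

optimal_bits  = -np.sum(p_true * np.log2(p_true))      # entropy H(p): best achievable codelength
estimate_bits = -np.sum(p_true * np.log2(q_estimate))  # cross-entropy H(p, q): codelength using the estimate
print(estimate_bits - optimal_bits)        # extra bits from using the estimate
print(kl_extra_bits(p_true, q_estimate))   # equals the KL divergence D(p || q)
```

Flow divergence applies the same "extra bits" reasoning to the codelengths of a random walk described with partition B instead of partition A, rather than to plain symbol distributions as in this sketch.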
Submission Type: Full paper proceedings track submission (max 9 main pages).
Software: https://github.com/mapequation/map-equation-similarity
Submission Number: 29