From Divergence to Normalized Similarity: A Symmetric and Scalable Topological Toolkit for Representation Analysis

ICLR 2026 Conference Submission 25347 Authors

20 Sept 2025 (modified: 08 Oct 2025), ICLR 2026 Conference Submission, CC BY 4.0
Keywords: Representation Learning, Topological Data Analysis (TDA), Representation Similarity, Persistent Homology, Neural Network Analysis, Large Language Models (LLMs)
TL;DR: We introduce a topological toolkit to advance representation analysis: SRTD completes RTD's theoretical framework, while our novel, scale-invariant similarity score, NTS, provides a practical tool for robust, normalized comparisons.
Abstract: Representation Topology Divergence (RTD) offers a powerful lens for analyzing topological differences in neural network representations. However, its asymmetry and lack of a normalized scale limit its interpretability and direct comparability across different models. Our work addresses these limitations on two fronts. First, we complete the theoretical framework of RTD by introducing Symmetric Representation Topology Divergence (SRTD) and its lightweight variant, SRTD-lite. We prove their mathematical properties, demonstrating that they provide a more efficient, comprehensive, and interpretable divergence measure that matches the top performance of existing RTD-based methods in optimization tasks. Second, to overcome the inherent scaling issues of divergence measures, we propose Normalized Topological Similarity (NTS), a novel, normalized similarity score robust to representation scale and size. NTS captures the hierarchical clustering structure of representations by comparing their topological merge orders. We demonstrate that NTS can reliably identify inter-layer similarities and, when analyzing representations of Large Language Models (LLMs), provides a more discriminative score than Centered Kernel Alignment (CKA), offering a clearer view of inter-model relationships.
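To make the "comparing topological merge orders" idea concrete, below is a minimal, hypothetical Python sketch of a merge-order similarity, not the paper's NTS definition. It assumes that degree-0 persistence corresponds to single-linkage clustering and that the "merge order" of a pair of points is the step at which they first join the same cluster; the score is a Spearman rank correlation of these merge orders, which depends only on the ordering of merges and is therefore invariant to the scale of either representation.

```python
# Hypothetical illustration of a merge-order similarity in the spirit of NTS.
# Assumptions (not from the paper): degree-0 persistence <-> single-linkage
# clustering; "merge order" of a pair = step at which the pair first joins
# the same cluster; similarity = Spearman correlation of those orders.
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def pairwise_merge_steps(X):
    """For every pair of points in X, record the single-linkage step at which they merge."""
    n = X.shape[0]
    Z = linkage(pdist(X), method="single")   # single linkage ~ degree-0 persistence
    members = {i: {i} for i in range(n)}      # cluster id -> member points
    steps = np.zeros((n, n))
    for step, (a, b, _, _) in enumerate(Z, start=1):
        ca, cb = members.pop(int(a)), members.pop(int(b))
        for i in ca:
            for j in cb:
                steps[i, j] = steps[j, i] = step   # pair (i, j) first united here
        members[n + step - 1] = ca | cb            # scipy's id for the new cluster
    iu = np.triu_indices(n, k=1)
    return steps[iu]

def merge_order_similarity(X, Y):
    """Rank-correlate the merge orders of two representations of the same n points."""
    rho, _ = spearmanr(pairwise_merge_steps(X), pairwise_merge_steps(Y))
    return rho   # scale-invariant: only the ordering of merges matters

# Toy usage: two random representations (3-d and 5-d) of the same 50 points.
rng = np.random.default_rng(0)
X, Y = rng.normal(size=(50, 3)), rng.normal(size=(50, 5))
print(merge_order_similarity(X, X))  # 1.0 for identical merge orders
print(merge_order_similarity(X, Y))
```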
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 25347