Keywords: Diffusion Maps, Self-Attention, Magnetic Laplacian, Manifold Learning, Kernel Methods, Random-Walk
Abstract: We show that the diffusion-map affinity matrix is obtained from the self-attention matrix via a twisted Hadamard product. Concretely, let the generalized feature-similarity matrix be $\mathcal W = M + A$ with $M=M^\dagger$ Hermitian (the real part, encoding geometry) and $A=-A^\dagger$ skew-Hermitian (the imaginary part, encoding directionality). Softmax applied to the real logits from $M$ yields a first-order, row/column-stochastic attention operator. The diffusion kernel then arises as the twisted Hadamard product (a Product-of-Experts identity), producing a symmetric second-order affinity whose spectrum matches that of diffusion maps. The skew part $A$ contributes only phases; placing them outside the softmax yields a $U(1)$ gauge-equivariant ``magnetic'' variant without breaking stochasticity.
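The abstract's construction can be illustrated with a minimal NumPy sketch. This is one plausible reading under stated assumptions, not the paper's exact definitions: we take the "twisted Hadamard product" to be the entrywise product of the attention matrix with its transpose (which makes the second-order affinity symmetric, as claimed), and we model the phase contribution of $A$ as an entrywise factor $e^{iA}$ applied outside the softmax.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# Split a generic real matrix into symmetric and skew-symmetric parts:
# M = M^T encodes geometry (real logits); A = -A^T encodes directionality.
X = rng.normal(size=(n, n))
M = (X + X.T) / 2
A = (X - X.T) / 2

def softmax_rows(L):
    # Numerically stable row-wise softmax.
    E = np.exp(L - L.max(axis=1, keepdims=True))
    return E / E.sum(axis=1, keepdims=True)

# First-order attention: softmax over the real logits from M only.
P = softmax_rows(M)                # row-stochastic

# Second-order affinity: entrywise (Hadamard) product of P with its
# transpose -- our assumed reading of the twisted Hadamard product /
# Product-of-Experts identity. The result is symmetric by construction.
K = P * P.T

# Magnetic variant: phases from A sit OUTSIDE the softmax, so entrywise
# magnitudes (and hence row sums of |entries|) are unchanged -- the
# U(1) phase factor does not break stochasticity.
P_mag = np.exp(1j * A) * P
```

Because the phases have unit modulus, `np.abs(P_mag)` equals `P` entrywise, so any diffusion-map-style spectral analysis of the magnitudes is unaffected by the directional part.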
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 22024