Fast Tree-Field Integrators: From Low Displacement Rank to Topological Transformers

Published: 25 Sept 2024, Last Modified: 06 Nov 2024, NeurIPS 2024 poster, CC BY 4.0
Keywords: Tree Metrics, Low Displacement Rank, Field Integrators, Topological Transformers, Graph Theory, Efficient Algorithms on Graphs
TL;DR: We present a new class of algorithms for the efficient integration of tensor fields defined on weighted trees, with several applications in ML, ranging from mesh modeling to training Topological Transformers.
Abstract: We present a new class of fast polylog-linear algorithms based on the theory of structured matrices (in particular *low displacement rank*) for integrating tensor fields defined on weighted trees. Several applications of the resulting *fast tree-field integrators* (FTFIs) are presented, including: (a) approximation of graph metrics with tree metrics, (b) graph classification, (c) modeling on meshes, and finally (d) *Topological Transformers* (TTs) (Choromanski et al., 2022) for images. For Topological Transformers, we propose new relative position encoding (RPE) masking mechanisms with as few as **three** extra learnable parameters per Transformer layer, leading to **1.0-1.5\%+** accuracy gains. Importantly, most FTFIs are **exact** methods and thus numerically equivalent to their brute-force counterparts. When applied to graphs with thousands of nodes, these exact algorithms provide **5.7-13x** speedups. We also provide an extensive theoretical analysis of our methods.
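To make the target computation concrete, below is a minimal brute-force sketch of tree-field integration, i.e., the O(N^2) baseline that the abstract says FTFIs match exactly while running in polylog-linear time. It assumes the integration takes the form of applying a kernel of the weighted tree distance to node features; the function name, the exponential kernel, and the `networkx` usage are illustrative choices, not details from the paper.

```python
import numpy as np
import networkx as nx

def brute_force_tree_field_integration(tree, features, kernel=np.exp):
    """O(N^2) baseline: for every node i, aggregate the features of all nodes j
    weighted by kernel(-d_T(i, j)), where d_T is the weighted tree distance.
    FTFIs are described as computing the same quantity in polylog-linear time;
    this sketch only illustrates the output they would reproduce."""
    nodes = list(tree.nodes())
    idx = {v: k for k, v in enumerate(nodes)}
    n = len(nodes)
    # Pairwise weighted shortest-path distances; on a tree each pair has a unique path.
    dist = np.zeros((n, n))
    for src, lengths in nx.shortest_path_length(tree, weight="weight"):
        for dst, d in lengths.items():
            dist[idx[src], idx[dst]] = d
    # Kernelized distance matrix applied to the tensor field (node features).
    return kernel(-dist) @ features

# Toy usage: a small weighted tree with 3-dimensional node features.
T = nx.Graph()
T.add_weighted_edges_from([(0, 1, 0.5), (1, 2, 1.2), (1, 3, 0.7)])
F = np.random.randn(T.number_of_nodes(), 3)
out = brute_force_tree_field_integration(T, F)
print(out.shape)  # (4, 3)
```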
Primary Area: Machine learning for other sciences and fields
Submission Number: 12833