Betti numbers of attention graphs is all you really need

Anonymous

10 Oct 2020 (modified: 05 May 2023) · Submitted to TDA & Beyond 2020
Keywords: attention graphs, attention-based model, self-attention, attention topology, BERT, persistent homology, betti numbers, linguistic acceptability, sentiment analysis, spam detection
TL;DR: We apply topological analysis to attention graphs computed from the attention heads of the BERT model
Abstract: We apply methods of topological analysis to attention graphs computed from the attention heads of the BERT model (Devlin et al. (2019)). Our research shows that a classifier built upon basic persistent topological features (namely, Betti numbers) of the trained neural network can achieve classification results on par with conventional classification methods. We demonstrate the relevance of this topological text representation on three text classification benchmarks. To the best of our knowledge, this is the first attempt to analyze the topology of an attention-based neural network of the kind widely used for Natural Language Processing.
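
The submission does not include code here, but a minimal sketch of the kind of feature the abstract describes could look as follows: a single attention head's weight matrix is thresholded into an undirected graph, and the Betti numbers b0 (connected components) and b1 (independent cycles) of that graph are used as features. The symmetrization step, the threshold value, and the toy input below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
import networkx as nx

def betti_numbers(attention: np.ndarray, threshold: float = 0.1):
    """Compute Betti numbers (b0, b1) of the graph obtained by
    thresholding one attention head's weight matrix.

    attention : (seq_len, seq_len) attention weights for a single head.
    threshold : edges are kept where max(a_ij, a_ji) >= threshold
                (symmetrization and threshold value are assumptions
                 made for this sketch, not the paper's exact recipe).
    """
    # Symmetrize and threshold the attention matrix to get an undirected graph.
    sym = np.maximum(attention, attention.T)
    adj = sym >= threshold
    np.fill_diagonal(adj, False)  # ignore self-attention loops

    g = nx.from_numpy_array(adj.astype(int))

    # b0: number of connected components.
    b0 = nx.number_connected_components(g)
    # For a graph (1-dimensional complex): b1 = |E| - |V| + b0.
    b1 = g.number_of_edges() - g.number_of_nodes() + b0
    return b0, b1

# Toy usage with a random row-stochastic matrix standing in for real BERT weights.
rng = np.random.default_rng(0)
toy_attention = rng.dirichlet(np.ones(8), size=8)  # rows sum to 1, like softmax
print(betti_numbers(toy_attention, threshold=0.1))
```

In practice one would extract the attention matrices for every layer and head of the trained BERT model and concatenate the resulting (b0, b1) values into a feature vector for the downstream classifier.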