Track: Extended Abstract (non-archival, 4 pages)
Keywords: Deep Learning, Algebraically-Informed Deep Networks
TL;DR: AIDN is a deep learning algorithm that represents finitely presented algebraic objects (groups, associative algebras, Lie algebras) by sets of deep neural networks satisfying the defining relations of the presentation.
Abstract: Building learning systems that can automatically uncover underlying mathematical laws from observed data is one of the central problems at the interface of deep learning and mathematics. In this work, we take a step toward bridging algebraic structures and deep learning, and introduce \textbf{AIDN}, \textit{Algebraically-Informed Deep Networks}. \textbf{AIDN} is a deep learning algorithm that represents any finitely presented algebraic object with a set of deep neural networks. The deep networks obtained via \textbf{AIDN} are \textit{algebraically-informed} in the sense that they satisfy the algebraic relations of the presentation of the algebraic structure that serves as input to the algorithm. Our proposed approach can robustly compute linear and non-linear representations of most finitely presented algebraic structures, such as groups, associative algebras, and Lie algebras. We evaluate the approach and demonstrate its applicability to algebraic and geometric objects that are significant in low-dimensional topology. In particular, we study solutions of the Yang-Baxter equations and their applications to braid groups. Further, we study representations of the Temperley-Lieb algebra. Finally, we show, using the Reshetikhin-Turaev construction, how the proposed deep learning approach can be utilized to construct new link invariants.
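The following is a minimal sketch, not the authors' implementation, of the idea the abstract describes: assign a small neural network to each generator of a finitely presented structure and train the networks so that the defining relations hold. The example uses the braid group $B_3$ with generators $\sigma_1, \sigma_2$ and the braid relation $\sigma_1\sigma_2\sigma_1 = \sigma_2\sigma_1\sigma_2$; the architecture, dimension, loss, and optimizer below are illustrative assumptions.

```python
# Hedged sketch of the relation-satisfaction idea; not the paper's code.
import torch
import torch.nn as nn

dim = 4  # dimension of the space the generator networks act on (assumed)

def make_generator_net() -> nn.Module:
    # One small MLP per generator, mapping R^dim -> R^dim.
    return nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

sigma1, sigma2 = make_generator_net(), make_generator_net()
params = list(sigma1.parameters()) + list(sigma2.parameters())
opt = torch.optim.Adam(params, lr=1e-3)

for step in range(2000):
    x = torch.randn(256, dim)  # random sample points
    # Enforce the braid relation as a soft constraint:
    # sigma1(sigma2(sigma1(x))) should equal sigma2(sigma1(sigma2(x))).
    lhs = sigma1(sigma2(sigma1(x)))
    rhs = sigma2(sigma1(sigma2(x)))
    loss = ((lhs - rhs) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

A full treatment would impose all relations of the chosen presentation in the same way; this sketch shows only a single relation as a training objective.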
Supplementary Material: zip
Submission Number: 16