GA-ReLU: an activation function for Geometric Algebra Networks applied to 2D Navier-Stokes PDEs

Published: 03 Mar 2024, Last Modified: 30 Apr 2024 · AI4DiffEqtnsInSci @ ICLR 2024 Poster · CC BY 4.0
Keywords: geometric deep learning, Navier-Stokes, Clifford Algebra, activation function
TL;DR: We adapt ReLU to Clifford multivectors so that the geometric structure of Clifford networks is preserved when non-linearities are applied.
Abstract: Many differential equations describing physical phenomena are intrinsically geometric in nature. It has been demonstrated that this geometric structure of the data can be captured effectively by networks formulated in Geometric Algebra (GA) that operate on multivectors, making them suitable candidates for solving differential equations. GA networks, however, are still largely uncharted territory. In this paper we focus on non-linearities, since applying them to multivectors is not a trivial task: they are generally applied in a point-wise fashion to each real-valued component of a multivector. This approach discards interactions between different elements of the multivector input and compromises the geometric nature of GA networks. To bridge this gap, we propose GA-ReLU, a GA-based counterpart of the rectified linear unit (ReLU), and show how it can improve the solution of Navier-Stokes PDEs.
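The contrast the abstract draws can be illustrated with a minimal sketch. The snippet below assumes multivectors of the 2D geometric algebra are stored as arrays [scalar, e1, e2, e12]; the point-wise ReLU is the baseline the abstract critiques, while the gated variant is only a hypothetical example of a multivector-level non-linearity, not the paper's actual GA-ReLU definition.

```python
import numpy as np

def pointwise_relu(mv: np.ndarray) -> np.ndarray:
    """Baseline: ReLU applied independently to each real-valued component.
    This treats the multivector as a flat vector and discards interactions
    between its components."""
    return np.maximum(mv, 0.0)

def gated_relu(mv: np.ndarray) -> np.ndarray:
    """Hypothetical geometry-aware variant (illustrative assumption only):
    gate the whole multivector by a single scalar criterion (here, the sign
    of its scalar part), so all components are kept or suppressed together."""
    gate = (mv[..., 0:1] > 0.0).astype(mv.dtype)
    return mv * gate

# Example: a batch of two multivectors with components [scalar, e1, e2, e12]
x = np.array([[0.5, -1.0, 2.0, -0.3],
              [-0.2, 0.7, -0.4, 1.1]])
print(pointwise_relu(x))  # zeroes negative components independently
print(gated_relu(x))      # keeps or drops each multivector as a whole
```

The point of the contrast is that the gated variant preserves the relative structure among a multivector's components, whereas the point-wise version can zero out individual components and distort the encoded geometric object.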
Submission Number: 57