Towards Flexible, Efficient, and Effective Tensor Product Networks

Published: 28 Oct 2023, Last Modified: 21 Dec 2023, NeurIPS 2023 GLFrontiers Workshop Poster
Keywords: graph neural network, geometric GNN, equivariant, scalarization, tensor product, efficient GNN, pruning
TL;DR: We bridge the gap between scalarization and tensor product networks and offer guidance for designing tensor product interactions between steerable components.
Abstract: Geometric graph neural networks have showcased exceptional performance in modelling geometric data. These models rely heavily on equivariant operations, encompassing vital techniques such as scalarization and the Clebsch-Gordan tensor product. However, tensor-product-based architectures face substantial computational challenges as the representation order increases, significantly limiting their versatility. Moreover, the interactions between steerable components remain difficult to interpret. In contrast, scalarization methods benefit from cost-efficient invariant scalar operations while still being capable of outperforming certain tensor-product-based models. To bridge the gap between these approaches, we introduce a conceptual framework that emphasizes the potential flexibility in designing tensor product networks. To provide guidance for efficient framework design and gain deeper insights into steerable components, we conduct a preliminary investigation by pruning tensor product interactions. This approach enables us to directly assess the redundancy and significance of steerable components, paving the way for efficient and effective designs.
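Code sketch: To make the abstract's notion of "pruning tensor product interactions" concrete, below is a minimal sketch (not the authors' implementation) of how individual Clebsch-Gordan paths between steerable components can be enumerated and selectively removed, assuming the open-source e3nn library. The specific irreps, the multiplicity of 8, and the pruning criterion (dropping every path that writes to an order-2 output) are illustrative assumptions, not the paper's actual configuration.

```python
# A minimal sketch of pruning Clebsch-Gordan tensor product paths with e3nn.
# The irreps and the pruning rule below are illustrative assumptions.
from e3nn import o3

irreps_in1 = o3.Irreps("8x0e + 8x1o + 8x2e")  # node features up to order l=2
irreps_in2 = o3.Irreps("1x0e + 1x1o + 1x2e")  # e.g. spherical harmonics of an edge
irreps_out = o3.Irreps("8x0e + 8x1o + 8x2e")

# Enumerate every symmetry-allowed Clebsch-Gordan path (l1, l2) -> l_out.
full = o3.FullyConnectedTensorProduct(irreps_in1, irreps_in2, irreps_out)
print(f"full: {len(full.instructions)} paths, {full.weight_numel} weights")

# "Pruning" = keeping only a subset of paths. Here we drop all interactions
# that produce order l=2 outputs (a hypothetical criterion for illustration).
kept = [
    (ins.i_in1, ins.i_in2, ins.i_out, ins.connection_mode, ins.has_weight)
    for ins in full.instructions
    if irreps_out[ins.i_out].ir.l < 2
]
pruned = o3.TensorProduct(irreps_in1, irreps_in2, irreps_out, kept)
print(f"pruned: {len(pruned.instructions)} paths, {pruned.weight_numel} weights")

x1 = irreps_in1.randn(10, -1)  # features for 10 nodes
x2 = irreps_in2.randn(10, -1)
out = pruned(x1, x2)  # still equivariant; pruned output blocks are zero
```

Because each instruction corresponds to one interaction between steerable components, comparing the weight counts and downstream accuracy of the full and pruned modules is one direct way to assess the redundancy and significance of individual paths.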
Submission Number: 74