Reducing Learning on Cell Complexes to Graphs

Published: 25 Mar 2022, Last Modified: 05 May 2023
GTRL 2022 Poster
Keywords: graph neural networks, expressiveness, cell complexes, message passing, Weisfeiler Leman
TL;DR: Message passing on regular cell complexes can be done via message passing on graphs without losing expressiveness
Abstract: Message passing graph neural networks (GNNs) are known to have limited expressiveness in distinguishing graphs. A recent approach tackles this problem by transforming graphs into regular cell complexes. This makes it possible to model higher-order structures and yields algorithms that are more powerful than the Weisfeiler Leman test (WL) or GNNs. However, this approach cannot easily be combined with previous graph algorithms and implementations because of their fundamental differences. We develop Cell Encoding, a novel approach for transforming regular cell complexes into graphs. We show that Cell Encoding combined with WL, or with a suitably expressive GNN, is at least as expressive as Cellular Weisfeiler Leman (CWL) in distinguishing cell complexes. This means that with a simple preprocessing step, one can use any GNN for learning tasks on cell complexes. Additionally, we show that this approach can make GNNs more expressive and yield better results on graph classification datasets.
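To make the kind of preprocessing described above concrete, here is a minimal sketch of one way a regular cell complex could be turned into an ordinary graph: every cell (vertex, edge, face, ...) becomes a graph node, each boundary-incidence relation becomes a graph edge, and the cell's dimension is kept as a node feature. This is an illustrative construction under assumed data structures, not the paper's actual Cell Encoding; the function name and input format are hypothetical.

```python
# Hypothetical sketch: encode a regular cell complex as a plain graph.
# Each cell becomes a node; an edge connects a k-cell to each of its
# boundary (k-1)-cells. The cell dimension is kept as a node feature so
# a downstream GNN can distinguish cell types. This is NOT the paper's
# Cell Encoding construction, just an illustration of the general idea.

def cell_complex_to_graph(cells):
    """cells: dict mapping cell id -> (dimension, tuple of boundary cell ids).
    Returns (node_features, edge_list) for an undirected graph."""
    node_features = {cid: dim for cid, (dim, _) in cells.items()}
    edge_list = []
    for cid, (_, boundary) in cells.items():
        for bid in boundary:
            edge_list.append((cid, bid))  # incidence edge: cell <-> boundary cell
    return node_features, edge_list

# Example: a filled triangle has three 0-cells (vertices),
# three 1-cells (edges), and one 2-cell (the face).
triangle = {
    "v0": (0, ()), "v1": (0, ()), "v2": (0, ()),
    "e01": (1, ("v0", "v1")), "e12": (1, ("v1", "v2")), "e02": (1, ("v0", "v2")),
    "f": (2, ("e01", "e12", "e02")),
}
feats, edges = cell_complex_to_graph(triangle)
```

The resulting node features and edge list can be fed to any standard GNN library, which is the point of reducing cell complexes to graphs: no cell-complex-specific message-passing machinery is required at training time.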