Covariant Compositional Networks For Learning Graphs

11 Feb 2018 (modified: 23 Jan 2023) · ICLR 2018 Workshop Submission
Keywords: graph neural networks, message passing, label propagation, high order representation
TL;DR: A general framework for creating covariant graph neural networks
Abstract: Most existing neural networks for learning graphs address permutation invariance by conceiving of the network as a message passing scheme, where each node sums the feature vectors coming from its neighbors. We argue that this limits their representation power, and instead propose a new general architecture for representing objects consisting of a hierarchy of parts, which we call Covariant Compositional Networks (CCNs). Here covariance means that the activation of each neuron must transform in a specific way under permutations, similarly to steerability in CNNs. We achieve covariance by making each activation transform according to a tensor representation of the permutation group, and derive the corresponding tensor aggregation rules that each neuron must implement. Experiments show that CCNs can outperform competing methods on some standard graph learning benchmarks.
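To make the invariance/covariance distinction concrete, here is a minimal NumPy sketch, not the paper's implementation (the released GraphFlow code is that), with illustrative function names. It contrasts the standard sum-based aggregation the abstract critiques with a first-order covariant aggregation in the spirit of CCNs: instead of collapsing neighbor features into a single order-independent vector, each node keeps a matrix of feature vectors indexed by its receptive field, so relabeling the nodes permutes the rows of the activation rather than leaving it unchanged.

```python
import numpy as np

def sum_message_passing(A, F):
    """Standard invariant aggregation: each node sums the feature
    vectors of its neighbors. The sum discards which neighbor
    contributed what, which is the limitation the abstract points to."""
    return A @ F  # shape (n, d); row i = sum of node i's neighbors' features

def first_order_covariant(A, F):
    """First-order covariant aggregation (illustrative sketch, not the
    paper's exact rule): node i's activation is the stack of feature
    vectors over its receptive field (here, i and its neighbors), one
    row per node. Relabeling the graph permutes these rows instead of
    leaving the activation unchanged: the activation is covariant."""
    n = A.shape[0]
    activations = []
    for i in range(n):
        field = np.sort(np.append(np.flatnonzero(A[i]), i))  # receptive field of i
        activations.append(F[field])  # (|field|, d) matrix-valued activation
    return activations

# Toy example: the path graph 0 - 1 - 2.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
F = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])

print(sum_message_passing(A, F))       # node 1's row: features of 0 and 2 summed
print(first_order_covariant(A, F)[1])  # node 1 keeps a (3, 2) matrix instead
```

Higher-order CCN activations generalize this first-order case: activations become tensors with several node indices that transform under the corresponding tensor representation of the permutation group, which is what the paper's tensor aggregation rules are derived for.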
Code: [HyTruongSon/GraphFlow](https://github.com/HyTruongSon/GraphFlow) + [1 community implementation](https://paperswithcode.com/paper/?openreview=S1TgE7WR-)
Data: [MUTAG](https://paperswithcode.com/dataset/mutag), [NCI1](https://paperswithcode.com/dataset/nci1), [PTC](https://paperswithcode.com/dataset/ptc), [QM9](https://paperswithcode.com/dataset/qm9)