PyTorch Geometric High Order: A Unified Library for High Order Graph Neural Network

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Supplementary Material: zip
Primary Area: infrastructure, software libraries, hardware, etc.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: High Order Graph Neural Network, Library
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We introduce PyTorch Geometric High Order (PyGHO), a library designed for High Order Graph Neural Networks (HOGNNs), which pass messages between node tuples rather than nodes.
Abstract: We introduce PyTorch Geometric High Order (PyGHO), a library for High Order Graph Neural Networks (HOGNNs) built upon PyTorch Geometric (PyG). In contrast to ordinary Message Passing Neural Networks (MPNNs), which exchange messages between nodes and are readily implemented in PyG, HOGNNs, encompassing subgraph GNNs and k-WL GNNs, encode node tuples. Such node tuple encoding lacks a universal framework and often necessitates intricate code implementation. The primary objective of PyGHO is to furnish an intuitive and user-friendly interface for various HOGNNs. It integrates streamlined data structures for node tuples, offers comprehensive data preprocessing and mini-batch data loading utilities, presents a versatile framework for high order message propagation, and includes a repertoire of representative high order GNN methods. In this work, we present a detailed overview of the PyGHO library, elucidate its features, and undertake a comparative analysis of existing HOGNNs implemented with PyGHO on real-world tasks.
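To illustrate the distinction the abstract draws between MPNNs and HOGNNs, the sketch below shows one high order message passing step in plain PyTorch. It is a conceptual example, not PyGHO's actual API: node tuples (u, v) carry a representation tensor of shape [n, n, d], and messages are aggregated over the neighbors of one tuple index, whereas an ordinary MPNN would operate on a [n, d] node tensor. The function name and dense-adjacency layout are illustrative assumptions.

```python
import torch

def high_order_message_passing(X: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
    """One conceptual high order message passing step (not PyGHO's API).

    X:   [n, n, d] representations of node tuples (u, v)
    adj: [n, n]    dense adjacency matrix

    Aggregates over neighbors of the second tuple index:
        h'(u, v) = sum_{w in N(v)} h(u, w)
    """
    # Contract the second tuple index against the adjacency matrix.
    return torch.einsum("uwd,wv->uvd", X, adj)

# Toy example: a path graph on 4 nodes.
n, d = 4, 8
adj = torch.tensor([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=torch.float)
X = torch.randn(n, n, d)        # one d-dimensional vector per node tuple
out = high_order_message_passing(X, adj)  # still shape [n, n, d]
```

An ordinary MPNN layer would instead compute `torch.einsum("wd,wv->vd", H, adj)` on a node feature matrix `H` of shape `[n, d]`; the extra tuple dimension is exactly what makes HOGNN implementations more intricate and motivates a dedicated library.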
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 4313