Abstract: Graph databases (GDBs) enable processing and analysis of unstructured, complex,
rich, and usually vast graph datasets. Despite the great significance of GDBs
in both academia and industry, little effort has been made to integrate
them with the predictive power of graph neural networks (GNNs). In this work,
we show how to seamlessly combine nearly any GNN model with the computational
capabilities of GDBs. To this end, we observe that the majority of these systems
are based on a graph data model called the Labeled Property Graph (LPG), where
vertices and edges can have arbitrarily complex sets of labels and properties.
We then develop LPG2vec, an encoder that transforms an arbitrary LPG dataset
into a representation that can be directly used with a broad class of GNNs,
including convolutional, attentional, message-passing, and even higher-order or
spectral models. In our evaluation, we show that the rich information
represented as LPG labels and properties is properly preserved by LPG2vec, and
that it increases prediction accuracy by up to 34% compared to graphs with no
LPG labels/properties, regardless of the targeted learning task or the GNN
model used. In general, LPG2vec enables combining the predictive power of
the most powerful GNNs with the full scope of information encoded in the LPG
model, paving the way for neural graph databases, a class of systems where the
vast complexity of maintained data will benefit from modern and future graph
machine learning methods.
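The abstract's core idea, turning the labels and properties attached to each LPG vertex into fixed-size feature vectors that any GNN can consume, can be illustrated with a minimal sketch. The toy graph, the one-hot label encoding, and the zero-fill for missing properties below are illustrative assumptions, not the paper's exact LPG2vec implementation.

```python
# A minimal sketch of an LPG2vec-style encoding, based only on the abstract's
# description: per-vertex labels and properties become a node feature vector.
# The toy graph and encoding choices here are illustrative assumptions.
import numpy as np

# Toy LPG: each vertex carries a set of labels and a dict of properties.
vertices = {
    0: {"labels": {"Person"},          "props": {"age": 34}},
    1: {"labels": {"Person", "Admin"}, "props": {"age": 51}},
    2: {"labels": {"Post"},            "props": {"length": 120}},
}

# Fixed vocabularies derived from the dataset (assumed preprocessing step).
label_vocab = sorted({l for v in vertices.values() for l in v["labels"]})
prop_keys = sorted({k for v in vertices.values() for k in v["props"]})

def encode_vertex(v):
    """One-hot encode the label set; append numeric properties (0 if absent)."""
    label_vec = np.array([1.0 if l in v["labels"] else 0.0 for l in label_vocab])
    prop_vec = np.array([float(v["props"].get(k, 0.0)) for k in prop_keys])
    return np.concatenate([label_vec, prop_vec])

# Stack per-vertex encodings into the node feature matrix X that a GNN
# (convolutional, attentional, message-passing, ...) would take as input.
X = np.stack([encode_vertex(vertices[i]) for i in sorted(vertices)])
print(X.shape)  # (3, len(label_vocab) + len(prop_keys))
```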
Type Of Submission: Full paper proceedings track submission (max 9 main pages).
TL;DR: We illustrate how to harness the predictive power of graph neural networks in the context of graph databases based on the Labeled Property Graph data model, which is used by the majority of leading graph database systems in industry.