- Keywords: deep learning, relational model, knowledge graph, exchangeability, equivariance
- TL;DR: We propose a feed-forward layer that is informed by the ER model of relational data and show that it is the most expressive linear layer possible under the given equivariance constraints.
- Abstract: Due to its extensive use in databases, the relational model is ubiquitous in representing big data. However, recent progress in deep learning with relational data has focused on (knowledge) graphs. In this paper we propose Equivariant Entity-Relationship Networks, the class of parameter-sharing neural networks derived from the entity-relationship model. We prove that our proposed feed-forward layer is the most expressive linear layer under the given equivariance constraints, and subsumes recently introduced equivariant models for sets, exchangeable tensors, and graphs. The proposed feed-forward layer has linear complexity in the data and can be used for both inductive and transductive reasoning about relational databases, including database embedding and the prediction of missing records. This provides a principled theoretical foundation for the application of deep learning to one of the most abundant forms of data.
- Code: https://anonymous.4open.science/r/10bd3f10-0c97-42ac-b2f1-d21a838b007f/
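To make the parameter-sharing idea concrete, below is a minimal sketch of the simplest special case the abstract says the ER layer subsumes: a permutation-equivariant linear layer over a set of exchangeable entities. The function and parameter names here are illustrative choices, not the paper's API; the layer's parameter count is independent of the number of entities, and applying it is linear in the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch (not the paper's code): a permutation-equivariant
# linear layer for n exchangeable entities with d features each.
# Weights are shared across entities, so there are O(d * d_out) parameters
# regardless of n, and applying the layer costs O(n * d * d_out).
def equivariant_layer(X, W_elem, W_pool):
    # Element-wise term: every row is transformed by the same W_elem.
    # Pooled term: the mean over rows, broadcast back to every row.
    return X @ W_elem + X.mean(axis=0, keepdims=True) @ W_pool

n, d, d_out = 5, 3, 4
X = rng.normal(size=(n, d))
W_elem = rng.normal(size=(d, d_out))
W_pool = rng.normal(size=(d, d_out))

# Equivariance check: permuting the input rows permutes the output rows.
P = np.eye(n)[rng.permutation(n)]
assert np.allclose(equivariant_layer(P @ X, W_elem, W_pool),
                   P @ equivariant_layer(X, W_elem, W_pool))
```

The full ER layer generalizes this pattern: one shared weight per orbit of the symmetry group induced by the entity-relationship schema, with pooling and broadcasting along the corresponding entity axes.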