Keywords: graph embedding, probabilistic model, Bayesian nonparametrics, representation learning
TL;DR: A novel Bayesian nonparametric embedding model that learns multiple representations per node in graph data
Abstract: In graph data, each node often serves multiple functionalities. However, most graph embedding models assume that each node has only one representation. We address this issue by proposing a nonparametric graph embedding model. The model allows each node to learn multiple representations wherever they are needed to capture the complexity of random walks over the graph. It extends the exponential family graph embedding model with two nonparametric priors: the Dirichlet process and the uniform process. The model combines the ability of exponential family graph embeddings to account for the number of occurrences of context nodes with nonparametric priors that give it the flexibility to learn more than one latent representation per node. The learned embeddings outperform other state-of-the-art approaches on node classification and link prediction tasks.
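To make the mechanism concrete, below is a minimal sketch of the core idea, not the authors' implementation: it assumes a Chinese Restaurant Process formulation of the Dirichlet process prior and a Poisson link standing in for the exponential-family likelihood on context-occurrence counts. All variable names, dimensions, and hyperparameters are illustrative assumptions. For each random-walk occurrence of a node, the prior trades off reusing one of the node's existing representations against instantiating a new one.

```python
# Illustrative sketch (not the paper's code): a node keeps a growing set of
# embedding vectors, and a Chinese Restaurant Process (Dirichlet-process
# prior) decides, for each random-walk occurrence of the node, whether to
# reuse an existing representation or open a new one. A Poisson link on
# context-occurrence counts stands in for the exponential-family likelihood.
# Dimensions, hyperparameters, and names below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
DIM, ALPHA = 16, 1.0                                    # embedding size, DP concentration
context_vecs = rng.normal(scale=0.1, size=(100, DIM))   # shared context embeddings

def poisson_loglik(rep, context_counts):
    """Log-likelihood of observed context counts under a Poisson link."""
    ll = 0.0
    for ctx, count in context_counts.items():
        rate = np.exp(rep @ context_vecs[ctx])
        ll += count * np.log(rate) - rate
    return ll

def assign_representation(reps, usage, context_counts):
    """One CRP step: pick an existing representation or create a new one."""
    new_rep = rng.normal(scale=0.1, size=DIM)           # draw from the base measure
    scores = [np.log(n) + poisson_loglik(r, context_counts)
              for r, n in zip(reps, usage)]
    scores.append(np.log(ALPHA) + poisson_loglik(new_rep, context_counts))
    scores = np.asarray(scores)
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    choice = rng.choice(len(probs), p=probs)
    if choice == len(reps):                             # open a new "table"
        reps.append(new_rep)
        usage.append(1)
    else:
        usage[choice] += 1
    return choice

# Usage: one node observed in two random-walk windows with different contexts.
reps, usage = [], []
walk_windows = [{3: 2, 7: 1}, {42: 3, 55: 1}]           # context id -> occurrence count
for window in walk_windows:
    k = assign_representation(reps, usage, window)
    print(f"window assigned to representation {k}; node now has {len(reps)} representation(s)")
```

Under the uniform-process variant described in the abstract, the reuse score for existing representations would presumably use a constant weight per representation rather than its usage count, which is the main modelling difference between the two priors.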
Supplementary Material: zip