Gaussian Attention Model and Its Application to Knowledge Base Embedding and Question Answering
Liwen Zhang, John Winn, Ryota Tomioka
Nov 04, 2016 (modified: Nov 30, 2016) · ICLR 2017 conference submission · readers: everyone
Abstract: We propose the Gaussian attention model for content-based neural memory
access. The proposed attention model gives a neural network an
additional degree of freedom to control the focus of its attention, from
laser-sharp to broad. It is applicable whenever
we can assume that distance in the latent space reflects some notion
of semantics. We use the proposed attention model as a scoring function
for embedding a knowledge base into a continuous vector space and
then train a model that answers questions about the entities
in the knowledge base. The proposed attention model handles both the
propagation of uncertainty when following a series of relations and
the conjunction of conditions in a natural way. On a dataset of soccer
players who participated in the FIFA World Cup 2014, we demonstrate that
our model handles both path queries and conjunctive queries well.
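As a rough illustration of the idea (not the paper's exact formulation), a Gaussian attention score over memory items can be computed as exp(-½ (mᵢ − μ)ᵀ P (mᵢ − μ)), where the precision matrix P controls the focus: a large precision yields sharp attention, a small one yields broad attention. The function name, toy embeddings, and parameter values below are hypothetical.

```python
import numpy as np

def gaussian_attention(memory, mu, precision):
    """Score each memory row m_i with a Gaussian kernel
    exp(-0.5 * (m_i - mu)^T P (m_i - mu)) and normalize.
    A larger precision P (inverse covariance) focuses the
    attention more sharply around mu."""
    diff = memory - mu                              # (n, d)
    # Quadratic form (m_i - mu)^T P (m_i - mu) per row.
    q = np.einsum('nd,de,ne->n', diff, precision, diff)
    scores = np.exp(-0.5 * q)
    return scores / scores.sum()

# Toy 2-d entity embeddings (hypothetical values).
memory = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
mu = np.array([0.1, 0.0])

broad = gaussian_attention(memory, mu, 0.5 * np.eye(2))
sharp = gaussian_attention(memory, mu, 20.0 * np.eye(2))
# With higher precision, weight concentrates on the nearest entity.
```

Because the score is a smooth function of μ and P, such queries remain differentiable, which is what allows knowledge base queries to be trained end to end.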
TL;DR: We make (simple) knowledge base queries differentiable using the Gaussian attention model.
Keywords: Natural language processing, Supervised Learning, Deep learning