A Neural-Symbolic Approach to Natural Language Tasks
Qiuyuan Huang, Paul Smolensky, Xiaodong He, Li Deng, Dapeng Wu
Feb 15, 2018 (modified: Feb 15, 2018) · ICLR 2018 Conference Blind Submission
Abstract: Deep learning (DL) has in recent years been widely used in natural
language processing (NLP) applications due to its superior
performance. However, while natural languages are rich in
grammatical structure, DL has not been able to explicitly
represent and enforce such structures. This paper proposes a new
architecture to bridge this gap by exploiting tensor product
representations (TPR), a structured neural-symbolic framework
developed in cognitive science over the past 20 years, with the
aim of integrating DL with explicit language structures and rules.
We call it the Tensor Product Generation Network
(TPGN), and apply it to image captioning. The key
ideas of TPGN are: 1) unsupervised learning of
role-unbinding vectors of words via a TPR-based deep neural
network, and 2) integration of TPR with typical DL architectures
including Long Short-Term Memory (LSTM) models. The novelty of our
approach lies in its ability to generate a sentence and extract
partial grammatical structure of the sentence by using
role-unbinding vectors, which are obtained in an unsupervised
manner. Experimental results demonstrate the effectiveness of the proposed approach.
TL;DR: This paper develops a tensor product representation approach for deep-learning-based natural language processing applications.
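The binding and role-unbinding mechanism of tensor product representations that the abstract refers to can be illustrated with a minimal sketch. This is not the paper's TPGN architecture; it only shows the underlying TPR idea: a structure is encoded as a sum of filler-role outer products, and each filler can be recovered by multiplying with its role vector (exactly so when the roles are orthonormal). All dimensions and names here are illustrative assumptions.

```python
import numpy as np

# Illustrative toy dimensions (not from the paper):
# d_f = filler dimension, d_r = role dimension, n = number of bound symbols.
d_f, d_r, n = 4, 3, 3

rng = np.random.default_rng(0)

# Orthonormal role vectors (columns), so unbinding is exact: r_i . r_j = delta_ij.
roles = np.linalg.qr(rng.standard_normal((d_r, d_r)))[0][:, :n]

# Filler vectors: column i is the filler bound to role i.
fillers = rng.standard_normal((d_f, n))

# Binding: S = sum_i f_i (outer) r_i, a single d_f x d_r matrix
# representing the whole structure.
S = fillers @ roles.T

# Unbinding: multiplying S by role vector i recovers filler i.
recovered = S @ roles
assert np.allclose(recovered, fillers)
```

With orthonormal roles the recovery is exact; with merely linearly independent roles one would unbind with the dual basis instead. TPGN learns role-unbinding vectors of this kind in an unsupervised way while generating words with an LSTM.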