Abstract: We present a formal language with expressions denoting general symbol structures and queries that access information in those structures. A sequence-to-sequence network processing this language learns to encode symbol structures and query them. The learned representation (approximately) shares a simple linearity property with theoretical techniques for performing this task.
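The linearity property the abstract alludes to can be illustrated with a tensor-product representation (a classical theoretical technique for this task; whether it is the exact one the paper compares against is an assumption here): a structure's embedding is the sum of role-filler bindings, and a query unbinds a filler with its role vector. A minimal sketch, with arbitrary illustrative vectors:

```python
# Hypothetical sketch (not the paper's code): tensor-product-style
# embedding where the structure's representation is a LINEAR sum of
# role-filler bindings, and querying is unbinding with a role vector.

def outer(r, f):
    # Outer product r f^T as a nested list (the binding of role r to filler f).
    return [[ri * fj for fj in f] for ri in r]

def add(a, b):
    # Elementwise sum of two bindings: superposition of the structure.
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def unbind(T, r):
    # r^T T recovers the filler bound to role r when roles are orthonormal.
    return [sum(r[i] * T[i][j] for i in range(len(r)))
            for j in range(len(T[0]))]

# Orthonormal role vectors for two structural positions (standard basis).
r1, r2 = [1.0, 0.0], [0.0, 1.0]
# Arbitrary filler embeddings standing in for two symbols.
f_a, f_b = [0.3, 0.9, 0.1], [0.7, 0.2, 0.5]

# Embedding of the pair (a, b) is the sum of its two bindings.
T = add(outer(r1, f_a), outer(r2, f_b))
assert unbind(T, r1) == f_a  # query position 1 -> symbol a
assert unbind(T, r2) == f_b  # query position 2 -> symbol b
```

With exactly orthonormal roles the unbinding is exact; a learned encoder would satisfy this only approximately, which matches the abstract's hedged "(approximately) shares".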
TL;DR: We present an expression language for representing and querying structured symbolic information, and use it to train an encoder-decoder network that represents the expressions as fixed-size, decodable embedding vectors.
Keywords: Knowledge, Representation, Embedding, Symbolic Structure, Encoder Decoder, Deep Learning, Vector Space