Neural Logic Analogy Learning

Published: 25 Mar 2022, Last Modified: 23 May 2023 · ICLR 2022 PAIR^2Struct Poster
Keywords: Neural Logic Reasoning, Analogy Learning, Letter-String Analogy
TL;DR: A neural logic reasoning model for the letter-string analogy learning problem
Abstract: The letter-string analogy is an important analogy learning task that seems easy for humans but is very challenging for machines. The main idea behind current approaches to solving letter-string analogies is to design heuristic rules for extracting analogy structures and constructing analogy mappings. However, one key problem is that it is difficult to build a comprehensive and exhaustive set of analogy structures that can fully describe the subtlety of analogies. As a result, current approaches are unable to handle complicated letter-string analogy problems. In this paper, we propose Neural lOgic ANalogy learning (Noan), a dynamic neural architecture driven by differentiable logic reasoning for solving analogy problems. Each analogy problem is converted into logical expressions consisting of logical variables and basic logical operations (AND, OR, and NOT). More specifically, Noan learns the logical variables as vector embeddings and learns each logical operation as a neural module. In this way, the model builds a computational graph that integrates neural networks with logical reasoning to capture the internal logical structure of the input letter strings. The analogy learning problem then becomes a True/False evaluation problem over the logical expressions. Experiments show that our machine learning-based Noan approach outperforms state-of-the-art approaches on standard letter-string analogy benchmark datasets.
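To make the abstract's description concrete, the sketch below illustrates the general pattern of learning logical variables as embeddings and logical operations as neural modules, with a True/False score computed against a learned anchor vector. This is a minimal illustration only, not the authors' implementation: the class names (`LogicOp`, `Noan`), the MLP architecture, the embedding dimension, and the cosine-similarity truth evaluation are all assumptions introduced here for exposition.

```python
import torch
import torch.nn as nn


class LogicOp(nn.Module):
    """Hypothetical neural module standing in for one logical operation."""

    def __init__(self, dim, arity):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(arity * dim, dim),
            nn.ReLU(),
            nn.Linear(dim, dim),
        )

    def forward(self, *operands):
        # Concatenate operand embeddings and map them to a new embedding.
        return self.net(torch.cat(operands, dim=-1))


class Noan(nn.Module):
    """Sketch: logical variables as embeddings, AND/OR/NOT as neural modules.

    Assumed structure for illustration; the paper's actual architecture may differ.
    """

    def __init__(self, num_vars, dim=64):
        super().__init__()
        self.vars = nn.Embedding(num_vars, dim)      # logical variables as vectors
        self.AND = LogicOp(dim, arity=2)
        self.OR = LogicOp(dim, arity=2)
        self.NOT = LogicOp(dim, arity=1)
        self.true_vec = nn.Parameter(torch.randn(dim))  # anchor for True/False evaluation

    def truth(self, expr_vec):
        # Similarity to the learned "True" anchor serves as the truth score.
        return torch.cosine_similarity(expr_vec, self.true_vec, dim=-1)


# Usage sketch: evaluate the expression (a AND (NOT b)) OR c.
model = Noan(num_vars=3)
a, b, c = model.vars(torch.tensor([0, 1, 2]))
score = model.truth(model.OR(model.AND(a, model.NOT(b)), c))
```

Under this reading, an analogy instance would be compiled into such an expression over variable embeddings, and training would push the truth score toward 1 for valid analogies and toward 0 otherwise.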