Unsupervised Context-Driven Question Answering Based on Link Grammar

Published: 01 Jan 2021, Last Modified: 05 Jun 2025 · AGI 2021 · CC BY-SA 4.0
Abstract: While general conversational intelligence (GCI) can be considered one of the core aspects of artificial general intelligence (AGI), there currently exists minimal overlap between the disciplines of AGI and natural language processing (NLP). Only a few AGI architectures can comprehend and generate natural language, and most NLP systems rely either on hardcoded, specialized rules and frameworks that cannot generalize to the many complex domains of human language, or on heavily trained deep neural network models that cannot be interpreted, controlled, or explained. In this paper, we propose an interpretable “Contextual Generator” architecture for question answering (QA), built as an extension of the recently published “Generator” algorithm for sentence generation, that produces grammatically valid answers to queries structured as lists of seed words. We demonstrate the potential of this architecture to perform automated, closed-domain QA by detailing results on queries from SingularityNET’s “small world” POC-English corpus and from the Stanford Question Answering Dataset. Overall, our work may bring a greater degree of GCI to proto-AGI NLP pipelines. The proposed QA architecture is open-source and can be found on GitHub under the MIT License at https://github.com/aigents/aigents-java-nlp.
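To make the query format described in the abstract concrete, the sketch below illustrates what "a query structured as a list of seed words" might look like: a question is stripped down to its content words, and a toy corpus is searched for the sentence that overlaps most with those seeds. This is a minimal illustration under our own assumptions; the class name, stop-word list, toy corpus, and overlap heuristic are hypothetical and do not mirror the aigents-java-nlp API, and the actual Contextual Generator composes a grammatically valid answer via Link Grammar rather than retrieving an existing sentence.

```java
import java.util.*;

// Illustrative sketch only: reduce a question to content-word "seeds" and
// retrieve the toy-corpus sentence with the greatest seed overlap.
// Names and heuristics here are hypothetical, not the paper's algorithm.
public class SeedQueryDemo {

    private static final Set<String> STOP_WORDS = new HashSet<>(Arrays.asList(
            "a", "an", "the", "is", "are", "does", "do", "what", "who", "where"));

    // Turn a natural-language question into a list of seed words.
    static List<String> toSeeds(String question) {
        List<String> seeds = new ArrayList<>();
        for (String token : question.toLowerCase().replaceAll("[^a-z ]", "").split("\\s+")) {
            if (!token.isEmpty() && !STOP_WORDS.contains(token)) {
                seeds.add(token);
            }
        }
        return seeds;
    }

    // Pick the corpus sentence sharing the most seeds with the query.
    static String answer(List<String> seeds, List<String> corpus) {
        String best = "";
        int bestScore = -1;
        for (String sentence : corpus) {
            Set<String> words = new HashSet<>(Arrays.asList(
                    sentence.toLowerCase().replaceAll("[^a-z ]", "").split("\\s+")));
            int score = 0;
            for (String seed : seeds) {
                if (words.contains(seed)) score++;
            }
            if (score > bestScore) {
                bestScore = score;
                best = sentence;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Toy sentences, loosely in the style of a small closed-domain corpus.
        List<String> corpus = Arrays.asList(
                "Mom is a parent.",
                "Dad likes board games.",
                "Cake is a food.");
        List<String> seeds = toSeeds("What is cake?");          // -> [cake]
        System.out.println("Seeds:  " + seeds);
        System.out.println("Answer: " + answer(seeds, corpus)); // -> "Cake is a food."
    }
}
```

In the paper's setting, the retrieval step above would instead be a generation step: the seed words constrain a Link Grammar-driven sentence generator so that the returned answer is assembled to be grammatically valid, not merely copied from the corpus.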