Derivation Prompting: A Logic-Based Method for Improving Retrieval-Augmented Generation

Published: 2024 · Last Modified: 05 Jan 2026 · IBERAMIA 2024 · CC BY-SA 4.0
Abstract: The application of Large Language Models to Question Answering has shown great promise, but important challenges such as hallucinations and erroneous reasoning arise when using these models, particularly in knowledge-intensive, domain-specific tasks. To address these issues, we introduce Derivation Prompting, a novel prompting technique for the generation step of the Retrieval-Augmented Generation (RAG) framework. Inspired by logical derivations, this method derives conclusions from initial hypotheses through the systematic application of predefined rules. It constructs a derivation tree that is interpretable and adds control over the generation process. We applied this method in a specific case study, significantly reducing unacceptable answers compared to traditional RAG and long-context-window methods. (Repository with all prompts: https://github.com/nsuruguay05/derivation-prompting)
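The core idea described in the abstract, deriving conclusions from initial hypotheses by applying predefined rules while recording each step in an interpretable tree, can be sketched without an LLM as plain forward chaining. The sketch below is a minimal illustration under our own assumptions: all rule names, facts, and data structures are invented here for clarity and are not taken from the paper, whose actual prompts are in the linked repository.

```python
# Hypothetical sketch of the "derivation tree" idea: start from retrieved
# hypotheses, forward-chain over predefined rules, and keep the provenance
# of every derived conclusion so the result is inspectable.
from dataclasses import dataclass, field


@dataclass
class Node:
    fact: str
    rule: str = "hypothesis"               # rule that produced this fact
    premises: list = field(default_factory=list)


def derive(hypotheses, rules):
    """Forward-chain: `rules` maps frozenset(premises) -> (rule_name, conclusion)."""
    nodes = {h: Node(h) for h in hypotheses}
    changed = True
    while changed:
        changed = False
        for premises, (name, conclusion) in rules.items():
            # Fire a rule only if all premises are derived and the
            # conclusion is new; record which premise nodes it came from.
            if conclusion not in nodes and premises <= nodes.keys():
                nodes[conclusion] = Node(conclusion, name,
                                         [nodes[p] for p in premises])
                changed = True
    return nodes


def render(node, depth=0):
    """Pretty-print a derivation tree, one fact per line with its rule."""
    lines = ["  " * depth + f"{node.fact}  [{node.rule}]"]
    for p in node.premises:
        lines += render(p, depth + 1)
    return lines


# Invented example facts and rules, purely for illustration.
rules = {
    frozenset({"patient has fever", "patient has rash"}):
        ("r1", "suspect measles"),
    frozenset({"suspect measles"}):
        ("r2", "recommend serology test"),
}
nodes = derive({"patient has fever", "patient has rash"}, rules)
print("\n".join(render(nodes["recommend serology test"])))
```

In an actual RAG pipeline, a rendering like this could be serialized into the prompt so the model's final answer is constrained to follow the derivation rather than free-form generation; the paper's concrete prompt templates are in the repository above.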