Track: long paper (up to 10 pages)
Keywords: Legal Reasoning; Causal Reasoning; Key Legal Fact
TL;DR: We improve legal reasoning for judicial subjective questions by explicitly identifying key legal facts before retrieval and answer generation.
Abstract: LLMs are increasingly used for legal tasks, yet they still rely primarily on data-driven learning, associative pattern extraction, and probabilistic generation. While effective in open-domain question answering, this mechanism tends to treat background narrative, superficially relevant details, and legally decisive facts alike in judicial subjective questions, leading to misplaced reasoning focus, weak rule grounding, and unstable conclusions. Causal research suggests that association is not causation: compared with spurious associations induced by confounding or selection bias, causal relations are generally more interpretable, more robust, and more useful for decision-making. Motivated by this perspective, we propose a causal legal reasoning method for judicial subjective questions centered on key legal fact identification. Instead of generating answers directly from raw case descriptions, our framework decomposes legal reasoning into four components: legal fact extraction, key legal fact identification, rule grounding, and legal judgment generation. A fact is treated as a key legal fact if changing it, while holding the other core conditions approximately fixed, would alter the legal assessment or the final conclusion. This intermediate layer enables more targeted legal retrieval, norm application, and answer generation. To support the framework, we construct a task-oriented intermediate representation for judicial subjective questions comprising legal facts, key legal facts, rule references, and gold answers. Experiments on the CAIL2025 judicial subjective-question dataset show that the proposed framework achieves strong end-task performance across multiple backbone models. Ablation results further show that both key legal fact identification and retrieval grounding contribute substantially to judicial scoring.
These findings suggest that explicit fact-centered reasoning provides a feasible way to improve legal answer generation for complex judicial subjective questions.
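The four-stage decomposition and the counterfactual criterion for key legal facts can be sketched in code. This is a minimal illustration, not the paper's implementation: all function names (`extract_facts`, `identify_key_facts`, the toy `judge` and rule retriever) are hypothetical placeholders, and the judgment function stands in for an LLM-based assessor.

```python
# Hypothetical sketch of the four-stage pipeline from the abstract.
# All names and the toy judge/retriever are illustrative assumptions.

def extract_facts(case_text):
    # Stage 1: legal fact extraction (placeholder: sentence splitting).
    return [s.strip() for s in case_text.split(".") if s.strip()]

def identify_key_facts(facts, judge):
    # Stage 2: counterfactual test — a fact is "key" if dropping it,
    # while holding the remaining facts fixed, changes the assessment.
    key = []
    for f in facts:
        others = [g for g in facts if g != f]
        if judge(others) != judge(others + [f]):
            key.append(f)
    return key

def answer(case_text, judge, retrieve_rules):
    facts = extract_facts(case_text)               # Stage 1
    key_facts = identify_key_facts(facts, judge)   # Stage 2
    rules = retrieve_rules(key_facts)              # Stage 3: rule grounding
    return {"key_facts": key_facts, "rules": rules,
            "judgment": judge(facts)}              # Stage 4: judgment generation

# Toy stand-ins: liability turns solely on whether a breach occurred.
toy_judge = lambda fs: "liable" if any("breach" in f for f in fs) else "not liable"
toy_rules = lambda kfs: ["Contract Law Art. X (hypothetical)"] if kfs else []

out = answer("The parties signed a contract. The defendant committed a breach. It rained",
             toy_judge, toy_rules)
# The breach fact is identified as key; the contract and weather facts are not.
```

Under these toy stand-ins, only the breach sentence flips the assessment when removed, so it alone is retained for retrieval and rule grounding, mirroring how the intermediate layer is meant to focus downstream answer generation.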
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Submission Number: 188