Comprehensive is not perfect: Enhancing LLMs with Expert Notes

ACL ARR 2024 December Submission2341 Authors

16 Dec 2024 (modified: 05 Feb 2025) · ACL ARR 2024 December Submission · CC BY 4.0
Abstract: Large Language Models (LLMs) are increasingly employed in specialized fields, such as the legal domain, where expert knowledge is essential to overcome their inherent limitations. However, acquiring comprehensive expert knowledge is often costly and impractical. To mitigate this reliance, researchers have explored leveraging fragmented expert insights to help LLMs mimic expert reasoning. However, such approaches often lack the practical experience that experts provide. In this paper, we introduce a novel form of expert experience: Notes-type Knowledge. It is less formalized and precise, but is more accessible and contains the practical expertise often missing in LLMs. Focusing on the Four-element Theory (FET) in Chinese criminal law, we annotate notes-type knowledge of the four elements for 194 charges, and propose a Notes-guided LLM method to integrate LLMs with notes-type knowledge. Experiments on Similar Charge Disambiguation and Legal Case Retrieval tasks show that our approach outperforms plain LLMs and achieves performance comparable to that obtained with comprehensive expert knowledge.
Paper Type: Long
Research Area: NLP Applications
Research Area Keywords: NLP Applications
Contribution Types: Model analysis & interpretability, Approaches to low-resource settings, Data resources, Data analysis
Languages Studied: Chinese
Submission Number: 2341