Large language models (LLMs) exhibit remarkable text-generation capabilities yet struggle with factual consistency in knowledge-intensive tasks. Existing fact-checking methods based on the "Decompose-Then-Verify" paradigm improve factual reliability but face scalability issues due to two main limitations: (1) reliance on costly LLM API calls, and (2) quadratic complexity from pairwise verification of decomposed text segments. We present Light-FS, an efficient framework that adopts a "Decompose-Embed-Interact" paradigm: (1) a small language model (SLM) based decomposer extracts atomic propositions, (2) a specialized Bi-Encoder module generates semantic embeddings, and (3) a multi-feature interaction module performs embedding-based verification. Our experiments show that Light-FS achieves 14× faster decomposition than GPT-4o with less than a 3% drop in F1, while delivering a 20× efficiency gain over NLI-based fact-checking models at comparable verification performance. Light-FS provides a scalable and efficient solution for evaluating the factuality of LLM-generated content.
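To make the efficiency argument concrete, the sketch below illustrates the core idea behind the "Embed-Interact" stages: instead of running a pairwise NLI model over every (proposition, evidence) pair, each side is mapped once to a dense embedding and verification reduces to cheap vector similarity. This is a minimal illustration, not the paper's implementation: the `embed` function here is a hypothetical stand-in (a deterministic pseudo-encoder) for the trained Bi-Encoder, and the single cosine-similarity threshold stands in for the multi-feature interaction module.

```python
import zlib
import numpy as np

def embed(text: str, dim: int = 16) -> np.ndarray:
    """Hypothetical stand-in for the Bi-Encoder: maps text to a unit-norm
    vector. A real system would use a trained sentence encoder; here we
    derive a deterministic pseudo-embedding from a CRC32 of the text so
    the example is self-contained and reproducible."""
    rng = np.random.default_rng(zlib.crc32(text.encode("utf-8")))
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

def verify(propositions, evidence, threshold: float = 0.5) -> dict:
    """Embedding-based verification sketch: evidence is embedded once, then
    each atomic proposition is scored against all evidence vectors in a
    single matrix-vector product, replacing per-pair NLI inference."""
    ev = np.stack([embed(e) for e in evidence])  # shape: (num_evidence, dim)
    results = {}
    for p in propositions:
        sims = ev @ embed(p)  # cosine similarities (vectors are unit-norm)
        results[p] = float(sims.max()) >= threshold
    return results

props = ["Paris is the capital of France.", "The moon is made of cheese."]
evid = ["Paris is the capital of France.", "France's capital city is Paris."]
print(verify(props, evid))
```

Because evidence embeddings are computed once and reused, verification cost scales linearly in the number of propositions rather than quadratically in pairwise comparisons, which is the scalability gain the paradigm targets.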