Exploring Open-Domain Fact Verification of Scientific Claims: A Comparative Analysis of Knowledge Sources

Anonymous

16 Oct 2023
ACL ARR 2023 October Blind Submission
Readers: Everyone
Abstract: The increasing rate at which medical information and health claims are produced and shared online has highlighted the importance of efficient fact verification systems. The usual setting for this task in the literature assumes that the documents containing the evidence for a claim are already provided and annotated, or that systems operate over a limited corpus. While this helps improve the reading comprehension abilities of the developed systems, it renders them unrealistic for real-world settings, where knowledge sources with potentially millions of documents need to be queried to find relevant evidence. In this paper, we perform an array of experiments to test the performance of open-domain fact verification systems. We evaluate the final verdict prediction of systems on four established datasets of biomedical and health-related claims under different settings. While keeping the evidence sentence selection and label prediction parts of the pipeline constant, we perform document retrieval over three common knowledge sources (PubMed, Wikipedia, Google) using two different information retrieval techniques. We discuss the results, identify important challenges, outline common retrieval patterns, and provide promising future directions.
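To make the pipeline structure concrete, the following is a minimal sketch of the three-stage setup the abstract describes (document retrieval, evidence sentence selection, verdict prediction). Everything here is a simplified placeholder under stated assumptions: the TF-IDF-style overlap retriever, the sentence-overlap selector, and the negation heuristic for the verdict are hypothetical stand-ins, not the retrievers, knowledge sources, or models evaluated in the paper.

```python
# Sketch of an open-domain fact verification pipeline:
# (1) retrieve documents from a knowledge source,
# (2) select evidence sentences, (3) predict a verdict.
# All components are simplified, illustrative placeholders.
from dataclasses import dataclass
import math
import re


@dataclass
class Verdict:
    label: str       # "SUPPORTS", "REFUTES", or "NOT ENOUGH INFO"
    evidence: list   # selected evidence sentences


def tokenize(text):
    return re.findall(r"[a-z0-9]+", text.lower())


def retrieve(claim, corpus, k=3):
    """Rank documents by TF-IDF-weighted token overlap with the claim
    (a crude stand-in for the IR techniques compared in the paper)."""
    claim_toks = set(tokenize(claim))
    n_docs = len(corpus)
    df = {}
    for doc in corpus:
        for t in set(tokenize(doc)):
            df[t] = df.get(t, 0) + 1

    def score(doc):
        return sum(math.log(1 + n_docs / df[t])
                   for t in tokenize(doc) if t in claim_toks)

    return sorted(corpus, key=score, reverse=True)[:k]


def select_evidence(claim, docs, k=2):
    """Pick the sentences with the highest token overlap with the claim."""
    claim_toks = set(tokenize(claim))
    sents = [s for d in docs for s in re.split(r"(?<=[.!?])\s+", d) if s]
    return sorted(sents,
                  key=lambda s: len(claim_toks & set(tokenize(s))),
                  reverse=True)[:k]


def predict_label(claim, evidence):
    """Placeholder verdict prediction; a real system would use a trained
    claim-verification / NLI model at this step."""
    if not evidence:
        return "NOT ENOUGH INFO"
    negations = {"not", "no", "never", "cannot"}
    refutes = any(negations & set(tokenize(s)) for s in evidence)
    return "REFUTES" if refutes else "SUPPORTS"


def verify(claim, corpus):
    docs = retrieve(claim, corpus)
    evidence = select_evidence(claim, docs)
    return Verdict(predict_label(claim, evidence), evidence)


if __name__ == "__main__":
    # Toy corpus standing in for a large knowledge source such as PubMed.
    corpus = [
        "Vitamin C supplementation does not prevent the common cold "
        "in the general population.",
        "Regular physical activity lowers the risk of cardiovascular disease.",
        "The Eiffel Tower is located in Paris.",
    ]
    print(verify("Vitamin C prevents the common cold.", corpus))
```

In the experimental setup described above, only the `retrieve` stage would change across knowledge sources and IR techniques, while evidence selection and label prediction stay fixed.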
Paper Type: long
Research Area: NLP Applications
Contribution Types: NLP engineering experiment, Data analysis
Languages Studied: English
Consent To Share Submission Details: On behalf of all authors, we agree to the terms above to share our submission details.