Time-Aware Representation Learning for Time-Sensitive Question Answering

Anonymous

17 Apr 2023 · ACL ARR 2023 April Blind Submission · Readers: Everyone
Abstract: Time is a crucial factor in real-world question answering (QA). However, language models struggle to understand the relationships between time specifiers, such as `after' and `before', and numbers, because existing QA datasets contain too few time expressions. To address this issue, we propose a Time-Context dependent Span Extraction (TCSE) task and a time-context dependent data generation framework for model training. Moreover, we present a metric that uses TCSE to evaluate the time awareness of a QA model. Each TCSE instance consists of a question and four candidate sentences generated from a pre-defined template; each candidate is correct or incorrect with respect to time and with respect to context. The model is trained to extract the answer span from the sentence that is correct in both time and context. A model trained with TCSE outperforms baseline models by up to 6.97 F1 points on the TimeQA dataset.
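The four-candidate construction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' actual generation framework: the template string, field names, and example facts are all hypothetical, chosen only to show how the four (time, context) correctness combinations arise.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    sentence: str
    time_correct: bool     # does the sentence use the time matching the question?
    context_correct: bool  # does the sentence use the context matching the question?

def make_tcse_candidates(subject, true_role, distractor_role, true_year, distractor_year):
    """Build four candidates from a hypothetical template, covering every
    combination of correct/incorrect time and correct/incorrect context."""
    candidates = []
    for role, ctx_ok in ((true_role, True), (distractor_role, False)):
        for year, time_ok in ((true_year, True), (distractor_year, False)):
            sentence = f"{subject} was the {role} of the company in {year}."
            candidates.append(Candidate(sentence, time_ok, ctx_ok))
    return candidates

# Example: question "Who was the CEO of the company in 2010?"
cands = make_tcse_candidates("Alice", "CEO", "CFO", 2010, 2015)
# Exactly one candidate is correct in both time and context; the model
# is trained to extract the answer span from that sentence only.
gold = [c for c in cands if c.time_correct and c.context_correct]
```

Under this sketch, the remaining three candidates serve as hard negatives that differ from the gold sentence only in the time expression, the context, or both.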
Paper Type: short
Research Area: Question Answering