IBERT: Idiom Cloze-Style Reading Comprehension with Attention
Keywords: BERT, Question Answering
Abstract: Idioms are fixed expressions, often derived from stories, that frequently appear in informal communication and literature. Their meanings are typically non-compositional, making the idiom cloze task a challenging problem in Natural Language Processing (NLP). We propose a BERT-based embedding Seq2Seq framework that models idiomatic phrases while jointly capturing local and global context: XLNet serves as the encoder and RoBERTa selects the most plausible idiom. Experiments on the EPIE Static Corpus show that our approach outperforms existing state-of-the-art methods.
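The selection step the abstract describes (an encoder produces context representations, then a second model picks the most plausible idiom for the blank) can be sketched in miniature. The snippet below is a hypothetical illustration only: it fills a cloze blank with each candidate idiom and keeps the one a scorer ranks highest. The scorer here is a simple context-overlap stand-in, not the RoBERTa model the paper actually uses; all names (`fill_cloze`, `score`, the `___` blank marker) are illustrative.

```python
# Hypothetical sketch of idiom cloze candidate selection. The real system
# scores candidates with a pretrained language model (RoBERTa, per the
# abstract); here a word-overlap scorer stands in so the example is
# self-contained.

def score(sentence: str, context_words: set) -> float:
    # Stand-in scorer: fraction of sentence words also seen in the context.
    words = sentence.lower().split()
    return sum(w in context_words for w in words) / len(words)

def fill_cloze(template: str, candidates: list, context: str) -> str:
    # Insert each candidate idiom into the blank and keep the best-scoring one.
    context_words = set(context.lower().split())
    filled = [template.replace("___", c) for c in candidates]
    best = max(range(len(candidates)),
               key=lambda i: score(filled[i], context_words))
    return candidates[best]
```

For example, `fill_cloze("He decided to ___ and tell the truth.", ["come clean", "hit the sack"], "after lying he wanted to tell the truth and come clean")` selects `"come clean"`, because that filled sentence overlaps the context more. Swapping the stand-in scorer for per-candidate language-model likelihoods recovers the shape of the framework the abstract describes.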
Email Sharing: We authorize the sharing of all author emails with Program Chairs.
Data Release: We authorize the release of our submission and author names to the public in the event of acceptance.
Submission Number: 88