EventBERT

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: Pre-trained language models (PrLMs) have shown impressive performance in natural language understanding. However, they mainly rely on extracting context-sensitive statistical patterns without explicitly modeling linguistic information such as the semantic relationships entailed in natural language. In this work, we propose EventBERT, an event-based semantic representation model that takes BERT as its backbone and refines it with event-based structural semantics through a graph convolutional network. EventBERT benefits simultaneously from the rich event-based structures embodied in the graph and the contextual semantics learned by the pre-trained BERT model. Experimental results on the GLUE benchmark demonstrate the model's effectiveness.
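The abstract does not specify implementation details, but the described architecture (a BERT backbone whose token representations are refined by a graph convolution over an event-based graph) can be sketched minimally in PyTorch. The class names, the normalized event-graph adjacency input `event_adj`, and the residual fusion of GCN output with BERT's contextual states below are all illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class EventGCNLayer(nn.Module):
    """One graph-convolution layer over an event graph (illustrative sketch)."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.linear = nn.Linear(hidden_size, hidden_size)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj: (batch, seq_len, seq_len) normalized adjacency built from
        # event-based structures; h: (batch, seq_len, hidden) token states.
        return torch.relu(adj @ self.linear(h))


class EventBERTSketch(nn.Module):
    """BERT backbone refined with event-based structural semantics via a GCN.

    A hypothetical reconstruction of the architecture described in the
    abstract; the fusion scheme is an assumption.
    """

    def __init__(self, num_labels: int, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.gcn = EventGCNLayer(hidden)
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask, event_adj):
        # Contextual token representations from the pre-trained backbone.
        h = self.bert(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        # Refine with event-based structural semantics; the residual
        # connection is an assumed fusion choice.
        h = h + self.gcn(h, event_adj)
        # Classify from the [CLS] position, as is standard for GLUE tasks.
        return self.classifier(h[:, 0])
```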