GauSE: Gaussian Enhanced Self-Attention for Event Extraction

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission · Readers: Everyone
Abstract: Event Extraction (EE) has benefited from pre-trained language models (PLMs), whose self-attention mechanism attends to the global relationships between triggers/arguments and context words to enhance performance. However, existing PLM-based methods fall short at capturing local, trigger/argument-specific knowledge. To this end, we propose a Gaussian enhanced Self-attention Event extraction framework (GauSE), which, for the first time, models the syntax-related local information of a trigger/argument as a Gaussian bias, directing more attention to the syntactic scope of the local region. Furthermore, existing methods rarely account for multiple occurrences of the same trigger/argument in EE. We explore global interaction strategies that fuse the locality distributions of repeated triggers/arguments, capturing more latent information scopes. Compared with traditional GCN-based models, our method introduces syntactic relationships without the over-smoothing problem of deep GCN layers. Experiments on EE datasets demonstrate the effectiveness and generalization of our proposed approach.
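The core idea of adding a Gaussian locality bias to self-attention can be sketched as follows. This is a minimal illustration, not the paper's implementation: the center index and the scope width `sigma` are assumed here to be given (the paper presumably derives the scope from syntax and may learn it), and plain NumPy stands in for a PLM's attention layer.

```python
import numpy as np

def gaussian_biased_attention(scores, center, sigma=2.0):
    """Add a Gaussian locality bias centered on a trigger/argument
    position to raw attention logits, then apply softmax over keys.

    scores: (seq_len, seq_len) raw attention logits
    center: index of the trigger/argument token (assumed known here)
    sigma:  width of the local scope (hypothetical fixed value)
    """
    seq_len = scores.shape[-1]
    positions = np.arange(seq_len)
    # Bias is 0 at the center token and increasingly negative with
    # distance, so nearby tokens receive relatively more attention.
    bias = -((positions - center) ** 2) / (2.0 * sigma ** 2)
    biased = scores + bias  # broadcast the bias across all query rows
    # Numerically stable softmax over the key dimension
    e = np.exp(biased - biased.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)
```

With uniform logits, the resulting attention mass concentrates around the chosen center; fusing the biases of multiple occurrences of the same trigger/argument (e.g., summing or mixing several such Gaussians) would be one way to realize the global interaction strategies the abstract describes.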