Abstract: Relational triple extraction is challenging because it requires capturing the rich correlations between entities and relations. Existing works suffer from 1) heterogeneous representations of entities and relations, and 2) heterogeneous modeling of entity-entity interactions and entity-relation interactions; as a result, these rich correlations are not fully exploited. In this paper, we propose UniRel to address these challenges. Specifically, we unify the representations of entities and relations by jointly encoding them within a concatenated natural language sequence, and we unify the modeling of interactions with a proposed Interaction Map, which is built upon the off-the-shelf self-attention mechanism within any Transformer block. With comprehensive experiments on two popular relational triple extraction datasets, we demonstrate that UniRel is more effective and computationally efficient. The source code is available at https://github.com/wtangdev/UniRel.
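To make the core idea concrete, the following is a minimal sketch, not the authors' implementation: the input sentence is jointly encoded with verbalized relation names in one concatenated sequence, and the token-token self-attention scores of a Transformer encoder are read as a unified Interaction Map covering both entity-entity and entity-relation pairs. The model name, the example relation verbalizations, and the choice of averaging the last layer's attention heads are illustrative assumptions; see the repository above for the actual method.

```python
# Sketch of an attention-based Interaction Map (illustrative assumptions, not UniRel itself).
import torch
from transformers import BertTokenizerFast, BertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
model = BertModel.from_pretrained("bert-base-cased", output_attentions=True)

sentence = "Barack Obama was born in Honolulu ."
relations = ["birth place", "work place"]  # hypothetical verbalized relation names

# Jointly encode entities (sentence tokens) and relations (relation tokens)
# in a single concatenated natural-language sequence.
inputs = tokenizer(sentence, " ".join(relations), return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Read interactions off the self-attention scores: attentions[-1] has shape
# (batch, heads, seq_len, seq_len); average the heads of the last layer.
interaction_map = outputs.attentions[-1].mean(dim=1)[0]  # (seq_len, seq_len)

# A high score at (i, j) is interpreted as an interaction between token i and
# token j, treating entity-entity and entity-relation pairs uniformly.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
print(tokens)
print(interaction_map.shape)
```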