Mention Memory: incorporating textual knowledge into Transformers through entity mention attention

ICLR 2022 Poster. Submitted 29 Sept 2021, last edited 15 Mar 2022.