Enhancing Entity Linking with Contextualized Entity Embeddings

NLPCC (2) 2022 (modified: 07 Nov 2022)
Abstract: Entity linking (EL) in written-language domains has been extensively studied, but EL of spoken language remains largely unexplored. We propose a conceptually simple and highly effective two-stage approach to tackle this issue. The first stage retrieves candidates with a dual encoder, which independently encodes the mention context and entity descriptions. Each candidate is then reranked by a LUKE-based cross-encoder, which concatenates the mention and the entity description. Unlike previous cross-encoders, which take only words as input, our model also adds entities to the input. Experiments demonstrate that our model does not need large-scale training on a Wikipedia corpus and outperforms all previous models, with or without Wikipedia training. Our approach ranks 1st in the NLPCC 2022 Shared Task on Speech EL, Track 2 (Entity Disambiguation-Only).
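The two-stage pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the encoders are stand-ins (here, plain vectors and a caller-supplied scoring function rather than the actual dual encoder and LUKE cross-encoder), and all names are hypothetical.

```python
import numpy as np

def retrieve_candidates(mention_vec, entity_vecs, k=3):
    """Stage 1: dual-encoder retrieval.

    Mention context and entity descriptions are encoded independently,
    so entity embeddings can be precomputed once and candidates found
    by dot-product similarity at query time.
    """
    scores = entity_vecs @ mention_vec
    return np.argsort(-scores)[:k]  # indices of the top-k entities

def rerank(mention_text, candidate_ids, entity_descs, cross_score):
    """Stage 2: cross-encoder reranking.

    The mention and each candidate's description are concatenated into
    one sequence and scored jointly; the best-scoring entity wins.
    `cross_score` stands in for the LUKE-based cross-encoder.
    """
    scored = [(cross_score(mention_text + " [SEP] " + entity_descs[i]), i)
              for i in candidate_ids]
    return max(scored)[1]
```

The design point is the asymmetry between the stages: the dual encoder trades accuracy for precomputable embeddings and fast top-k search over the full entity set, while the expensive joint (cross-encoder) scoring is applied only to the short candidate list.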