Pre-training Tasks for Embedding-based Large-scale Retrieval

ICLR 2020
Abstract: We consider large-scale retrieval problems such as question answering retrieval and present a comprehensive study of how different sentence-level pre-training tasks improve on BERT-style token-level pre-training for two-tower Transformer models.
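The abstract mentions the two-tower (dual-encoder) Transformer architecture for embedding-based retrieval. The sketch below is an illustration only, not the paper's code: it shows a generic two-tower retriever in which a query encoder and a document encoder each produce a fixed-size embedding, relevance is scored by their inner product, and training uses an in-batch softmax loss. The small stand-in encoder, dimensions, and loss choice are assumptions for illustration; the paper uses BERT-style Transformers as the towers.

```python
# Minimal sketch of a two-tower retriever (assumed setup, not the paper's implementation).
import torch
import torch.nn as nn

class Tower(nn.Module):
    """Illustrative encoder: token embeddings -> Transformer layers -> mean pooling."""
    def __init__(self, vocab_size=30522, dim=128, num_layers=2, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.proj = nn.Linear(dim, dim)

    def forward(self, token_ids):                 # token_ids: (batch, seq_len)
        h = self.encoder(self.embed(token_ids))   # (batch, seq_len, dim)
        return self.proj(h.mean(dim=1))           # (batch, dim) pooled embedding

class TwoTowerRetriever(nn.Module):
    """Separate query and document towers; relevance = inner product of embeddings."""
    def __init__(self):
        super().__init__()
        self.query_tower = Tower()
        self.doc_tower = Tower()

    def forward(self, query_ids, doc_ids):
        q = self.query_tower(query_ids)            # (batch, dim)
        d = self.doc_tower(doc_ids)                # (batch, dim)
        return q @ d.T                             # (batch, batch) similarity matrix

# In-batch softmax loss: the positive document for each query sits at the same batch index.
model = TwoTowerRetriever()
queries = torch.randint(0, 30522, (8, 16))
docs = torch.randint(0, 30522, (8, 64))
scores = model(queries, docs)
loss = nn.functional.cross_entropy(scores, torch.arange(8))
```

Because the document embeddings are independent of the query, they can be precomputed and indexed for large-scale nearest-neighbor search, which is the practical motivation for the two-tower design.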