Image BERT Pre-training with Online Tokenizer

ICLR 2022 (modified: 10 Nov 2022)
Abstract: The success of language Transformers is primarily attributed to the pretext task of masked language modeling (MLM), where texts are first tokenized into semantically meaningful pieces. In this work...
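
For context on the masked-modeling pretext task the abstract refers to, the sketch below illustrates generic MLM-style input corruption: a fraction of token ids is replaced with a mask id and the model is trained to recover the originals. The mask id, masking ratio, and example ids are illustrative assumptions, not details taken from this paper.

import random

MASK_ID = 103        # assumed [MASK] id (illustrative; vocabulary-dependent)
MASK_PROB = 0.15     # commonly used MLM masking ratio (assumption, not from this paper)

def mask_tokens(token_ids, mask_prob=MASK_PROB, mask_id=MASK_ID, seed=None):
    """Replace a random subset of token ids with the mask id.

    Returns the corrupted sequence and per-position labels: the original
    id at masked positions, -100 (ignored by the loss) elsewhere.
    """
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tid in token_ids:
        if rng.random() < mask_prob:
            corrupted.append(mask_id)   # hide this token from the model
            labels.append(tid)          # the model must predict the original id
        else:
            corrupted.append(tid)
            labels.append(-100)         # position ignored by the MLM loss
    return corrupted, labels

# Usage: token ids for an already-tokenized sentence (illustrative values)
ids = [7592, 1010, 2088, 999]
print(mask_tokens(ids, seed=0))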