Relative Positional Encoding for Transformers with Linear Complexity

ICML 2021
Abstract: Recent advances in Transformer models allow for unprecedented sequence lengths, thanks to linear space and time complexity. Meanwhile, relative positional encoding (RPE) was proposed as benefici...