Relative Positional Encoding for Transformers with Linear Complexity
Antoine Liutkus, Ondřej Cífka, Shih-Lun Wu, Umut Şimşekli, Yi-Hsuan Yang, Gaël Richard
ICML 2021 (modified: 21 Sept 2022)
Abstract:
Recent advances in Transformer models allow for unprecedented sequence lengths, due to linear space and time complexity. In the meantime, relative positional encoding (RPE) was proposed as beneficial...
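For background on the tension the abstract points at: in one widespread classical variant of RPE, a learnable scalar bias indexed by the relative offset j - i is added to the attention logits, which requires materializing the full n x n score matrix, and that quadratic cost is precisely what linear-complexity attention avoids. Below is a minimal NumPy sketch of that classical quadratic form only, not of this paper's linear-complexity method; the function name, shapes, and bias parameterization are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_relative_bias(Q, K, V, rel_bias):
    """Softmax attention with an additive relative positional bias.

    Q, K, V: (n, d) query/key/value matrices for a single head.
    rel_bias: (2n - 1,) learnable scalars indexed by the offset j - i,
              gathered into an (n, n) bias matrix added to the logits.
    Note the explicit (n, n) matrices: this is the quadratic cost that
    linear-complexity attention variants are designed to avoid.
    """
    n, d = Q.shape
    logits = Q @ K.T / np.sqrt(d)             # (n, n) content scores
    # Offset j - i spans [-(n-1), n-1]; shift by n-1 to index rel_bias.
    idx = np.arange(n)[None, :] - np.arange(n)[:, None] + (n - 1)
    logits = logits + rel_bias[idx]           # inject relative positions
    return softmax(logits, axis=-1) @ V       # (n, d)

# Toy usage with random parameters (hypothetical sizes).
rng = np.random.default_rng(0)
n, d = 6, 4
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
rel_bias = rng.standard_normal(2 * n - 1)
out = attention_with_relative_bias(Q, K, V, rel_bias)
print(out.shape)  # (6, 4)
```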