What Dense Graph Do You Need for Self-Attention?
Yuxin Wang, Chu-Tak Lee, Qipeng Guo, Zhangyue Yin, Yunhua Zhou, Xuanjing Huang, Xipeng Qiu
ICML 2022 (modified: 18 Apr 2023)
Abstract:
Transformers have made progress in miscellaneous tasks, but suffer from quadratic computational and memory complexities. Recent works propose sparse transformers with attention on sparse graphs to ...
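The abstract contrasts full self-attention, whose score matrix costs quadratic time and memory in the sequence length, with sparse transformers that restrict attention to the edges of a sparse graph. A minimal sketch of that idea, using a generic boolean adjacency mask over attention scores (the banded "sliding-window" pattern below is one illustrative sparsity graph, not necessarily the one this paper proposes):

```python
import numpy as np

def attention(Q, K, V, mask=None):
    """Scaled dot-product attention; `mask` (True = keep edge) sparsifies the scores."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)              # (n, n) matrix: the quadratic cost
    if mask is not None:
        scores = np.where(mask, scores, -np.inf)  # drop non-edges of the attention graph
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Illustrative banded sparsity pattern: token i may only attend to j with |i - j| <= w.
n, d, w = 8, 4, 2
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
band = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :]) <= w
out = attention(Q, K, V, mask=band)
print(out.shape)  # (8, 4)
```

A dedicated sparse kernel would skip the masked entries entirely rather than materialize the full score matrix; the dense-then-mask form here is only to show which entries the sparse graph keeps.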