MLP-Attention: Improving Transformer Architecture with MLP Attention Weights
Alireza Morsali, Moein Heidari, Samin Heydarian, Tohid Abedini
Published: 01 Jan 2023, Last Modified: 05 Nov 2023
Tiny Papers @ ICLR 2023