How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers
Michael Hassid, Hao Peng, Daniel Rotem, Jungo Kasai, Ivan Montero, Noah A. Smith, Roy Schwartz
Published: 01 Jan 2022, Last Modified: 12 May 2023
EMNLP 2022 (Findings)