Fine-grained Local Sensitivity Analysis of Standard Dot-Product Self-Attention

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Self-attention, Vision Transformers (ViT), Local Sensitivity
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose a fine-grained local sensitivity analysis of dot-product self-attention with respect to $\ell_2$-bounded input perturbations.
Abstract: Self-attention has been widely used in various machine learning models, such as vision transformers. The standard dot-product self-attention is arguably the most popular structure, and there is a growing interest in understanding the mathematical properties of such attention mechanisms. This paper presents a fine-grained local sensitivity analysis of the standard dot-product self-attention. Despite the well-known fact that dot-product self-attention is not (globally) Lipschitz, we develop new theoretical local bounds quantifying the effect of input feature perturbations on the attention output. Utilizing mathematical techniques from optimization and matrix theory, our analysis reveals that the local sensitivity of dot-product self-attention to $\ell_2$ perturbations can actually be controlled by several key quantities associated with the attention weight matrices and the unperturbed input. We empirically validate our theoretical findings through several examples, offering new insights for achieving low sensitivity in dot-product self-attention against $\ell_2$ input perturbations.
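The abstract studies how much the output of standard dot-product self-attention can move when the input features are perturbed within an $\ell_2$ ball. The following is a minimal illustrative sketch, not the paper's method: it implements single-head dot-product self-attention with hypothetical weight matrices W_Q, W_K, W_V and empirically probes the local sensitivity around a fixed unperturbed input by measuring the output change relative to random perturbations of small norm (here the Frobenius norm of the flattened input plays the role of the $\ell_2$ norm).

```python
# Minimal sketch (not the authors' code): empirically probe the local sensitivity of
# standard dot-product self-attention to small l2-bounded input perturbations.
import torch

torch.manual_seed(0)

n, d = 8, 16                      # sequence length, embedding dimension (hypothetical)
W_Q = torch.randn(d, d) / d**0.5  # hypothetical attention weight matrices
W_K = torch.randn(d, d) / d**0.5
W_V = torch.randn(d, d) / d**0.5

def attention(X):
    """Standard dot-product self-attention: softmax(X W_Q (X W_K)^T / sqrt(d)) X W_V."""
    scores = (X @ W_Q) @ (X @ W_K).T / d**0.5
    return torch.softmax(scores, dim=-1) @ (X @ W_V)

X = torch.randn(n, d)             # unperturbed input features
eps = 1e-3                        # radius of the l2-bounded perturbation

# Ratio ||f(X + Delta) - f(X)|| / ||Delta|| over random perturbations of norm eps:
# an empirical lower bound on the local Lipschitz constant of attention around X.
ratios = []
for _ in range(1000):
    Delta = torch.randn(n, d)
    Delta = eps * Delta / Delta.norm()
    ratios.append(((attention(X + Delta) - attention(X)).norm() / Delta.norm()).item())

print(f"max empirical local sensitivity around X: {max(ratios):.4f}")
```

Such an empirical probe only lower-bounds the local sensitivity around one input; the paper's contribution is theoretical upper bounds expressed in terms of the attention weight matrices and the unperturbed input.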
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8242