Highlights
• An efficient attention layer for images based on approximate nearest-neighbor search.
• Our layer has log-linear computational complexity and linear memory complexity.
• In contrast, standard attention has quadratic computational and memory complexity.
• We detail how differentiability is preserved despite the use of nearest neighbors.
• The low memory cost permits adapting SOTA editing algorithms to high-resolution images.
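The core idea named in the highlights, attention restricted to each query's nearest keys, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses exact brute-force neighbor search for clarity, whereas the paper's layer obtains its log-linear complexity from an approximate nearest-neighbor index; the function name and shapes are assumptions for the example.

```python
import numpy as np

def knn_attention(Q, K, V, k):
    """Sketch of nearest-neighbor attention: each query attends only to its
    k closest keys instead of all n, so softmax and memory scale with k.
    A real layer would locate the neighbors with an approximate
    nearest-neighbor index; here we use exact search for clarity."""
    # Squared Euclidean distance between every query and key: (n_q, n_k)
    d = ((Q[:, None, :] - K[None, :, :]) ** 2).sum(-1)
    # Indices of the k nearest keys per query: (n_q, k)
    idx = np.argpartition(d, k - 1, axis=1)[:, :k]
    # Dot-product scores restricted to the selected keys: (n_q, k)
    scores = np.take_along_axis(Q @ K.T, idx, axis=1)
    # Softmax over only the k retained keys
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    # Weighted sum of the corresponding values: (n_q, d_v)
    return np.einsum('qk,qkd->qd', w, V[idx])

rng = np.random.default_rng(0)
Q = rng.normal(size=(8, 4))
K = rng.normal(size=(16, 4))
V = rng.normal(size=(16, 4))
out = knn_attention(Q, K, V, k=4)
print(out.shape)  # (8, 4)
```

Because every operation above (top-k selection, gather, softmax) is expressed through differentiable or index-based array ops, gradients flow to the selected keys and values, which is the property the fourth highlight refers to.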