BinaryFormer: 1-bit self-attention for long-range transformers in medical image segmentation and 3D diffusion models
Keywords: self-attention, differentiable binarisation, diffusion, energy efficiency
TL;DR: A differentiable binarisation scheme for 1-bit training and inference of self-attention for very long sequences is proposed and evaluated.
Abstract: Vision transformers excel at capturing long-range interactions and have become essential for many medical image analysis tasks. Their computational cost, however, grows quadratically with sequence length, which is problematic for certain 3D problems, e.g. high-resolution diffusion models that require dozens of sampling steps. Flash attention alleviates some of this by optimising memory access, but leaves the computational burden high. Quantising weights and activations of convolutions, and even fully binary networks, are possible, but such models have to be trained at higher precision and often suffer performance drops. For transformers, recent studies have been limited to quantising the weights of linear layers or exploiting sparsity in self-attention scores.
We present a novel scheme that not only enables binary-precision computation of self-attention at inference time but also extends this to the training of transformers. To achieve differentiability, we combine the bitwise Hamming distance with a learnable scalar query and key weighting. In theory this yields a 16-32x improvement in resource efficiency, both in arithmetic operations and in memory bandwidth. We evaluate our model on three tasks with sequence lengths of N > 1000: image classification without patch embedding, semantic 2D MRI segmentation, and 3D high-resolution diffusion models for inpainting and synthesis. Our results demonstrate competitive performance, and we provide an intuitive explanation of the effectiveness of differentiable key and query weighting through Bernoulli sampling and distance interpolation. https://github.com/mattiaspaul/BinaryFormer
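To illustrate the core idea behind Hamming-distance attention, the sketch below (a hypothetical NumPy mock-up, not the authors' implementation) uses the identity that the dot product of two ±1 vectors of dimension d equals d minus twice their bitwise Hamming distance, so attention scores can in principle be computed with XOR and popcount instead of floating-point multiplication. The learnable scalar `alpha` stands in for the query/key weighting described in the abstract and is an assumed simplification.

```python
import numpy as np

def binary_attention(Q, K, V, alpha=1.0):
    """Hypothetical sketch of 1-bit self-attention scores.

    Q, K: (N, d) real-valued queries/keys, binarised to {-1, +1}.
    V:    (N, d) values (kept full precision here for simplicity).
    alpha: assumed learnable scalar query/key weighting.
    """
    # Binarise queries and keys; sign(0) is mapped to +1.
    Qb = np.where(Q >= 0, 1.0, -1.0)
    Kb = np.where(K >= 0, 1.0, -1.0)
    d = Q.shape[-1]

    # For ±1 vectors: Qb @ Kb.T == d - 2 * Hamming(bits(Qb), bits(Kb)),
    # so on hardware this matmul reduces to XOR + popcount.
    scores = alpha * (Qb @ Kb.T) / np.sqrt(d)

    # Standard softmax over keys (numerically stabilised).
    scores -= scores.max(axis=-1, keepdims=True)
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ V
```

During training, the non-differentiable sign step would need a surrogate gradient (e.g. a straight-through estimator or the Bernoulli-sampling interpretation the abstract alludes to); this sketch only shows the inference-time arithmetic.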
Primary Subject Area: Generative Models
Secondary Subject Area: Segmentation
Paper Type: Methodological Development
Registration Requirement: Yes
Reproducibility: https://github.com/mattiaspaul/BinaryFormer
Submission Number: 120