Packing: Towards 2x NLP BERT Acceleration

21 May 2021 (modified: 22 Oct 2023) · NeurIPS 2021 Submitted · Readers: Everyone
Keywords: deep learning, BERT, IPU, GPU, hardware-acceleration, padding, Wikipedia, NLP
TL;DR: We eliminate padding overhead in BERT pretraining with our proposed packing algorithms, which combine sequences to achieve a 2x speed-up on the IPU.
Abstract: We find that, at sequence length 512, padding tokens represent in excess of $50\%$ of the Wikipedia dataset used for pretraining BERT (Bidirectional Encoder Representations from Transformers). Therefore, by removing all padding, we achieve a 2x speed-up in terms of sequences/sec. To exploit this characteristic of the dataset, we develop and contrast two deterministic packing algorithms. Both algorithms rely on the assumption that sequences are interchangeable, so packing can be performed on the histogram of sequence lengths rather than per sample. This transformation of the problem leads to algorithms which are fast and have linear complexity in dataset size. The shortest-pack-first histogram-packing (SPFHP) algorithm determines the packing order for the Wikipedia dataset of over $16$M sequences in $0.02$ seconds. The non-negative least-squares histogram-packing (NNLSHP) algorithm converges in $28.4$ seconds but produces solutions which are more depth-efficient, achieving near-optimal packing by combining at most $3$ sequences in one sample. Using the dataset with multiple sequences per sample requires additional masking in the attention layer and a modification of the masked-language-modelling (MLM) loss function. We demonstrate that both of these changes are straightforward to implement and have relatively little impact on the achievable performance gain on modern hardware. Finally, we pretrain BERT-Large using the packed dataset, demonstrating no loss of convergence and the desired 2x speed-up.
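
To make the shortest-pack-first idea concrete, here is a minimal sketch of SPFHP in Python. The function name `spfhp` and the dict-based histogram representation are illustrative choices, not the authors' released code; this simplified variant also scans open packs per sequence for clarity, whereas the paper's version operates on histogram counts directly, which is what yields its linear complexity in dataset size.

```python
def spfhp(histogram, max_len=512):
    """Sketch of shortest-pack-first histogram packing (SPFHP).

    `histogram` maps sequence length -> count of sequences of that
    length. Because sequences are treated as interchangeable, the
    histogram is all the information the packer needs. Returns a list
    of packs, each a list of sequence lengths summing to <= max_len.
    """
    packs = []  # each entry: [remaining_space, [lengths in this pack]]
    for length in range(max_len, 0, -1):  # place longest sequences first
        for _ in range(histogram.get(length, 0)):
            # Shortest-pack-first: choose the open pack with the least
            # remaining space that still fits this sequence.
            best = None
            for pack in packs:
                if length <= pack[0] and (best is None or pack[0] < best[0]):
                    best = pack
            if best is None:
                packs.append([max_len - length, [length]])  # open a new pack
            else:
                best[0] -= length
                best[1].append(length)
    return [pack[1] for pack in packs]


# Toy usage: two full-length sequences keep their own packs, the rest
# are combined, e.g. [[512], [512], [300, 200], [200, 100, 100, 100]].
print(spfhp({512: 2, 300: 1, 200: 2, 100: 3}))
```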
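The "additional masking in the attention layer" the abstract mentions amounts to a block-diagonal self-attention mask, so tokens from different packed sequences cannot attend to each other. The sketch below (the helper name `packed_attention_mask` is hypothetical) shows one way to build such a mask from the per-pack sequence lengths; the MLM loss similarly needs per-sequence bookkeeping so each packed sequence contributes as if it had been processed unpacked.

```python
import numpy as np

def packed_attention_mask(seq_lens, max_len=512):
    """Sketch of a block-diagonal attention mask for one packed sample.

    `seq_lens` lists the lengths of the sequences packed into the
    sample. Tokens may only attend within their own sequence, and any
    trailing padding attends to nothing.
    """
    seq_ids = np.full(max_len, -1, dtype=np.int32)  # -1 marks padding
    offset = 0
    for i, n in enumerate(seq_lens):
        seq_ids[offset:offset + n] = i
        offset += n
    # mask[q, k] is True where query token q may attend to key token k:
    # same sequence id, and not padding.
    return (seq_ids[:, None] == seq_ids[None, :]) & (seq_ids[:, None] != -1)

# e.g. packed_attention_mask([300, 200], max_len=512) yields two
# diagonal blocks of True (300x300 and 200x200) and False elsewhere.
```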
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
Supplementary Material: zip
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2107.02027/code)