DLCNet: Enabling Long-Range Convolution with Data Dependency

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Supplementary Material: zip
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Long-Range, Convolution, Data-Dependent, LLM
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: Data-Dependent Long-Range Convolution
Abstract: In recent years, the Transformer architecture and its self-attention mechanism have become the first choice for sequence modeling, but they face significant computational challenges when processing long sequences. Long-range convolution has emerged as a promising alternative to self-attention, offering distinct advantages across various domains. However, current long-range convolution architectures still face several issues, such as excessive parameter usage and limited in-context learning capability. To tackle these challenges, we propose a Data-dependent Long-range Convolution Network (DLCNet) that introduces data dependency through three key modules: Layer-Wise Mapping, Rectify SideNet, and the SWEAP Operator. DLCNet enables in-context learning at a reasonable parameter scale. Extensive experiments demonstrate that DLCNet surpasses state-of-the-art baselines on long sequences, even when trained only on short sequences.
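The abstract names three modules (Layer-Wise Mapping, Rectify SideNet, SWEAP Operator) but does not define them here, so the sketch below is only a rough illustration of the general idea of a data-dependent long-range convolution, not the paper's actual method: it combines an FFT-based global (sequence-length) convolution with an input-conditioned gate. All class and parameter names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DataDependentLongConv(nn.Module):
    """Illustrative sketch only: a global per-channel convolution whose
    output is modulated by the input. The paper's Layer-Wise Mapping,
    Rectify SideNet, and SWEAP Operator are not specified in the abstract,
    so this shows the generic concept, not DLCNet itself."""

    def __init__(self, d_model: int, max_len: int):
        super().__init__()
        # One explicit global kernel per channel (a common long-conv setup).
        self.kernel = nn.Parameter(torch.randn(d_model, max_len) * 0.02)
        # Data-dependent gate: maps each token to a per-channel modulation.
        self.gate = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        B, L, D = x.shape
        k = self.kernel[:, :L]                       # (D, L)
        # FFT convolution, zero-padded to 2L so it is linear, not circular.
        n = 2 * L
        X = torch.fft.rfft(x.transpose(1, 2), n=n)   # (B, D, n//2 + 1)
        K = torch.fft.rfft(k, n=n)                   # (D, n//2 + 1)
        y = torch.fft.irfft(X * K, n=n)[..., :L]     # (B, D, L)
        y = y.transpose(1, 2)                        # (B, L, D)
        # Inject data dependency: gate the convolution output per token.
        return y * torch.sigmoid(self.gate(x))

# Quick shape check
if __name__ == "__main__":
    m = DataDependentLongConv(d_model=64, max_len=1024)
    out = m(torch.randn(2, 128, 64))
    print(out.shape)  # torch.Size([2, 128, 64])
```

The FFT route keeps the cost at O(L log L) rather than the O(L^2) of self-attention, which is the usual motivation for long-range convolutions; the sigmoid gate is just one simple way to make the otherwise static kernel respond to the data.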
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6816