Keywords: Mamba; Self-supervised learning; Time Series Prediction
TL;DR: We propose a Repetitive Contrastive Learning method that enhances the sequence selection capability of the Mamba SSM, yielding improvements across multiple Mamba-based models on time series forecasting tasks.
Abstract: Long-sequence prediction has long been a challenge in time series forecasting. Owing to Mamba's sequence selection capability, many Mamba-based models have been proposed and have achieved state-of-the-art results on long-sequence prediction problems. However, most research has focused on integrating the Mamba SSM into specific model architectures for better performance, while its core strength, the sequence selection capability, remains underexplored. We believe this capability holds significant untapped potential and propose a Repetitive Contrastive Learning (RCL) method to enhance it. Specifically, we use Repeating Sequence Augmentation to expand a sequence while injecting Gaussian noise, and we strengthen the Mamba block's sequence selection capability through both inter-sequence and intra-sequence contrastive objectives. The pretrained parameters of a single Mamba block are then transferred directly to a variety of Mamba-based models, providing a superior initialization for forecasting tasks. Our experiments consistently show that this technique improves the forecasting performance of many Mamba-based models without imposing additional memory requirements.
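A minimal sketch of what the Repeating Sequence Augmentation step might look like, based only on the description in the abstract. The function name, the `num_repeats` and `noise_std` parameters, and the choice to leave the original segment noise-free are assumptions for illustration, not details taken from the paper:

```python
import torch

def repeating_sequence_augmentation(x: torch.Tensor,
                                    num_repeats: int = 2,
                                    noise_std: float = 0.1) -> torch.Tensor:
    """Repeat a time series along the temporal axis and perturb the
    repeated copies with Gaussian noise.

    x: tensor of shape (batch, seq_len, features).
    Returns a tensor of shape (batch, num_repeats * seq_len, features).
    """
    # Tile the sequence num_repeats times along the time axis.
    repeated = x.repeat(1, num_repeats, 1)
    # Add zero-mean Gaussian noise so the repetitions are near-duplicates
    # rather than exact copies. Keeping the first (original) segment clean
    # is an assumption; the abstract does not specify this.
    noise = torch.randn_like(repeated) * noise_std
    noise[:, : x.shape[1], :] = 0.0
    return repeated + noise
```

Under this reading, the clean and noise-perturbed repetitions would serve as the paired views consumed by the inter-sequence and intra-sequence contrastive objectives described above.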
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9847