Keywords: ML, Time-Series/Data Streams, Financial Data
TL;DR: We study negative transfer and architecture choices for training transformer-based models on financial time-series data.
Abstract: Time-series data is a vital modality in data science, particularly in financial applications, where it helps in detecting patterns, understanding market behavior, and making informed financial decisions based on historical data. Recent advances in language modeling have led to the rise of time-series pre-trained models that are trained on vast collections of datasets and applied to diverse tasks across financial domains. However, in financial applications, existing time-series pre-trained models have not shown a promising performance boost over simple finance benchmarks in either zero-shot or fine-tuning settings. This occurs because of i) a lack of financial data in the pre-training stage, and ii) the negative transfer effect arising from inherently different time-series patterns across domains. Furthermore, time-series data is continuous and noisy, and can be collected at varying frequencies and with different lags across variables, making it more challenging to model than language. To address these problems, we introduce a Pre-trained MoDEL for FINance TimE-series (Delphyne). Delphyne achieves performance competitive with existing foundation and full-shot models after only a few fine-tuning steps on publicly available datasets, and also shows superior performance on various financial tasks.
Submission Number: 45