TS-BERT: A Fusion Model for Pre-training Time Series-Text Representations

Published: 28 Jan 2022, Last Modified: 13 Feb 2023 · ICLR 2022 Submitted
Keywords: Time Series-Text Representations, Pre-training, Multimodal
Abstract: Many tasks use news text and stock data to predict financial crises. In existing research, the two usually play master-and-follower roles in the prediction task: one of news text or stock data serves as the primary information source, and the other as an auxiliary source. This paper proposes a fusion model for pre-training time series-text representations, in which news text and stock data have equal status and are treated as two different modalities describing a crisis. Our model achieves the best results on the financial crisis prediction task.
One-sentence Summary: We apply multimodal learning to financial crisis prediction, creatively treating text and time-series data as different modalities of a financial crisis.
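The abstract's core idea of giving the two modalities equal status can be illustrated with a minimal fusion sketch. This is not the paper's architecture: the dimensions, random "encoders" (stand-ins for a BERT text encoder and a time-series encoder), and the `fuse_and_predict` helper are all hypothetical, chosen only to show symmetric late fusion by concatenation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- not taken from the paper.
TEXT_DIM, TS_DIM, FUSED_DIM = 8, 4, 6

# Toy "encoders": random projections standing in for a BERT text encoder
# and a time-series encoder. Each modality is projected into the same
# shared space, so neither one is primary and neither is auxiliary.
W_text = rng.normal(size=(TEXT_DIM, FUSED_DIM))
W_ts = rng.normal(size=(TS_DIM, FUSED_DIM))
W_out = rng.normal(size=(2 * FUSED_DIM,))

def fuse_and_predict(text_emb, ts_emb):
    """Project each modality to the shared space, concatenate, and score."""
    h_text = text_emb @ W_text          # text branch
    h_ts = ts_emb @ W_ts                # time-series branch
    fused = np.concatenate([h_text, h_ts])  # symmetric fusion
    logit = fused @ W_out
    return 1.0 / (1.0 + np.exp(-logit))     # crisis probability in (0, 1)

p = fuse_and_predict(rng.normal(size=TEXT_DIM), rng.normal(size=TS_DIM))
print(p)
```

In a real system the random projections would be replaced by learned encoders trained jointly, but the symmetric concatenation step is the structural point the abstract makes.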
