OCET: One-Dimensional Convolution Embedding Transformer for Stock Trend Prediction

Published: 01 Jan 2022, Last Modified: 11 Feb 2025 · BIC-TA 2022 · CC BY-SA 4.0
Abstract: Owing to the strong data-fitting ability of deep learning, its use in quantitative trading has grown rapidly in recent years. As a classical problem in quantitative trading, Stock Trend Prediction (STP) predicts the future movement of a stock's price from historical price information in order to better guide trading. Recent deep learning work has made great progress on STP by effectively capturing long-term temporal information. However, since stock data are real time-series data, short-term temporal information is also very important: trading is high-frequency and prices fluctuate violently. Moreover, despite the popularity of the Transformer, STP still lacks an effective combination of feature extraction with the Transformer. To make better use of short-term information, we propose One-dimensional Convolution Embedding (OCE). Simultaneously, we introduce effective feature extraction with a Transformer into the STP problem to extract features and capture long-term temporal information. By combining OCE and the Transformer organically, we propose a novel STP model, the One-dimensional Convolution Embedding Transformer (OCET), which captures both long-term and short-term time-series information. OCET achieves a highest accuracy of 0.927 on the public benchmark FI-2010, with inference twice as fast as SOTA models, and a highest accuracy of 0.426 on HKGSAS-2020. Empirical results on these two datasets show that OCET significantly outperforms other algorithms on STP tasks. Code is available at https://github.com/langgege-cqu/OCET.
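The abstract's core idea can be illustrated with a minimal sketch (not the authors' code; see the linked repository for the real implementation): a 1-D convolution over the time axis builds local embeddings that capture short-term price movements, and a Transformer encoder then models long-term dependencies across the whole window. All layer sizes, the kernel width, and the three-class up/flat/down head are illustrative assumptions.

```python
# Hypothetical sketch of the OCE + Transformer idea in PyTorch.
# A Conv1d over time produces short-term-aware embeddings; a Transformer
# encoder relates them across the full window for long-term context.
import torch
import torch.nn as nn

class OCETSketch(nn.Module):
    def __init__(self, n_features=40, d_model=64, n_heads=4,
                 n_layers=2, n_classes=3):
        super().__init__()
        # Conv1d mixes each time step with its neighbours (kernel_size=3),
        # capturing short-term fluctuations before attention is applied.
        self.embed = nn.Conv1d(n_features, d_model, kernel_size=3, padding=1)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)  # e.g. up / flat / down

    def forward(self, x):                    # x: (batch, time, features)
        z = self.embed(x.transpose(1, 2))    # -> (batch, d_model, time)
        z = self.encoder(z.transpose(1, 2))  # -> (batch, time, d_model)
        return self.head(z[:, -1])           # classify from last time step

# 8 windows of 100 ticks with 40 limit-order-book features (FI-2010 style).
logits = OCETSketch()(torch.randn(8, 100, 40))
print(logits.shape)  # (8, 3)
```

The key design point the abstract argues for is the embedding stage: replacing a plain linear patch embedding with a convolution gives each token a receptive field over neighbouring ticks, so the Transformer attends over short-term-aware features rather than raw price snapshots.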