Abstract: Analyzing trends in the stock market has long attracted considerable attention, and a large number of machine learning techniques have been applied to stock analysis and prediction. Traditional time-series prediction models, including RNN, LSTM, and their variants, suffer from vanishing gradients and low efficiency in long-horizon prediction. This paper proposes a long short-term memory (LSTM) network architecture based on encoder-decoder stacks and the self-attention mechanism, which replaces the feature extraction component of the traditional LSTM with self-attention and provides interpretable insights into temporal dynamics. Simulation experiments compare the stock prediction performance of RNN, Bi-LSTM, and the proposed Encoder-Decoder-Attention-LSTM model. The experiments show that the prediction accuracy of this model improves by an order of magnitude over traditional LSTM-style models and that high accuracy is achieved even with a small number of training epochs.
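To make the architectural idea concrete, the following is a minimal sketch (in PyTorch, an assumption; the class name AttentionLSTMForecaster, layer sizes, and window length are illustrative, not the authors' exact configuration) of an encoder-decoder forecaster in which a self-attention encoder performs feature extraction and an LSTM decoder produces the next-step prediction, as the abstract describes.

```python
import torch
import torch.nn as nn

class AttentionLSTMForecaster(nn.Module):
    def __init__(self, n_features=1, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        # Encoder stack: self-attention replaces recurrent feature extraction.
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        # Decoder: an LSTM summarizes the attended sequence for prediction.
        self.decoder = nn.LSTM(d_model, d_model, batch_first=True)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):
        # x: (batch, seq_len, n_features), e.g. windows of closing prices
        h = self.input_proj(x)
        h = self.encoder(h)              # self-attention feature extraction
        out, _ = self.decoder(h)         # LSTM over attended features
        return self.head(out[:, -1, :])  # next-step price prediction

# Usage: predict the next closing price from a 30-day window.
model = AttentionLSTMForecaster(n_features=1)
window = torch.randn(8, 30, 1)           # (batch, time steps, features)
pred = model(window)                     # shape (8, 1)
```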