Highlights

• This paper improves the GRU by replacing the reset gate with an attention mechanism.
• We propose a multi-head cross-attention mechanism to learn and integrate market latent states.
• Empirical studies show that the method outperforms recent state-of-the-art approaches.
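To make the highlighted architecture concrete, here is a minimal sketch (not the authors' code) of a GRU-style cell in which the reset-gate term r_t ⊙ h_{t-1} is replaced by multi-head cross-attention between the previous hidden state (query) and a bank of learned market latent state vectors (keys/values). The class name, the use of a shared learnable latent bank, and all dimensions are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn

class AttentionGRUCell(nn.Module):
    """Hypothetical GRU variant: reset gate replaced by cross-attention."""
    def __init__(self, input_size: int, hidden_size: int,
                 num_latents: int = 8, num_heads: int = 4):
        super().__init__()
        # Update gate z_t, as in a standard GRU.
        self.z_gate = nn.Linear(input_size + hidden_size, hidden_size)
        # Candidate-state projection (the reset gate is removed).
        self.h_cand = nn.Linear(input_size + hidden_size, hidden_size)
        # Learned "market latent states" shared across time steps (assumption).
        self.latents = nn.Parameter(torch.randn(num_latents, hidden_size))
        # Multi-head cross-attention: query = previous hidden state,
        # keys/values = market latent states.
        self.attn = nn.MultiheadAttention(hidden_size, num_heads,
                                          batch_first=True)

    def forward(self, x: torch.Tensor, h_prev: torch.Tensor) -> torch.Tensor:
        # x: (batch, input_size); h_prev: (batch, hidden_size)
        z = torch.sigmoid(self.z_gate(torch.cat([x, h_prev], dim=-1)))
        # Cross-attention stands in for the reset gate's selective read
        # of the previous hidden state.
        q = h_prev.unsqueeze(1)                       # (batch, 1, hidden)
        kv = self.latents.unsqueeze(0).expand(x.size(0), -1, -1)
        h_attn, _ = self.attn(q, kv, kv)              # (batch, 1, hidden)
        h_attn = h_attn.squeeze(1)
        h_tilde = torch.tanh(self.h_cand(torch.cat([x, h_attn], dim=-1)))
        # Standard GRU convex combination via the update gate.
        return (1.0 - z) * h_prev + z * h_tilde

# Usage sketch: one recurrent step over a batch of 32 samples, 16 features.
cell = AttentionGRUCell(input_size=16, hidden_size=64)
h = torch.zeros(32, 64)
h = cell(torch.randn(32, 16), h)

This reading combines the two highlights (attention in place of the reset gate, plus cross-attention over market latent states); the paper itself should be consulted for the exact gating and attention placement.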