Recurrent Model for Sequential Reasoning

18 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Recurrent Model, Sequential Reasoning, Test-Time Scaling
Abstract: We propose a recurrent architecture designed to extend test-time scaling capabilities to sequential input streams. By interleaving fast, iterative reasoning loops between slow observation updates, our method facilitates dynamic compression of latent representations, where internal states self-organize into stable clusters that persist and evolve alongside the input. This mechanism allows the model to maintain coherent representations over long horizons, significantly improving out-of-distribution generalization in reinforcement learning and algorithmic tasks compared to standard sequential baselines such as LSTMs, state space models, and Transformer variants. Code: https://anonymous.4open.science/r/fastslow-81DB
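The fast/slow interleaving described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked repository for that); it is a hypothetical toy in NumPy, assuming a single slow update per observation followed by `n_inner` fast refinement iterations on the latent state. All names (`fast_slow_step`, `W_slow`, `W_fast`, `n_inner`) are illustrative:

```python
import numpy as np

def fast_slow_step(h, x, W_slow, W_fast, n_inner=4):
    """One slow observation update followed by n_inner fast reasoning
    iterations that refine the latent state h without new input."""
    # Slow update: fold the new observation x into the latent state.
    h = np.tanh(W_slow @ np.concatenate([h, x]))
    # Fast loop: iterate on the latent state alone. Test-time scaling
    # would correspond to raising n_inner to spend more compute per step.
    for _ in range(n_inner):
        h = np.tanh(W_fast @ h)
    return h

rng = np.random.default_rng(0)
d_h, d_x = 8, 3
W_slow = rng.normal(scale=0.5, size=(d_h, d_h + d_x))
W_fast = rng.normal(scale=0.5, size=(d_h, d_h))

h = np.zeros(d_h)
for x in rng.normal(size=(5, d_x)):  # a short stream of 5 observations
    h = fast_slow_step(h, x, W_slow, W_fast)
print(h.shape)  # (8,)
```

Under this reading, the fast loop gives the state time to settle toward an attractor between observations, which is one plausible mechanism for the stable latent clusters the abstract describes.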
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 11517