TradeFM: A Generative Foundation Model for Trade-flow and Market Microstructure

19 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Foundation Model, Generative Model, Market Microstructure, Reinforcement Learning, Algorithmic Trading, Trade-flow, Financial Time Series, Multi-Agent Simulation, Transformer, TradeFM
TL;DR: We introduce TradeFM, a generative Transformer-based foundation model that learns the "language" of financial markets from billions of trades and simulates realistic order-level behavior across assets and liquidity regimes for downstream tasks.
Abstract: Learning generalizable representations from the high-frequency, heterogeneous event streams of financial markets is a significant challenge. We introduce TradeFM, a foundation model that learns the universal dynamics of market microstructure. Pre-trained on billions of equity transactions, TradeFM uses a novel scale-invariant feature representation and a universal tokenization scheme to form a unified representation, enabling generalization without asset-specific calibration. We validate the quality of the learned representations by demonstrating that model-generated rollouts in a closed-loop simulator reproduce canonical stylized facts of financial returns. We evaluate the model's ability to generalize to temporally and geographically out-of-sample data, as well as its ability to match real distributions of quantities such as log returns and spreads. TradeFM provides a high-fidelity engine for synthetic data generation and downstream agent-based modeling.
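The abstract's validation criterion is that simulated rollouts reproduce canonical stylized facts of financial returns. As a minimal illustration (not the paper's evaluation code), the sketch below checks two of the best-known stylized facts on a return series: heavy tails (positive excess kurtosis) and volatility clustering (positive autocorrelation of absolute returns). The `stylized_facts` helper and the GARCH-like toy series are hypothetical constructions for demonstration only.

```python
import numpy as np

def stylized_facts(log_returns: np.ndarray, lag: int = 1) -> dict:
    """Check two canonical stylized facts of financial returns:
    heavy tails (excess kurtosis > 0) and volatility clustering
    (positive autocorrelation of absolute returns at a short lag)."""
    r = np.asarray(log_returns, dtype=float)
    r = r - r.mean()
    excess_kurtosis = (r**4).mean() / (r**2).mean() ** 2 - 3.0
    a = np.abs(r)
    a = a - a.mean()
    acf_abs = (a[:-lag] * a[lag:]).mean() / (a**2).mean()
    return {"excess_kurtosis": excess_kurtosis, "acf_abs": acf_abs}

# Toy series with GARCH(1,1)-style conditional variance, which is known
# to exhibit both effects; i.i.d. Gaussian returns would show neither.
rng = np.random.default_rng(0)
n = 100_000
z = rng.standard_normal(n)
r = np.empty(n)
sigma2 = np.empty(n)
sigma2[0] = 1.0
r[0] = np.sqrt(sigma2[0]) * z[0]
for t in range(1, n):
    sigma2[t] = 0.05 + 0.10 * r[t - 1] ** 2 + 0.85 * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * z[t]

facts = stylized_facts(r[1_000:])  # drop burn-in
print(facts)
```

A simulator evaluation of this kind would compute the same statistics on model-generated and real return series and compare them; both metrics should be materially positive for realistic rollouts.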
Supplementary Material: pdf
Primary Area: foundation or frontier models, including LLMs
Submission Number: 21891