TradeFM: A Generative Foundation Model for Trade-flow and Market Microstructure

ICLR 2026 Conference Submission 21891 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Foundation Model, Generative Model, Market Microstructure, Reinforcement Learning, Algorithmic Trading, Trade-flow, Financial Time Series, Multi-Agent Simulation, Transformer, TradeFM
TL;DR: We introduce TradeFM, a generative Transformer-based foundation model that learns the "language" of financial markets from billions of trades and simulates realistic order-level behavior across assets and liquidity regimes for downstream tasks.
Abstract: Learning generalizable representations from the high-frequency, heterogeneous event streams of financial markets is a significant challenge. We introduce TradeFM, a foundation model that learns the universal dynamics of market microstructure. Pre-trained on billions of equity transactions, TradeFM combines a novel scale-invariant feature representation with a universal tokenization scheme, yielding a unified representation that generalizes across assets without asset-specific calibration. We validate the quality of the learned representations by demonstrating that model-generated rollouts in a closed-loop simulator reproduce canonical stylized facts of financial returns. TradeFM provides a high-fidelity engine for synthetic data generation and downstream agent-based modeling.
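To make the abstract's "scale-invariant feature representation" and "universal tokenization scheme" concrete, here is a minimal, hypothetical sketch (not the authors' actual method) of one common way such a pipeline can work: convert raw prices to log-returns, normalize by a rolling volatility estimate so the signal is scale-free across assets, and quantize into a small shared vocabulary of discrete tokens suitable for a Transformer. The function name `tokenize_trades` and all parameters are illustrative assumptions.

```python
import numpy as np

def tokenize_trades(prices, window=50, n_bins=16, clip=4.0):
    """Map a raw price series to scale-invariant integer tokens in [0, n_bins).

    Hypothetical sketch: (1) log-returns remove the price level, (2) dividing
    by rolling volatility removes the asset's own scale, (3) uniform bins over
    the clipped z-scores give a shared discrete vocabulary across assets.
    """
    prices = np.asarray(prices, dtype=float)
    log_ret = np.diff(np.log(prices))
    # Rolling volatility from past returns only; a small floor avoids division by zero.
    vol = np.array([
        log_ret[max(0, i - window):i].std() if i > 1 else 1.0
        for i in range(len(log_ret))
    ])
    vol = np.maximum(vol, 1e-8)
    z = np.clip(log_ret / vol, -clip, clip)       # scale-free normalized returns
    edges = np.linspace(-clip, clip, n_bins - 1)  # n_bins - 1 edges -> n_bins tokens
    return np.digitize(z, edges)                  # integer tokens in [0, n_bins)

# Usage: two assets at very different price levels map into the same vocabulary.
rng = np.random.default_rng(0)
cheap = 5 * np.exp(np.cumsum(0.02 * rng.standard_normal(500)))
pricey = 900 * np.exp(np.cumsum(0.005 * rng.standard_normal(500)))
tok_a, tok_b = tokenize_trades(cheap), tokenize_trades(pricey)
print(tok_a[:10], tok_b[:10])
```

Because both series are volatility-normalized before quantization, the resulting token distributions are comparable even though the underlying price scales differ by orders of magnitude, which is the property that lets a single model train across heterogeneous assets.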
Supplementary Material: pdf
Primary Area: foundation or frontier models, including LLMs
Submission Number: 21891