LASS-ODE: When Large Foundation Models Meet Small Unified ODE Representations

ICLR 2026 Conference Submission 22183 Authors

20 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Foundation model, continuous-time ODE data, interpolation, extrapolation, few-shot inference, generalization
TL;DR: We introduce LASS-ODE, a foundation model for continuous-time dynamical systems built on tokenized linear ODE representations and a hybrid attention mechanism with a dynamic hub for inter-system knowledge sharing.
Abstract: Foundation models have transformed language and vision through large-scale attention over discrete tokens, yet progress on continuous-time dynamical signals remains limited. A core challenge is the absence of a natural token-based representation for ODE trajectories, which evolve continuously, span multiple temporal resolutions, and are often only partially observed. We introduce the Tokenized ODE Representation (TOR), which maps trajectories into latent tokens governed by local, linear Neural ODEs, exploiting their linearity for efficient scaling. To capture both temporal context and shared structure across systems, we design a hybrid attention architecture that alternates between intra-system self-attention, which models dependencies within each trajectory, and inter-system cross-attention, supported by a Dynamic ODE Hub (DOH) that serves as a shared repository for inter-system knowledge exchange. Together, these components form LASS-ODE (LArge-Scale Small ODE), a foundation model with strong capacity for interpolation, extrapolation, probabilistic inference, and few-shot generalization across diverse ODE systems.
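To make the two ingredients named in the abstract concrete, here is a minimal PyTorch sketch, not the authors' implementation: (1) a linear latent ODE token whose state evolves as dz/dt = Az, so the closed form z(t) = e^{At} z0 replaces a numerical solver at arbitrary query times, and (2) one hybrid attention block that alternates intra-trajectory self-attention with cross-attention into a shared, learned hub standing in for the Dynamic ODE Hub. All module names, dimensions, and the hub size are illustrative assumptions.

```python
import torch
import torch.nn as nn


class LinearODEToken(nn.Module):
    """Encodes a local trajectory segment as a latent state z0 governed by
    a linear ODE dz/dt = A z; linearity gives z(t) = e^{At} z0 in closed
    form, so interpolation/extrapolation times need no ODE solver."""

    def __init__(self, obs_dim: int, latent_dim: int):
        super().__init__()
        self.encode = nn.Linear(obs_dim, latent_dim)   # observation -> z0
        self.A = nn.Parameter(0.01 * torch.randn(latent_dim, latent_dim))
        self.decode = nn.Linear(latent_dim, obs_dim)   # z(t) -> observation

    def forward(self, x0: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # x0: (batch, obs_dim); t: (num_times,) offsets from the segment start.
        z0 = self.encode(x0)                                  # (B, D)
        expAt = torch.matrix_exp(t[:, None, None] * self.A)   # (T, D, D)
        zt = torch.einsum("tij,bj->bti", expAt, z0)           # (B, T, D)
        return self.decode(zt)                                # (B, T, obs_dim)


class HybridAttentionBlock(nn.Module):
    """One block alternating intra-system self-attention over a trajectory's
    tokens with cross-attention into a shared learned hub (a stand-in for
    the paper's Dynamic ODE Hub)."""

    def __init__(self, dim: int, heads: int = 4, hub_size: int = 32):
        super().__init__()
        self.hub = nn.Parameter(torch.randn(hub_size, dim))  # shared repository
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, num_tokens, dim), one system's tokenized trajectory.
        h = self.norm1(tokens)
        tokens = tokens + self.self_attn(h, h, h, need_weights=False)[0]
        hub = self.hub.unsqueeze(0).expand(tokens.size(0), -1, -1)
        h = self.norm2(tokens)
        tokens = tokens + self.cross_attn(h, hub, hub, need_weights=False)[0]
        return tokens


if __name__ == "__main__":
    tok = LinearODEToken(obs_dim=3, latent_dim=8)
    x0 = torch.randn(2, 3)                    # two trajectory segments
    t = torch.linspace(0.0, 1.0, 5)           # query times
    print(tok(x0, t).shape)                   # torch.Size([2, 5, 3])

    block = HybridAttentionBlock(dim=8)
    print(block(torch.randn(2, 16, 8)).shape) # torch.Size([2, 16, 8])
```

The design point the sketch illustrates: because the latent dynamics are linear, evaluating a token at any time reduces to a batched matrix exponential, which is what makes the representation cheap to scale, and the hub tokens are the only pathway through which different systems exchange information.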
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 22183