DynaDojo: An Extensible Benchmarking Platform for Scalable Dynamical System Identification

Published: 26 Sept 2023 · Last Modified: 18 Jan 2024 · NeurIPS 2023 Datasets and Benchmarks Poster
Keywords: system identification, benchmarking, scaling
TL;DR: We unveil DynaDojo, an extensible benchmarking platform for advancing system identification by evaluating any learning algorithm's sample efficiency, scalability, and generalizability across diverse dynamical systems.
Abstract: Modeling complex dynamical systems poses significant challenges, with traditional methods struggling to work across a variety of systems and to scale to high-dimensional dynamics. In response, we present DynaDojo, a novel benchmarking platform designed for data-driven dynamical system identification. DynaDojo enables comprehensive evaluation of how an algorithm's performance scales across three key dimensions: (1) the number of training samples provided, (2) the complexity of the dynamical system being modeled, and (3) the number of training samples required to achieve a target error threshold. Furthermore, DynaDojo supports studying out-of-distribution generalization (by providing multiple test conditions for each system) and active learning (by supporting closed-loop control). Through its user-friendly and easily extensible API, DynaDojo accommodates a wide range of user-defined $\texttt{Algorithms}$, $\texttt{Systems}$, and $\texttt{Challenges}$ (scaling metrics). The platform also prioritizes resource-efficient training when running on a cluster. To showcase its utility, DynaDojo $\texttt{0.9}$ includes implementations of 7 baseline algorithms and 20 dynamical systems, along with many demo notebooks. This work aspires to make DynaDojo a unifying benchmarking platform for system identification, paralleling the role of OpenAI's Gym in reinforcement learning.
Dataset URL: https://dynadojo.github.io/dynadojo/
Submission Number: 568
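
To make the abstract's description of the API concrete, below is a minimal sketch of how one of the $\texttt{Challenges}$ might be run against a baseline $\texttt{Algorithm}$ on a chosen $\texttt{System}$. All module paths, class names, and parameters here are assumptions inferred from the abstract's vocabulary, not DynaDojo's documented interface; consult the dataset URL above for the actual API.

```python
# Illustrative sketch only: every import, class, and parameter below is an
# assumption based on the abstract's description (Algorithms, Systems, and
# Challenges as scaling metrics), not DynaDojo's documented interface.
from dynadojo.challenges import FixedError          # assumed module path
from dynadojo.systems.lds import LDSystem           # assumed baseline system
from dynadojo.baselines.lr import LinearRegression  # assumed baseline algorithm

# A Challenge measures how one scaling quantity responds to another. This
# hypothetical FixedError challenge would report how many training samples
# an algorithm needs to reach a target error as system complexity grows,
# i.e., scaling dimension (3) from the abstract.
challenge = FixedError(
    L=[2, 4, 8, 16],      # system complexities (latent dimensions) to sweep
    target_error=1e-2,    # error threshold each run must reach
    system_cls=LDSystem,  # which dynamical system to identify
    reps=3,               # repetitions per complexity, for error bars
)
results = challenge.evaluate(algo_cls=LinearRegression)
challenge.plot(results)   # sample-efficiency curve vs. complexity
```

Under this reading of the API, swapping in a user-defined algorithm or system would only require passing a different `algo_cls` or `system_cls`, which is consistent with the extensibility the abstract claims.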