Keywords: mathematics, Lyapunov, transformers, control, AI for science, AI for maths, reasoning
TL;DR: Transformers trained on synthetic data can discover Lyapunov functions, addressing a long-standing open problem in mathematics
Abstract: Despite their spectacular progress, language models still struggle with complex reasoning tasks, such as advanced mathematics.
We consider a long-standing open problem in mathematics: discovering a Lyapunov function that ensures the global stability of a dynamical system. This problem has no known general solution, and algorithmic solvers only exist for some small polynomial systems.
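As background (these conditions are standard in control theory and are not restated in the abstract itself): for a system $\dot{x} = f(x)$ with equilibrium $f(0) = 0$, a function $V$ certifies global asymptotic stability of the origin when

\begin{align*}
  &V(0) = 0, \qquad V(x) > 0 \ \text{for } x \neq 0,\\
  &\nabla V(x) \cdot f(x) < 0 \ \text{for } x \neq 0,\\
  &V(x) \to \infty \ \text{as } \|x\| \to \infty \quad \text{(radial unboundedness)}.
\end{align*}

Discovering such a $V$ for a given $f$ is the open problem the paper targets.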
We propose a new method for generating synthetic training samples from random solutions, and show that sequence-to-sequence transformers trained on such datasets outperform algorithmic solvers and humans on polynomial systems and can discover new Lyapunov functions for non-polynomial systems.
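A minimal sketch of the "generate from random solutions" idea, as we read it from the abstract: sample a candidate Lyapunov function first, then construct a system for which it is valid by construction, yielding a (system, Lyapunov function) training pair. This is not the paper's actual pipeline; the function names are hypothetical, and the construction below (a gradient flow) is only the simplest way to guarantee the decrease condition.

```python
# Hedged sketch, assuming sympy is available; `random_pd_poly` and
# `backward_pair` are illustrative names, not from the paper.
import random
import sympy as sp

def random_pd_poly(xs, n_terms=3, max_deg=2):
    """Random positive-definite polynomial: a sum of squares plus a small
    positive-definite quadratic term to rule out non-trivial zeros."""
    V = sum(x**2 for x in xs) / 10  # guarantees positive definiteness
    for _ in range(n_terms):
        monom = sp.prod(random.choice(xs)**random.randint(1, max_deg)
                        for _ in range(2))
        V += (random.randint(1, 3) * monom)**2
    return sp.expand(V)

def backward_pair(n_vars=2):
    """Sample V first, then build a system for which V is a Lyapunov
    function. Taking the gradient flow f = -grad V gives
    dV/dt = -||grad V||^2 <= 0 by construction."""
    xs = sp.symbols(f"x1:{n_vars + 1}")
    V = random_pd_poly(xs)
    f = [-sp.diff(V, x) for x in xs]  # system dx/dt = f(x)
    Vdot = sp.expand(sum(sp.diff(V, x) * fi for x, fi in zip(xs, f)))
    return f, V, Vdot

if __name__ == "__main__":
    f, V, Vdot = backward_pair()
    print("system f :", f)
    print("Lyapunov V:", V)
    print("V-dot    :", Vdot)  # equals -||grad V||^2, hence <= 0
```

In practice a generation procedure would need to produce far more varied systems than gradient flows for the resulting dataset to be useful for training; the sketch only illustrates how sampling the solution first makes each pair verifiable by construction.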
Primary Area: Machine learning for other sciences and fields
Submission Number: 19856