Learning to Compute Gröbner Bases

20 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Transformer; Gröbner bases; Computational algebra
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Solving a polynomial system, or computing an associated Gröbner basis, is a fundamental task in computational algebra. However, it is also notorious for its expensive computational cost: doubly exponential time complexity in the number of variables in the worst case. In this paper, we achieve, for the first time, Gröbner basis computation by training a transformer. The training requires many pairs of a polynomial system and its associated Gröbner basis, which motivates us to address two novel algebraic problems: the random generation of Gröbner bases and their transformation into non-Gröbner polynomial systems, termed the *backward Gröbner problem*. We resolve these problems for zero-dimensional radical ideals, the ideals that appear in various applications. Experiments show that, in the five-variate case, the proposed dataset generation method is five orders of magnitude faster than a naive approach, overcoming a crucial challenge in learning to compute Gröbner bases.
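To make the two directions concrete, here is a minimal sketch using SymPy's `groebner`. It is an illustration only, not the paper's dataset-generation method: the forward direction computes a lex Gröbner basis of a small system, and the "backward" step produces a different generating set of the same ideal by an invertible (unimodular) combination of the basis elements, so the reduced Gröbner basis is recovered unchanged.

```python
# Illustration (not the paper's method): forward Groebner basis computation
# with SymPy, plus a naive backward transformation of the basis.
from sympy import symbols, groebner, expand

x, y = symbols("x y")

# Forward direction: compute a lex Groebner basis of a polynomial system.
F = [x**2 + y**2 - 1, x - y]
G = groebner(F, x, y, order="lex")
print(list(G))  # a basis generating the same ideal as F

# Backward direction (naive sketch): mix the basis elements via an
# invertible polynomial combination; the ideal is unchanged, so the
# reduced Groebner basis of the new system equals G.
g1, g2 = list(G)
F_new = [expand(g1 + y * g2), expand(g2)]
assert list(groebner(F_new, x, y, order="lex")) == list(G)
```

Since the reduced Gröbner basis of an ideal is unique for a fixed monomial order, recovering `G` from `F_new` confirms that the backward transformation preserved the ideal.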
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2532