Incorporating Neural ODEs into DAE-Constrained Optimization Problems

25 Sept 2024 (modified: 27 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Differential Algebraic Equations, Neural Ordinary Differential Equations, Dynamic Optimization, Hybrid Modeling
TL;DR: This paper presents a novel approach to solving DAE-constrained optimization problems by incorporating Neural Ordinary Differential Equations, demonstrating enhanced performance through realistic case studies in various dynamic systems.
Abstract: Differential algebraic equations (DAEs) are pivotal in dynamic optimization across diverse fields, from process control to flight trajectory optimization and epidemiological modeling. Traditional methods such as single shooting, multiple shooting, and direct transcription effectively optimize known mechanistic models. However, significant challenges arise when the underlying equations are unknown or deviate from empirical data. While black-box optimization strategies can address some of these issues, challenges persist regarding data quality, non-linearity, and the inclusion of constraints. Recent advances in machine learning, particularly Neural ODEs, offer promising tools for continuous representation of dynamic systems. This work bridges the gap between machine learning representations of dynamic systems and optimization methodologies, enabling a novel approach for solving DAE-constrained optimization problems with data-driven components. We demonstrate this approach on numerical examples of DAE problems and realistic case studies, including biochemical reactor control and disease spread prevention. Our results highlight the efficacy of incorporating Neural ODEs into equation-based solvers, showing improved performance over existing strategies such as SINDy. Additionally, we formalize the optimization program for NN-embedded DAEs and present representations for common neural network architectures (e.g., ReLU, tanh). This work contributes a framework for dynamic system optimization that integrates machine learning advances with traditional optimization techniques, and it offers practical insights through comprehensive case studies.
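To make the abstract's idea of NN-embedded dynamic optimization concrete, the sketch below embeds a small tanh network as the right-hand side of an ODE and optimizes a bounded control sequence by single shooting with explicit Euler integration. Everything here is an illustrative assumption, not the paper's formulation: the network weights are random stand-ins for a pre-trained model, and the problem sizes, solver choice, and cost function are invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Frozen weights stand in for a pre-trained Neural ODE (random for illustration).
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.3, size=(8, 2))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.3, size=(1, 8))
b2 = np.zeros(1)

def f_theta(x, u):
    """Learned right-hand side dx/dt = f_theta(x, u): one hidden tanh layer."""
    h = np.tanh(W1 @ np.array([x, u]) + b1)
    return (W2 @ h + b2).item()

def rollout(u_seq, x0=0.0, dt=0.1):
    """Integrate the NN dynamics under controls u_seq with explicit Euler."""
    x, traj = x0, [x0]
    for u in u_seq:
        x = x + dt * f_theta(x, u)
        traj.append(x)
    return np.array(traj)

def objective(u_seq, x_target=0.5):
    """Terminal tracking cost plus a small control-effort penalty."""
    traj = rollout(u_seq)
    return (traj[-1] - x_target) ** 2 + 1e-3 * np.sum(u_seq ** 2)

# Single-shooting optimal control: decision variables are the controls only;
# the (learned) dynamics enter the problem through the rollout.
N = 20
res = minimize(objective, x0=np.zeros(N), method="L-BFGS-B",
               bounds=[(-2.0, 2.0)] * N)
print(f"final state: {rollout(res.x)[-1]:.3f}, objective: {res.fun:.4f}")
```

In the paper's setting the network would instead be unrolled into algebraic constraints inside an equation-based solver (e.g., via a collocation or direct-transcription discretization), rather than evaluated inside a black-box rollout as done here for brevity.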
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5234