On the Sharp Input-Output Analysis of Nonlinear Systems under Adversarial Attacks

ICLR 2026 Conference Submission 13064 Authors

18 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Nonlinear System Identification, Input-Output Analysis, Adversarial Attacks
TL;DR: We study the identification of the input-output mapping of general nonlinear dynamical systems under adversarial attacks.
Abstract: This paper is concerned with learning the input-output mapping of general nonlinear dynamical systems. While the existing literature focuses on Gaussian inputs and benign disturbances, we significantly broaden the scope of admissible control inputs and allow correlated, nonzero-mean, adversarial disturbances. By reformulating the mapping as a linear combination of basis functions, we prove that the $\ell_2$-norm estimator overcomes these challenges as long as the probability that the system is under adversarial attack at a given time is smaller than a certain threshold. We provide an estimation error bound that decays with the input memory length and prove its optimality by constructing a problem instance that suffers from the same bound under adversarial attacks. Our work provides a sharp input-output analysis for a generic nonlinear and partially observed system under significantly generalized assumptions compared to existing works.
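The abstract's recipe can be illustrated with a minimal sketch: regress stacked basis-function features against outputs, a fraction of which are adversarially corrupted, and fit with an $\ell_2$-norm (least-absolute-deviation) criterion rather than least squares. This is a hypothetical illustration, not the authors' implementation; the basis features, corruption model, and IRLS solver below are all assumptions made for the example.

```python
# Hypothetical sketch: basis-function regression where a fraction p of
# outputs is adversarially corrupted, comparing ordinary least squares
# with an l2-norm (least-absolute-deviation) estimator fitted by
# iteratively reweighted least squares (IRLS). Not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

n, d = 200, 3                      # samples, number of basis functions
Phi = rng.standard_normal((n, d))  # stacked basis-function features
theta_true = np.array([1.0, -2.0, 0.5])

y = Phi @ theta_true + 0.05 * rng.standard_normal(n)
attack = rng.random(n) < 0.15      # attack occurs with probability p = 0.15
y[attack] += 20.0                  # large, biased (nonzero-mean) corruption

# Ordinary least squares: sensitive to the corrupted samples.
theta_ls, *_ = np.linalg.lstsq(Phi, y, rcond=None)

def lad_irls(Phi, y, iters=100, eps=1e-6):
    """min_theta sum_t |y_t - phi_t^T theta| via a smoothed MM/IRLS scheme."""
    theta = np.linalg.lstsq(Phi, y, rcond=None)[0]
    for _ in range(iters):
        r = y - Phi @ theta
        w = 1.0 / np.sqrt(r**2 + eps**2)   # majorizer weights
        WPhi = w[:, None] * Phi
        theta = np.linalg.solve(Phi.T @ WPhi, Phi.T @ (w * y))
    return theta

theta_lad = lad_irls(Phi, y)
print("LS  error:", np.linalg.norm(theta_ls - theta_true))
print("LAD error:", np.linalg.norm(theta_lad - theta_true))
```

With the attack probability below the estimator's tolerance, the robust fit stays close to `theta_true` while least squares is pulled toward the biased corruptions, mirroring the threshold phenomenon the abstract describes.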
Supplementary Material: zip
Primary Area: learning on time series and dynamical systems
Submission Number: 13064