Verification of Neural Control Barrier Functions with Symbolic Derivative Bounds Propagation

Published: 22 Oct 2024, Last Modified: 06 Nov 2024, CoRL 2024 Workshop SAFE-ROL Poster, CC BY 4.0
Keywords: Learning for control, control barrier function, formal verification
TL;DR: We propose a new neural CBF verification framework that uses symbolic derivative bound propagation to compute tight bounds efficiently when verifying forward invariance.
Abstract: Control barrier functions (CBFs) are important in safety-critical systems and robot control applications. Neural networks have been used to parameterize and synthesize CBFs with bounded control input for complex systems. However, it remains challenging to verify pre-trained neural network CBFs (neural CBFs) in an efficient symbolic manner. To this end, we propose a new efficient verification framework for ReLU-based neural CBFs through symbolic derivative bound propagation, which combines linear bounds on the nonlinear system dynamics with gradient bounds of the neural CBF. Specifically, by expressing the derivatives of the ReLU activations as Heaviside step functions, we show that symbolic bounds can be propagated through the inner product of the neural CBF Jacobian and the nonlinear system dynamics. In extensive experiments on different robot dynamics, our method outperforms interval-arithmetic baselines in both verified rate and verification time along the CBF boundary, validating its effectiveness and efficiency across models of different complexity.
Submission Number: 20
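To illustrate the quantity being verified, below is a minimal sketch (not the paper's method) that bounds the Lie derivative ∇h(x)·f(x) of a small ReLU neural CBF over an input box using plain interval arithmetic, with ReLU derivatives relaxed to the Heaviside interval [0, 1] wherever the pre-activation sign is undetermined. All names (W1, b1, W2, lie_derivative_bounds) and the toy weights are illustrative assumptions; the paper's framework instead propagates symbolic linear bounds, which are tighter than the interval baseline sketched here.

```python
# Minimal sketch, assuming a two-layer ReLU CBF h(x) = W2 @ relu(W1 x + b1) + b2 and
# precomputed interval bounds on the dynamics f(x) over an input box. It bounds the
# Lie derivative dh/dt = grad h(x) . f(x) with plain interval arithmetic; ReLU
# derivatives are relaxed to the Heaviside interval [0, 1] when the pre-activation
# sign is undetermined over the box.
import numpy as np

def interval_matvec(W, x_lo, x_hi):
    """Interval bounds of W @ x for x in the box [x_lo, x_hi] (elementwise)."""
    W_pos, W_neg = np.clip(W, 0, None), np.clip(W, None, 0)
    lo = W_pos @ x_lo + W_neg @ x_hi
    hi = W_pos @ x_hi + W_neg @ x_lo
    return lo, hi

def lie_derivative_bounds(W1, b1, W2, x_lo, x_hi, f_lo, f_hi):
    # Pre-activation bounds z = W1 x + b1 over the box.
    z_lo, z_hi = interval_matvec(W1, x_lo, x_hi)
    z_lo, z_hi = z_lo + b1, z_hi + b1

    # Heaviside relaxation of the ReLU derivative:
    # 0 if z_hi <= 0, 1 if z_lo > 0, otherwise the interval [0, 1].
    d_lo = (z_lo > 0).astype(float)
    d_hi = (z_hi > 0).astype(float)

    # Jacobian of h: grad h(x) = W2 @ diag(relu'(z)) @ W1, so
    # grad[j] = sum_k W2[k] * d_k * W1[k, j] with d_k in [d_lo[k], d_hi[k]].
    contrib = W2.reshape(-1, 1) * W1              # shape (hidden, n_x)
    g_lo = np.where(contrib >= 0, d_lo[:, None], d_hi[:, None]) * contrib
    g_hi = np.where(contrib >= 0, d_hi[:, None], d_lo[:, None]) * contrib
    grad_lo, grad_hi = g_lo.sum(axis=0), g_hi.sum(axis=0)

    # Inner product with the interval-bounded dynamics f(x) in [f_lo, f_hi]:
    # take corner products elementwise, then sum the lower/upper envelopes.
    corners = [grad_lo * f_lo, grad_lo * f_hi, grad_hi * f_lo, grad_hi * f_hi]
    return np.minimum.reduce(corners).sum(), np.maximum.reduce(corners).sum()

# Toy usage with random weights and a 2D box (purely illustrative values).
rng = np.random.default_rng(0)
W1, b1, W2 = rng.normal(size=(8, 2)), rng.normal(size=8), rng.normal(size=8)
lb, ub = lie_derivative_bounds(W1, b1, W2,
                               np.array([-0.1, -0.1]), np.array([0.1, 0.1]),
                               np.array([-0.5, 0.2]), np.array([0.5, 0.8]))
print(f"Lie derivative bounds over the box: [{lb:.3f}, {ub:.3f}]")
```

Verifying forward invariance along the CBF boundary then amounts to checking that the lower bound of this Lie derivative (together with the class-K term on h) stays nonnegative over every box covering the zero level set; tighter symbolic bounds, as in the paper, certify more boxes without further splitting.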