Lipschitz Continuity in Deep Learning: A Systematic Review of Theoretical Foundations, Estimation Methods, Regularization Approaches and Certifiable Robustness

06 Sept 2025 (modified: 15 Apr 2026) · Under review for TMLR · CC BY 4.0
Abstract: Lipschitz continuity is a fundamental property of neural networks that characterizes their sensitivity to input perturbations. It plays a pivotal role in deep learning, governing robustness, generalization, and optimization dynamics. Despite its importance, research on Lipschitz continuity is scattered across domains and lacks a unified perspective. This paper addresses that gap with a systematic review of Lipschitz continuity in deep learning, covering its theoretical foundations, estimation methods, regularization approaches, and certifiable robustness. By reviewing existing research through the lens of Lipschitz continuity, this survey serves as a comprehensive reference for researchers and practitioners seeking a deeper understanding of Lipschitz continuity and its implications in deep learning. Code: https://anonymous.4open.science/r/lipschitz_survey-DECE/
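A function f is L-Lipschitz when ||f(x) - f(y)|| <= L ||x - y|| for all inputs x and y; the smallest such L quantifies the network's worst-case sensitivity to input perturbations. As a minimal illustrative sketch (not taken from the paper's code, and using a hypothetical architecture), the snippet below computes a classic upper bound on the L2 Lipschitz constant of a feed-forward network with 1-Lipschitz activations such as ReLU: the product of the layers' spectral norms.

    import torch
    import torch.nn as nn

    # Hypothetical feed-forward network, used only for illustration.
    net = nn.Sequential(
        nn.Linear(784, 256), nn.ReLU(),
        nn.Linear(256, 64), nn.ReLU(),
        nn.Linear(64, 10),
    )

    # Product of per-layer spectral norms: easy to compute but generally
    # a loose upper bound on the network's L2 Lipschitz constant (valid
    # because ReLU is 1-Lipschitz and composition multiplies the bounds).
    bound = 1.0
    for module in net.modules():
        if isinstance(module, nn.Linear):
            # ord=2 gives the largest singular value (spectral norm).
            bound *= torch.linalg.matrix_norm(module.weight, ord=2).item()

    print(f"Upper bound on the Lipschitz constant: {bound:.3f}")

Tighter but costlier estimators (e.g. optimization-based bounds) exist; surveying such estimation methods is part of the paper's stated scope.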
Submission Length: Long submission (more than 12 pages of main content)
Assigned Action Editor: ~Aurélien_Bellet1
Submission Number: 5829