Differentiable Programming for Piecewise Polynomial Functions

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission
Keywords: Differentiable Programming, piecewise polynomial regression, generative models, segmentation
Abstract: The paradigm of differentiable programming has considerably enhanced the scope of machine learning via the judicious use of gradient-based optimization. However, standard differentiable programming methods (such as autodiff) typically require that the models be differentiable, limiting their applicability. We introduce a new, principled approach that extends gradient-based optimization to piecewise smooth models, such as k-histograms, splines, and segmentation maps. We derive an accurate form of the weak Jacobian of such functions, and show that it exhibits a block-sparse structure that can be computed implicitly and efficiently. We show that using the redesigned Jacobian leads to improved performance in applications such as denoising with piecewise polynomial regression models, data-free generative model training, and image segmentation.
One-sentence Summary: We propose a novel approach to calculate weak Jacobians for piecewise polynomial functions, thus enabling their use in general differentiable programs.
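To make the block-sparse structure concrete, here is a minimal, illustrative sketch (not the paper's implementation) for the simplest degree-0 case, a k-histogram fit: each output is the mean of the inputs in its piece, so the weak Jacobian with respect to the input is block diagonal, with each block equal to (1/|B|) times an all-ones matrix. The Jacobian-vector product can therefore be applied implicitly with segment means, without ever materializing the dense matrix. The function names and the breakpoint convention below are assumptions for illustration.

```python
import numpy as np

def piecewise_constant_fit(x, breakpoints):
    """Degree-0 (k-histogram) fit: replace each contiguous piece by its mean.

    breakpoints is an increasing index list [0, ..., len(x)] delimiting pieces
    (an illustrative convention, not the paper's notation).
    """
    out = np.empty(len(x), dtype=float)
    for lo, hi in zip(breakpoints[:-1], breakpoints[1:]):
        out[lo:hi] = x[lo:hi].mean()
    return out

def weak_jvp(v, breakpoints):
    """Apply the weak Jacobian J of the fit (w.r.t. x) to a vector v.

    J is block diagonal with blocks (1/|B|) * ones(|B|, |B|), so J @ v is
    just a per-piece mean of v -- computed implicitly in O(n) time.
    """
    out = np.empty(len(v), dtype=float)
    for lo, hi in zip(breakpoints[:-1], breakpoints[1:]):
        out[lo:hi] = v[lo:hi].mean()
    return out

# Sanity check against the explicit dense block-sparse Jacobian.
x = np.array([1.0, 2.0, 3.0, 10.0, 20.0])
bp = [0, 3, 5]                      # two pieces: x[0:3] and x[3:5]
J = np.zeros((5, 5))
J[0:3, 0:3] = 1.0 / 3.0             # first block: (1/3) * ones(3, 3)
J[3:5, 3:5] = 1.0 / 2.0             # second block: (1/2) * ones(2, 2)
v = np.array([1.0, -2.0, 4.0, 0.5, 0.5])
print(np.allclose(weak_jvp(v, bp), J @ v))
```

Higher-degree pieces (splines) replace the per-piece mean with a per-piece least-squares projection, but the Jacobian retains the same block structure, which is what makes the implicit computation efficient.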
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=6CGO8SKCCo