Fast Implicit Constrained Optimization of Non-decomposable Objectives for Deep Networks

Published: 20 Oct 2022, Last Modified: 05 May 2023. HITY Workshop, NeurIPS 2022.
Keywords: Constrained Optimization, Last Layer Training, Implicit Function Theorem, Faster Training, Non-Decomposable Objectives
TL;DR: We propose new training procedures for a constrained optimization problem that involves optimizing non-decomposable metrics. Our proposed method achieves performance comparable to the existing approach while being more computationally efficient.
Abstract: We consider a popular family of constrained optimization problems in machine learning that involve optimizing a non-decomposable objective while constraining another. Unlike the previous approach, which expresses the classifier thresholds as a function of all model parameters, we consider an alternative strategy in which the thresholds are expressed as a function of only a subset of the model parameters, namely the last layer of the neural network. We propose new training procedures that optimize the bottom layers and the last layer separately, and solve each subproblem using standard gradient-based methods. Experiments on a benchmark dataset demonstrate that our proposed method achieves performance comparable to the existing approach while being more computationally efficient.
Supplementary Material: zip
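
To make the idea concrete, below is a minimal PyTorch sketch of the setup the abstract describes, under simplifying assumptions: a recall-at-fixed-false-positive-rate problem, a frozen backbone whose features are precomputed once (possible precisely because only the last layer is trained), a threshold realized as the (1 - alpha)-quantile of negative-class scores (so it is an implicit, differentiable function of the last-layer weights), and a sigmoid surrogate for the indicator. All names (backbone, alpha, temperature) and the toy data are illustrative assumptions, not the authors' actual algorithm or code.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# --- Toy binary-classification data and a frozen "bottom" network --------
n, d, h = 2000, 20, 32
X = torch.randn(n, d)
y = (X[:, 0] + 0.5 * torch.randn(n) > 0).float()

backbone = nn.Sequential(nn.Linear(d, h), nn.ReLU())  # bottom layers
for p in backbone.parameters():
    p.requires_grad_(False)                            # frozen

last = nn.Linear(h, 1)                                 # only trainable part
opt = torch.optim.Adam(last.parameters(), lr=1e-2)

alpha = 0.05        # allowed false-positive rate (the rate constraint)
temperature = 0.1   # sharpness of the sigmoid surrogate

with torch.no_grad():
    Z = backbone(X)  # features computed once: the backbone never changes

for step in range(200):
    scores = last(Z).squeeze(-1)

    # Implicit threshold t(w): chosen so that FPR(w, t) ~= alpha.
    # torch.quantile is differentiable, so gradients flow through t
    # into the last-layer weights.
    neg_scores = scores[y == 0]
    t = torch.quantile(neg_scores, 1.0 - alpha)

    # Smooth surrogate of the true-positive rate at threshold t.
    pos_scores = scores[y == 1]
    soft_tpr = torch.sigmoid((pos_scores - t) / temperature).mean()

    loss = -soft_tpr  # maximize recall subject to the FPR constraint
    opt.zero_grad()
    loss.backward()
    opt.step()

# --- Evaluate the hard (non-smoothed) metrics at the implied threshold ---
with torch.no_grad():
    scores = last(Z).squeeze(-1)
    t = torch.quantile(scores[y == 0], 1.0 - alpha)
    tpr = (scores[y == 1] > t).float().mean()
    fpr = (scores[y == 0] > t).float().mean()
    print(f"TPR at FPR<={alpha:.2f}: TPR={tpr:.3f}, FPR={fpr:.3f}")
```

The design point the sketch illustrates is the one in the abstract: because the threshold depends only on the last-layer weights, the expensive implicit-gradient computation never touches the backbone, and the backbone features can be cached, which is where the computational savings come from.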