Inverse classification with logistic and softmax classifiers: efficient optimization

Published: 18 Mar 2026, Last Modified: 18 Mar 2026. Accepted by TMLR. License: CC BY 4.0
Abstract: In recent years, a certain type of problem has become of interest where one wants to query a trained classifier: given an input instance, find the closest instance for which the classifier's predicted label changes in a desired way. Examples of these "inverse classification" problems are counterfactual explanations, adversarial examples and model inversion. All of them are fundamentally optimization problems over the input instance vector involving a fixed classifier, and a fast solution is desirable for interactive or real-time applications. We focus on solving this problem efficiently with the squared Euclidean distance for two of the most widely used classifiers: logistic regression and the softmax classifier. Owing to special properties of these models, we show that the optimization can be solved in closed form for logistic regression, and iteratively but extremely fast for the softmax classifier. This allows us to solve either case exactly (to nearly machine precision) in a runtime of milliseconds to around a second, even for very high-dimensional instances and many classes.
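To illustrate the logistic-regression case: since the predicted probability of a logistic classifier depends on the input only through the linear score w·x + b, the closest instance (in squared Euclidean distance) reaching a target probability is the orthogonal projection of the input onto a hyperplane. The sketch below assumes this standard projection formula; it is an illustration of the closed-form idea, not the authors' code, and the names and values are hypothetical.

```python
import numpy as np

def closest_flip(x, w, b, target_prob=0.5):
    """Closest point to x (squared Euclidean distance) whose logistic
    prediction sigmoid(w.x + b) equals target_prob: the orthogonal
    projection of x onto the hyperplane {z : w.z + b = logit(target_prob)}.
    """
    c = np.log(target_prob / (1.0 - target_prob))  # logit of the target probability
    offset = (w @ x + b - c) / (w @ w)             # signed distance along w, scaled
    return x - offset * w

# Illustrative 2-D example (values are made up):
w = np.array([2.0, -1.0])
b = 0.5
x = np.array([1.0, 1.0])
z = closest_flip(x, w, b)  # z lies exactly on the decision boundary w.z + b = 0
```

Because the solution is a single projection, its cost is linear in the input dimension, which is consistent with the millisecond-scale runtimes reported in the abstract.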
Submission Type: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Thanks for handling our submission. We have addressed your comments in two new paragraphs: paragraph 2 in section 3.1 (Definition of the optimization problem) and paragraph 1 in section 3.6 (Discussion). Note that we have added a new subsection heading in section 3.1, which shifts the numbering of the subsequent subsections of section 3 by one.
Code: https://faculty.ucmerced.edu/mcarreira-perpinan
Assigned Action Editor: ~Dennis_Wei1
Submission Number: 5951