Abstract: Error-Correcting Output Codes (ECOCs) offer a principled approach for combining binary classifiers into multiclass classifiers. In this paper, we study the problem of designing optimal ECOCs to achieve both nominal and adversarial accuracy using Support Vector Machines (SVMs) and binary deep neural networks. We develop a scalable Integer Programming (IP) formulation to design minimal codebooks with desirable error-correcting properties. Our work leverages advances in IP solution techniques to generate codebooks with optimality guarantees. To achieve tractability, we exploit the underlying graph-theoretic structure of the constraint set. In particular, the size of the constraint set can be significantly reduced using edge clique covers. Using this reduction technique along with Plotkin's bound from coding theory, we demonstrate that our approach scales to a large number of classes. The resulting codebooks achieve high nominal accuracy relative to standard codebooks (e.g., one-vs-all, one-vs-one, and dense/sparse codes). Interestingly, our codebooks provide non-trivial robustness to white-box attacks without any adversarial training.
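To make the ECOC setting concrete, the following is a minimal sketch of standard Hamming-distance decoding with a hand-picked illustrative codebook. This is not the paper's IP-designed codebook; the codebook matrix and function names here are hypothetical examples. Each of the K classes is assigned a row (codeword) of an L-column codebook; L binary classifiers each predict one bit, and the class whose codeword is nearest in Hamming distance to the predicted bit vector is returned.

```python
import numpy as np

# Hypothetical 4-class codebook with 6 binary classifiers (rows = classes,
# columns = classifiers), entries in {+1, -1}. Its minimum pairwise Hamming
# distance is 4, so decoding can correct any single classifier error.
codebook = np.array([
    [+1, +1, +1, +1, +1, +1],
    [+1, +1, -1, -1, -1, -1],
    [-1, -1, +1, +1, -1, -1],
    [-1, -1, -1, -1, +1, +1],
])

def ecoc_decode(bits, codebook):
    """Return the class whose codeword is nearest to `bits` in Hamming distance."""
    dists = np.sum(codebook != np.asarray(bits), axis=1)
    return int(np.argmin(dists))

# Classifier 1 errs on an example of class 2 (bit flipped from -1 to +1),
# but class 2's codeword is still the unique nearest one.
pred = [-1, +1, +1, +1, -1, -1]
print(ecoc_decode(pred, codebook))  # → 2
```

A codebook with minimum distance d corrects up to floor((d-1)/2) classifier errors, which is why designing codebooks with large pairwise distances (the role of Plotkin's bound above) matters for robustness.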