Monotonic Kronecker-Factored Lattice

Published: 12 Jan 2021, Last Modified: 05 May 2023, ICLR 2021 Poster, Readers: Everyone
Keywords: Theory, Regularization, Algorithms, Classification, Regression, Matrix and Tensor Factorization, Fairness, Evaluation, Efficiency, Machine Learning
Abstract: It is computationally challenging to learn flexible monotonic functions that guarantee model behavior and provide interpretability beyond a few input features, and at a time when minimizing resource use is increasingly important, such models must also remain efficient to train and evaluate. In this paper we show how to effectively and efficiently learn such functions using Kronecker-Factored Lattice ($\mathrm{KFL}$), an efficient reparameterization of flexible monotonic lattice regression via the Kronecker product. Both computational and storage costs scale linearly in the number of input features, a significant improvement over existing methods that grow exponentially. We also show that we can still properly enforce monotonicity and other shape constraints. The $\mathrm{KFL}$ function class consists of products of piecewise-linear functions, and the size of the function class can be further increased through ensembling. We prove that the function class of an ensemble of $M$ base $\mathrm{KFL}$ models strictly increases as $M$ increases up to a certain threshold; beyond this threshold, every multilinear interpolated lattice function can be expressed. Our experimental results demonstrate that $\mathrm{KFL}$ trains faster with fewer parameters while achieving accuracy and evaluation speeds comparable to or better than the baseline methods and preserving monotonicity guarantees on the learned model.
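To make the factorization concrete, below is a minimal NumPy sketch (not the authors' implementation; the term count T, uniform knot spacing on [0, 1], and all shapes are assumptions, and monotonicity/shape constraints are omitted) showing how a Kronecker-Factored Lattice evaluates as a sum of products of 1-D piecewise-linear functions, so per-example cost grows linearly in the number of features rather than exponentially in the lattice size.

```python
import numpy as np

def piecewise_linear(x, vertex_values):
    """Evaluate a 1-D piecewise-linear function on uniform knots over [0, 1].

    vertex_values: values at the K knots 0, 1/(K-1), ..., 1.
    """
    k = len(vertex_values)
    pos = np.clip(x, 0.0, 1.0) * (k - 1)          # position within the knot grid
    lo = np.minimum(np.floor(pos).astype(int), k - 2)
    frac = pos - lo
    return (1.0 - frac) * vertex_values[lo] + frac * vertex_values[lo + 1]

def kfl_eval(x, params):
    """Evaluate a Kronecker-factored lattice at one input x (length-D vector).

    params has shape (T, D, K): T rank-1 terms, D features, K knots per
    feature. Each term is a product of D 1-D piecewise-linear functions;
    the output is their sum, so evaluation costs O(T * D * K) rather than
    the O(K**D) of a full multilinear-interpolated lattice.
    """
    total = 0.0
    for term in params:                              # T terms
        prod = 1.0
        for d, vertex_values in enumerate(term):     # D features
            prod *= piecewise_linear(x[d], vertex_values)
        total += prod
    return total

# Toy usage: D=3 features, K=4 knots per feature, T=2 terms.
rng = np.random.default_rng(0)
params = rng.uniform(size=(2, 3, 4))
print(kfl_eval(np.array([0.2, 0.7, 0.5]), params))
```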
One-sentence Summary: We show how to effectively and efficiently learn flexible and interpretable monotonic functions using Kronecker-Factored Lattice, an efficient reparameterization of flexible monotonic lattice regression via Kronecker product.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics