Hierarchical Lattice Layer for Partially Monotone Neural Networks

Published: 31 Oct 2022, Last Modified: 28 Dec 2022 · NeurIPS 2022 Accept
Keywords: partially monotone regression, partially monotone neural networks, monotonicity constraints
TL;DR: Construction of a partially monotone neural network layer, which uses a small amount of memory
Abstract: Partially monotone regression is a regression analysis in which the target values are monotonically non-decreasing with respect to a subset of the input features. The TensorFlow Lattice library is one of the standard machine learning libraries for partially monotone regression. It consists of several neural network layers, and its core component is the lattice layer. One problem with the lattice layer is that training it requires a projected gradient descent algorithm with many constraints. Another problem is that it cannot accept a high-dimensional input vector because of its memory consumption. We propose a novel neural network layer, the hierarchical lattice layer (HLL), as an extension of the lattice layer, so that HLL can be trained with a standard stochastic gradient descent algorithm while satisfying the monotonicity constraints, and so that it can accept a high-dimensional input vector. Our experiments demonstrate that HLL does not sacrifice prediction performance on real datasets compared with the lattice layer.
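To make the notion of partial monotonicity concrete, here is a small illustrative sketch (not from the paper): the helper `is_monotone_in_feature` and the toy predictor `predict` are hypothetical names, and the check is an empirical finite-difference test rather than the constraint mechanism used by the lattice layer or HLL.

```python
import numpy as np

def is_monotone_in_feature(f, X, feature, eps=1e-2):
    """Empirically check that f is non-decreasing in one input feature:
    nudging that feature upward must never decrease the prediction."""
    X_up = X.copy()
    X_up[:, feature] += eps
    return bool(np.all(f(X_up) >= f(X)))

# Toy predictor: monotone in feature 0, but not in feature 1.
def predict(X):
    return 2.0 * X[:, 0] + np.sin(3.0 * X[:, 1])

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(1000, 2))

print(is_monotone_in_feature(predict, X, feature=0))  # True
print(is_monotone_in_feature(predict, X, feature=1))  # False
```

A partially monotone model such as the lattice layer or HLL enforces this non-decreasing behavior by construction for the designated subset of features, rather than checking it after the fact as done here.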
Supplementary Material: pdf