Keywords: Discriminability, ReLU networks
TL;DR: We study the discriminability of samples using binarized activation values
Abstract: Binarized ReLU activations are considered as a metric space equipped with the Hamming distance. While it can be shown theoretically that two-layer ReLU networks with random Gaussian weights approximately preserve local metric properties, we experimentally study the discrimination capability in this Hamming space for deeper ReLU networks and also examine the non-local behavior.
It turns out that the discrimination capability is approximately preserved, as expected, but shows small saturation effects that differ from standard metrics based on the full activation information.
We explain these effects by the fact that the binarized activation states induce a tessellation of the input space into polyhedral cells.
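The setting described in the abstract can be sketched in a few lines: binarize the ReLU activation pattern of a random network and compare samples by the Hamming distance between their patterns. This is an illustrative sketch only, not the paper's code; the network size, perturbation scale, and all names are assumptions.

```python
# Illustrative sketch (not the paper's implementation): binarized ReLU
# activations of a random two-layer network, compared via Hamming distance.
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden = 32, 512  # illustrative sizes

# Random Gaussian weights, as in the theoretical setting described above.
W1 = rng.standard_normal((d_hidden, d_in)) / np.sqrt(d_in)
W2 = rng.standard_normal((d_hidden, d_hidden)) / np.sqrt(d_hidden)

def binarized_activations(x):
    """Return the 0/1 ReLU activation pattern of both layers."""
    h1 = np.maximum(W1 @ x, 0.0)
    h2 = np.maximum(W2 @ h1, 0.0)
    return np.concatenate([h1 > 0, h2 > 0]).astype(np.uint8)

def hamming(a, b):
    """Normalized Hamming distance between two binary patterns."""
    return float(np.mean(a != b))

x = rng.standard_normal(d_in)
y = x + 0.05 * rng.standard_normal(d_in)   # small perturbation of x
z = rng.standard_normal(d_in)              # unrelated sample

d_near = hamming(binarized_activations(x), binarized_activations(y))
d_far = hamming(binarized_activations(x), binarized_activations(z))
# Nearby inputs typically lie in the same polyhedral cell of the
# tessellation (or a neighboring one), so they flip few activation
# bits: d_near is typically much smaller than d_far.
```

Each distinct 0/1 pattern corresponds to one polyhedral cell of the tessellation mentioned above, which is why nearby inputs share most activation bits while distant inputs do not.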