Metric Transform: Exploring beyond Affine Transform for Neural Networks

16 Mar 2023 (modified: 11 Apr 2023) · Submitted to Tiny Papers @ ICLR 2023 · Readers: Everyone
Keywords: Metric/Distance/Similarity as Neuron, Visualization of high dimension data, Interpretable Neurons
TL;DR: Neural networks built on the linear transformation also work with metric transformations such as the l1-norm and l2-norm, and generalize to other functions as well. The locality of metrics provides a basis for interpretation.
Abstract: Artificial Neural Networks (ANNs) of varying architectures are generally paired with a linear transformation at their core. However, we find dot-product neurons, which have global influence, less interpretable than the more local influence of Euclidean distance (as used in RBF networks). In this work, we explore generalizing dot-product neurons to lp-norms, metrics, and beyond. We find that such metric transforms perform similarly to the affine transform when used in an MLP or CNN. Furthermore, we use distance/similarity-measuring neurons to interpret and explain input data, overfitting, and Residual MLPs. We share our code on GitHub.
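The abstract contrasts the standard affine neuron (dot product plus bias, global influence) with a metric neuron whose pre-activation is the negative lp distance between the input and a weight vector, giving a local response. A minimal NumPy sketch of this contrast, assuming the simplest formulation (this is illustrative, not the authors' released code; all names are made up):

```python
import numpy as np

def affine_transform(x, W, b):
    # Standard neuron: dot product of the input with each weight row, plus bias.
    # Responses grow without bound along the weight direction (global influence).
    return x @ W.T + b

def metric_transform(x, W, b, p=2):
    # Metric neuron (illustrative): negative l_p distance between the input and
    # each weight vector, plus bias. The response peaks when x is near a weight
    # vector and falls off with distance (local influence).
    diffs = x[:, None, :] - W[None, :, :]          # (batch, units, features)
    dists = np.linalg.norm(diffs, ord=p, axis=-1)  # (batch, units)
    return -dists + b

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))   # batch of 4 inputs with 3 features
W = rng.normal(size=(2, 3))   # 2 neurons
b = np.zeros(2)

out_affine = affine_transform(x, W, b)       # shape (4, 2), unbounded values
out_metric = metric_transform(x, W, b, p=2)  # shape (4, 2), peaks at 0 when x == W[i]
```

With zero bias, each metric neuron's output is maximal (zero) exactly when the input coincides with its weight vector, which is the locality the paper leans on for interpretation.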