Supplementary Material: pdf
Track: Extended Abstract Track
Keywords: representational alignment, neural alignment, rsa, representational similarity analysis, similarity, trsa, topological representational similarity analysis
TL;DR: SoftStep is a module for learning instance-dependent similarity functions for broad use in machine learning.
Abstract: Deep learning systems that rely on similarities between samples mostly use fixed or global measures, such as cosine distance or kernels, to compare data points or neural representations. To address this inflexibility, we introduce SoftStep, a module for learning instance-wise similarity measures directly from data. SoftStep maps raw similarity scores to context-sensitive values in the closed unit interval, interpolating smoothly between hard rejection and full preservation of neighbors. Unlike existing approaches such as contrastive learning, sparse attention, or hand-designed geotopological (GT) transforms, SoftStep provides a flexible, learnable mechanism that adapts to the local geometry of the representation space. We demonstrate SoftStep's potential for prediction by incorporating it into a neighbor-based predictor, where it improves performance on multiple datasets. Looking forward, we argue that SoftStep offers a principled extension to topological Representational Similarity Analysis (tRSA) for neural alignment, enabling models to learn GT-like similarity transformations for enforcing alignment. This positions SoftStep as a general tool for learning similarity functions in neural networks.
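The abstract describes a gate that maps raw similarity scores into the closed unit interval, interpolating smoothly between hard rejection and full preservation of neighbors. A minimal sketch of such a gate is a sigmoid step with a threshold and a sharpness parameter; note that the names `threshold` and `sharpness`, and the sigmoid parameterization itself, are illustrative assumptions, since the abstract does not specify SoftStep's actual form or how its parameters become instance-dependent.

```python
import numpy as np

def softstep(scores, threshold=0.5, sharpness=10.0):
    """Hypothetical SoftStep-like gate (illustrative sketch only).

    Maps raw similarity scores into (0, 1) via a smooth sigmoid step.
    High sharpness approaches hard rejection of scores below the
    threshold; low sharpness approaches full preservation of all
    neighbors. The paper's actual parameterization is not given in
    the abstract.
    """
    return 1.0 / (1.0 + np.exp(-sharpness * (scores - threshold)))

# Gating the cosine similarities of three candidate neighbors:
scores = np.array([0.2, 0.5, 0.9])
weights = softstep(scores, threshold=0.5, sharpness=10.0)
```

In a learnable version, `threshold` and `sharpness` could be produced per instance by a small network, which would give the context-sensitive, instance-wise behavior the abstract attributes to SoftStep.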
Submission Number: 92