Is Probing All You Need? Indicator Tasks as an Alternative to Probing Embedding Spaces

Published: 07 Oct 2023, Last Modified: 01 Dec 2023, EMNLP 2023 Findings
Submission Type: Regular Long Paper
Submission Track: Interpretability, Interactivity, and Analysis of Models for NLP
Keywords: Probing, Probe, Indicator, Word Representations, Embedding Space, Interpretability, Context, Social Bias, Morphology, Semantics, Gender, Concept Erasure
TL;DR: Proposing indicator tasks (tasks that query embedding spaces without training an auxiliary model) as an alternative to classifier-based probes, whose trainable components introduce inherent problems.
Abstract: The ability to identify and control different kinds of linguistic information encoded in vector representations of words has many use cases, especially for explainability and bias removal. This is usually done via a set of simple classification tasks, termed \textit{probes}, that evaluate the information encoded in the embedding space. However, the involvement of a trainable classifier leads to entanglement between the probe's results and the classifier's nature. As a result, contemporary works on probing include tasks that do not involve the training of auxiliary models. In this work we introduce the term \textit{indicator tasks} for non-trainable tasks used to query embedding spaces for the existence of certain properties, and we claim that indicators may point in the opposite direction to probes, a contradiction that complicates deciding whether a property exists in an embedding space. We demonstrate our claims with two test cases, one dealing with gender debiasing and the other with the erasure of morphological information from embedding spaces. We show that applying a suitable indicator provides a more accurate picture of the information captured and removed than probes do. We thus conclude that indicator tasks should be implemented and taken into consideration when eliciting information from embedded representations.
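To make the probe/indicator distinction concrete, here is a minimal sketch (not the authors' code) contrasting a classifier-based probe, which trains an auxiliary model on the embeddings, with one plausible non-trainable indicator, which projects embeddings onto a property direction computed from class means. The synthetic embeddings, the injected property dimension, and the mean-difference indicator are all illustrative assumptions, not the specific tasks studied in the paper.

```python
# Hypothetical sketch: probe vs. indicator on synthetic embeddings.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic 50-d "word embeddings" in which one direction weakly encodes
# a binary property (e.g., grammatical gender); y marks that property.
n, d = 200, 50
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, d))
X[:, 0] += 0.5 * (2 * y - 1)  # inject the property along dimension 0

# --- Probe: train an auxiliary classifier on the embeddings. Its result
# is entangled with the classifier's capacity and training procedure.
probe = LogisticRegression(max_iter=1000).fit(X, y)
print("probe accuracy:", probe.score(X, y))

# --- Indicator: no training. Project embeddings onto a property direction
# estimated from class means and threshold at the midpoint of the scores.
direction = X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0)
scores = X @ direction
indicator_preds = (scores > scores.mean()).astype(int)
print("indicator accuracy:", (indicator_preds == y).mean())
```

If a debiasing method removes the property, the two measurements can disagree: a sufficiently expressive probe may still recover the label from residual signal while the fixed-direction indicator no longer detects it, which is the kind of contradiction the abstract describes.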
Submission Number: 2562