LMFingerprints: Visual Explanations of Language Model Embedding Spaces through Layerwise Contextualization Scores

Abstract: Language models, such as BERT, construct multiple, contextualized embeddings for each word occurrence in a corpus. Understanding how the contextualization propagates through the model's layers is cru...
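Below is a minimal sketch (not the paper's method) of how per-layer contextualization might be probed for a single word: hidden states are collected at every BERT layer across several contexts, and a score is computed as one minus the mean pairwise cosine similarity of that word's embeddings. The model name, example sentences, and the similarity-based score are assumptions for illustration only.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Hypothetical setup: any BERT-style encoder that exposes hidden states would do.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

sentences = [
    "The bank raised interest rates.",
    "They sat on the bank of the river.",
    "The bank approved the loan.",
]
target = "bank"

# Collect the target word's embedding at every layer in every sentence.
per_layer = None
for text in sentences:
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden_states = model(**enc).hidden_states  # tuple: input embeddings + one tensor per layer
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    idx = tokens.index(target)
    if per_layer is None:
        per_layer = [[] for _ in hidden_states]
    for layer, h in enumerate(hidden_states):
        per_layer[layer].append(h[0, idx])

# Contextualization score per layer: how dissimilar the same word's
# embeddings are across contexts (higher = more context-dependent).
for layer, vecs in enumerate(per_layer):
    x = torch.nn.functional.normalize(torch.stack(vecs), dim=-1)
    sim = x @ x.T
    n = len(vecs)
    mean_off_diag = (sim.sum() - sim.diagonal().sum()) / (n * (n - 1))
    print(f"layer {layer:2d}: contextualization score = {1 - mean_off_diag.item():.3f}")
```

Plotting these per-layer scores for many words would yield the kind of layerwise "fingerprint" the title alludes to, though the paper's actual scoring functions may differ.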