Variational Language Concepts for Interpreting Pretrained Language Models

23 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Conceptual Interpretation, Interpretability and Explainability, Generative Models, Probabilistic Graphical Models, Pretrained Language Models
TL;DR: We propose a hierarchical variational Bayesian framework to provide conceptual interpretations of pretrained language models.
Abstract: Pretrained Language Models (PLMs) such as BERT and its variants have achieved remarkable success in natural language processing. To date, the interpretability of PLMs has primarily relied on the attention weights in their self-attention layers. However, these attention weights only provide word-level interpretations, failing to capture higher-level structures, and therefore lack readability and intuitiveness. To address this challenge, we first provide a formal definition of "conceptual interpretation" and then propose a variational Bayesian framework, dubbed VAriational LANguage ConcEpt (VALANCE), that goes beyond word-level interpretations to provide concept-level interpretations. Our theoretical analysis shows that VALANCE finds the optimal language concepts to interpret PLM predictions. Empirical results on several real-world datasets show that our method successfully provides conceptual interpretations for PLMs.
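To make the contrast concrete, the following is a minimal toy sketch (not the paper's method, and with made-up numbers) of the word-level attention interpretation the abstract describes: a self-attention matrix assigns each token a relevance score, with no grouping of tokens into higher-level concepts.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
tokens = ["the", "movie", "was", "surprisingly", "good"]
d = 8

# Toy token embeddings and projection matrices (random; for illustration only).
X = rng.normal(size=(len(tokens), d))
Wq = rng.normal(size=(d, d))
Wk = rng.normal(size=(d, d))

# Scaled dot-product self-attention: each row sums to 1.
A = softmax((X @ Wq) @ (X @ Wk).T / np.sqrt(d))

# Word-level "interpretation": average attention each token receives.
# This is the kind of flat, per-word score that concept-level
# interpretation aims to go beyond.
word_scores = A.mean(axis=0)
for tok, score in zip(tokens, word_scores):
    print(f"{tok:>12}: {score:.3f}")
```

The per-token scores printed here illustrate why such interpretations stay at the word level: nothing ties related tokens into a shared, human-readable concept.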
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6561