Mask and Understand: Evaluating the Importance of Parameters

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission · Readers: Everyone
Keywords: influence function, interpretability, model pruning, feature importance ranking
Abstract: Influence functions are classic techniques from robust statistics, based on first-order Taylor approximations, that have been widely used in the machine learning community to accurately estimate the effect of small perturbations of a dataset on the model. However, existing research concentrates on estimating the effect of perturbing training or pre-training points. In this paper, we introduce influence functions to predict the effects of removing features or parameters. It is worth emphasizing that our method can be applied to explore the influence of perturbing any combination of parameters on the model, whether or not they belong to the same layer and whether or not they are related. The validation and experiments also demonstrate that influence functions for parameters can be used in many settings, such as understanding model structure, model pruning, feature importance ranking, and any other parameter-masking strategy you can imagine when you want to evaluate the importance of a group of parameters.
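Below is a minimal, hypothetical PyTorch sketch of the kind of first-order Taylor estimate the abstract describes: it predicts the change in loss caused by masking (zeroing) a chosen group of parameters and compares the prediction against the exact change. The toy model, data, and variable names are illustrative assumptions and are not taken from the paper's implementation.

```python
# Illustrative sketch (assumed setup): first-order Taylor estimate of the loss
# change caused by masking a group of parameters, versus the exact change.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy model and data (assumptions for the sketch).
model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))
x, y = torch.randn(64, 10), torch.randn(64, 1)
loss_fn = nn.MSELoss()

# Loss and gradients at the current parameters.
loss = loss_fn(model(x), y)
grads = torch.autograd.grad(loss, model.parameters())

# Arbitrary parameter group to mask: the first linear layer's weight matrix.
target = model[0].weight
target_grad = grads[0]

# Masking sets theta_S to zero, i.e. the perturbation is delta = -theta_S.
# First-order Taylor approximation: L(theta + delta) - L(theta) ≈ grad_S · delta.
delta = -target.detach()
predicted_change = (target_grad * delta).sum().item()

# Exact change for comparison: actually zero the parameters and re-evaluate.
with torch.no_grad():
    original = target.clone()
    target.zero_()
    actual_change = (loss_fn(model(x), y) - loss).item()
    target.copy_(original)  # restore the original weights

print(f"predicted change (first-order): {predicted_change:.4f}")
print(f"actual change (masked)        : {actual_change:.4f}")
```

The same pattern extends to any subset of parameters, regardless of which layers they come from, by accumulating the gradient-times-perturbation terms over the chosen group; this is only a sketch of the general idea, not the paper's exact formulation.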
One-sentence Summary: We propose a parameter-based influence function that can be applied to explore the influence of perturbing any combination of parameters on the model, whether or not they belong to the same layer and whether or not they are related.