Toward Efficient Influence Function: Dropout as a Compression Tool

Published: 06 Mar 2025, Last Modified: 07 Mar 2025 · ICLR 2025 Workshop Data Problems (Poster) · CC BY 4.0
Keywords: Influence function, Data attribution
Abstract: Assessing the impact of training data points on machine learning models is crucial for understanding model behavior and enhancing the transparency of modern models. The influence function provides a theoretical framework for quantifying the effect of individual training data points on a model's performance on specific test data points. However, the computational cost of the influence function poses significant challenges, particularly for large-scale models. In this work, we introduce a novel approach that leverages dropout as a gradient compression mechanism to compute influence functions more efficiently. Our method significantly reduces computational and memory overhead, not only during the influence function computation itself but also in the compression step. Through theoretical analysis and empirical validation, we demonstrate that using dropout as a compression tool preserves the critical components of data influence and enables the influence function's application to modern large-scale models.
Submission Number: 35
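
To make the idea concrete, here is a minimal, hypothetical Python (NumPy) sketch of dropout-style gradient compression applied to influence scoring. Everything here is an illustrative assumption inferred from the abstract: the function names, the independent per-gradient dropout masks, and the damped-identity Hessian approximation (H ≈ damping * I) are not taken from the paper, whose exact estimator may differ.

    import numpy as np

    def dropout_compress(grad, keep_prob, rng):
        # Dropout as compression (assumed scheme): randomly zero entries and
        # rescale survivors by 1/keep_prob so the compressed gradient is an
        # unbiased estimate of the original. In practice only the surviving
        # entries would be stored, which is what reduces memory.
        mask = rng.random(grad.shape) < keep_prob
        return np.where(mask, grad / keep_prob, 0.0)

    def influence_scores(train_grads, test_grad, damping=0.01, keep_prob=0.1, seed=0):
        # Influence of training point z on test point z_test (Koh & Liang form):
        #   I(z, z_test) = -grad L(z_test)^T H^{-1} grad L(z).
        # Here H is approximated by damping * I, a common simplification and
        # an assumption of this sketch, not the paper's estimator.
        rng = np.random.default_rng(seed)
        g_test = dropout_compress(test_grad, keep_prob, rng)
        scores = []
        for g in train_grads:
            # A fresh, independent mask per gradient keeps the inner
            # product unbiased in expectation.
            g_c = dropout_compress(g, keep_prob, rng)
            scores.append(-float(g_test @ g_c) / damping)
        return scores

    # Hypothetical usage, with per-example gradients flattened to 1-D arrays:
    #   scores = influence_scores([g1, g2, g3], g_test)
    # ranks training points by their estimated effect on the test loss.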
