Keywords: Mesoscience, Compromise in competition, Machine learning, Generalizability
Abstract: The black-box nature of machine learning (ML) models, especially neural networks, is one of the bottlenecks preventing them from playing a larger role in engineering. Their decision-making processes often lack transparency and are difficult to interpret, which limits their use in high-risk domains. Explaining the generalizability of ML models is therefore a crucial topic in AI, yet no unified understanding of this issue has emerged. This work introduces the concept of compromise in competition (CIC) from mesoscience into the explanation of the generalizability of ML models. A scale decomposition method is proposed from the perspective of training samples, and the CIC between memorizing and forgetting, refined as dominant mechanisms, is studied. Empirical studies on computer vision (CV) and natural language processing (NLP) datasets demonstrate that the CIC between memorizing and forgetting significantly influences model generalizability. Moreover, techniques aimed at mitigating overfitting, such as dropout and L2 regularization, can be uniformly reinterpreted through this CIC. Collectively, this work offers a new perspective for explaining the generalizability of ML models, providing fundamental support for broader applications of ML in engineering.
Primary Area: interpretability and explainable AI
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2734