Preventing Overfitting on Noisy Labels Through Adaptive Checkpointing

Published: 24 Feb 2025 · Last Modified: 15 May 2025 · OpenReview Archive Direct Upload · CC BY-NC-ND 4.0
Abstract:

Overfitting and the presence of noisy labels are significant challenges in training machine learning models, particularly on complex datasets. This paper introduces a novel checkpointing method designed to mitigate overfitting while maintaining model calibration in the presence of label noise. Our approach leverages the entropy of the two largest predicted probabilities in the model's output to determine when to save the model during training. This entropy-based criterion enables effective checkpointing without requiring a separate validation set. We evaluate our method on datasets with varying noise levels and across multiple model architectures. The results demonstrate that entropy-based checkpointing offers a more robust way to select model checkpoints during training. This work contributes to ongoing efforts to enhance model reliability in real-world applications where data imperfections are prevalent.
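The abstract does not specify the exact trigger rule, so the following is a minimal sketch of one plausible reading: renormalize the top-2 softmax probabilities, compute their binary entropy averaged over a batch or epoch, and save a checkpoint whenever that entropy reaches a new minimum. The minimum-tracking rule, the function names, and the renormalization step are assumptions for illustration, not details confirmed by the paper.

```python
# Sketch of an entropy-based checkpointing criterion (assumptions noted above).
import torch
import torch.nn.functional as F


@torch.no_grad()
def top2_entropy(logits: torch.Tensor) -> torch.Tensor:
    """Mean binary entropy of the two largest predicted probabilities.

    `logits` has shape (batch, num_classes). The top-2 probabilities
    are renormalized to sum to 1 before the entropy is computed
    (a modeling assumption, since the paper's exact formula is not
    given in the abstract).
    """
    probs = F.softmax(logits, dim=-1)
    top2, _ = probs.topk(2, dim=-1)                 # (batch, 2)
    p = top2 / top2.sum(dim=-1, keepdim=True)       # renormalize to a 2-way distribution
    eps = 1e-12                                     # avoid log(0)
    ent = -(p * (p + eps).log()).sum(dim=-1)        # per-example binary entropy
    return ent.mean()


def maybe_checkpoint(model: torch.nn.Module,
                     logits: torch.Tensor,
                     state: dict,
                     path: str = "checkpoint.pt") -> None:
    """Save the model when the top-2 entropy hits a new minimum.

    The "new minimum" trigger is a hypothetical rule used here only
    to make the sketch concrete.
    """
    ent = top2_entropy(logits).item()
    if ent < state.get("best_entropy", float("inf")):
        state["best_entropy"] = ent
        torch.save(model.state_dict(), path)
```

One appeal of a criterion like this is visible in the sketch: it consumes only the model's own predictions on training data, so no labeled examples need to be held out for validation, which matches the abstract's claim of checkpointing without a separate validation set.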
