Supermodel: Rethinking DNN Training and Testing with Open-style Skill Acquisition and Dynamic Inference

ICLR 2026 Conference Submission15263 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Machine Learning, Generalization, Deep Neural Network, Continual Learning, Lifelong Learning, Catastrophic Forgetting, Ensemble Modeling
Abstract: Current DNN model building suffers from two serious problems: forgetting and doomed test cases. In this paper, we propose an open-style skill acquisition approach, the opposite of the prevailing closed-style training scheme in which backpropagation minimizes the overall loss and recently learned features/patterns often overwrite previous ones (the forgetting problem). Testing is also drastically different: for each test sample, we optimally select the best available skills (nodes and connections in the DNN) from the trained model so as to maximize the probability that the sample is processed correctly (the doomed test case problem). We validate our approach on multiple datasets and achieve significant performance improvements over SOTA methods.
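The abstract's dynamic-inference idea (per-sample selection of the best available skills) can be illustrated with a minimal sketch. Everything here is hypothetical: the `Skill` class stands in for a subset of nodes and connections acquired during training, and "best" is approximated by maximum softmax confidence, one plausible selection criterion not specified in the abstract.

```python
import numpy as np

class Skill:
    """Hypothetical stand-in for a subnetwork (nodes and connections)
    acquired during open-style training; here just a linear classifier."""
    def __init__(self, w, b):
        self.w, self.b = w, b

    def logits(self, x):
        return x @ self.w + self.b

def softmax(z):
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def dynamic_infer(skills, x):
    """For one test sample, pick the skill whose prediction is most
    confident (a proxy for 'maximize probability of correct processing')."""
    best_probs, best_idx = None, -1
    for i, s in enumerate(skills):
        p = softmax(s.logits(x))
        if best_probs is None or p.max() > best_probs.max():
            best_probs, best_idx = p, i
    return best_idx, int(np.argmax(best_probs))

# Two toy skills: skill 0 is decisive along the first input dimension,
# skill 1 along the second; each is confident only on "its" inputs.
skills = [
    Skill(np.array([[4.0, -4.0], [0.0, 0.0]]), np.zeros(2)),
    Skill(np.array([[0.0, 0.0], [-4.0, 4.0]]), np.zeros(2)),
]

chosen, label = dynamic_infer(skills, np.array([1.0, 0.0]))
print(chosen, label)  # skill 0 is selected for this sample
```

Here the sample `[1.0, 0.0]` activates skill 0 strongly while skill 1 is indifferent, so selection routes the sample to skill 0; a closed-style single model would have to commit to one fixed set of weights for all samples.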
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 15263