Abstract: Highlights
• Propose a DFM-based teaching approach that dynamically generates loss functions for neural network optimization.
• Propose a confidence-based selection algorithm to select appropriate metrics.
• Employ information divergence to integrate the states of a student model into the teacher model, enhancing the teacher.
• Conduct extensive experiments across a wide range of loss functions and tasks to demonstrate the effectiveness of the approach.
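The third highlight mentions using information divergence to relate student and teacher model states. As an illustrative sketch only (the paper's actual formulation is not given here), the following computes the KL divergence between a student's and a teacher's predicted class distributions; the function name and the example probabilities are hypothetical.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete probability distributions.

    eps guards against log(0) when a class probability is zero.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Hypothetical student/teacher class-probability outputs for one input:
student = [0.7, 0.2, 0.1]
teacher = [0.6, 0.3, 0.1]
divergence = kl_divergence(student, teacher)
```

A small divergence would indicate the student's predictions already track the teacher's; a large one signals disagreement that the teacher model could take into account.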
External IDs: dblp:journals/pr/HaiPLH25