A diversity-enhanced knowledge distillation model for practical math word problem solving

Published: 01 Jan 2025 · Last Modified: 15 May 2025 · Inf. Process. Manag. 2025 · CC BY-SA 4.0
Abstract:

Highlights:
• We propose a novel diversity-enhanced knowledge distillation model that adaptively selects high-quality teacher labels as diversity labels for student learning, and estimates the quality of the intermediate soft labels to train student discernment.
• We introduce a diversity prior-enhanced student model by incorporating a CVAE into existing models, enabling them to capture latent diversity equations.
• We conduct extensive experiments on all four benchmarks. The results show that the proposed methods effectively improve the performance of existing models without sacrificing efficiency.
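The second highlight mentions incorporating a CVAE (conditional variational autoencoder) to capture latent diversity in equations. As a rough, generic sketch of the CVAE mechanism — not the authors' implementation; the function names and scalar one-dimensional latent are illustrative simplifications — the core pieces are the reparameterized latent sample and its KL regularizer:

```python
import math
import random

def reparameterize(mu: float, log_var: float, rng: random.Random) -> float:
    """Sample z = mu + sigma * eps (the reparameterization trick),
    so gradients can flow through the sampling step during training."""
    eps = rng.gauss(0.0, 1.0)
    return mu + math.exp(0.5 * log_var) * eps

def kl_divergence(mu: float, log_var: float) -> float:
    """KL(q(z|x,c) || N(0,1)) for one diagonal-Gaussian latent dimension."""
    return -0.5 * (1.0 + log_var - mu ** 2 - math.exp(log_var))

# Illustrative use: a (hypothetical) encoder maps a problem/equation pair
# to (mu, log_var); a decoder conditioned on the problem text and z then
# emits an equation, so different z samples yield diverse valid equations.
rng = random.Random(0)
mu, log_var = 0.2, -1.0
z = reparameterize(mu, log_var, rng)
loss_kl = kl_divergence(mu, log_var)
```

At inference time, sampling several values of z for the same problem is what lets such a student model propose multiple distinct solution equations rather than a single deterministic one.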