Teacher or supervisor? Effective online knowledge distillation via guided collaborative learning

Published: 01 Jan 2023, Last Modified: 07 Nov 2025 · Comput. Vis. Image Underst. 2023 · License: CC BY-SA 4.0
Abstract:

Highlights
• An online distillation framework that ensembles multiple elitist students under a supervisor.
• The supervisor learns each student's expertise from the input, the ground truth, and the students' predictions.
• The supervisor is discarded at test time; only the best student is extracted and used.
• Extensive experiments show consistent improvements over vanilla-trained students.
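The highlights describe a collaborative setup in which several students are trained jointly while a supervisor, fed the input, the ground truth, and the students' predictions, weighs the students into a soft ensemble target that guides distillation. The sketch below is one plausible reading of that setup, not the paper's implementation: the module names (`Student`, `Supervisor`), the weighting scheme, the loss composition, and all hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch of guided collaborative online distillation.
# All architectural and loss choices here are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES, NUM_STUDENTS, TEMPERATURE = 10, 3, 4.0

class Student(nn.Module):
    """Small classifier standing in for each peer network."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128),
                                 nn.ReLU(), nn.Linear(128, NUM_CLASSES))
    def forward(self, x):
        return self.net(x)

class Supervisor(nn.Module):
    """Scores each student's expertise from the input, the ground truth,
    and the students' predictions; outputs per-student ensembling weights."""
    def __init__(self):
        super().__init__()
        in_dim = 3 * 32 * 32 + NUM_CLASSES + NUM_STUDENTS * NUM_CLASSES
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, NUM_STUDENTS))
    def forward(self, x, y_onehot, student_logits):
        feats = torch.cat([x.flatten(1), y_onehot,
                           torch.cat(student_logits, dim=1)], dim=1)
        return F.softmax(self.net(feats), dim=1)  # (batch, num_students)

students = [Student() for _ in range(NUM_STUDENTS)]
supervisor = Supervisor()
params = [p for s in students for p in s.parameters()] + list(supervisor.parameters())
optimizer = torch.optim.SGD(params, lr=0.01)

def train_step(x, y):
    y_onehot = F.one_hot(y, NUM_CLASSES).float()
    logits = [s(x) for s in students]
    weights = supervisor(x, y_onehot, [l.detach() for l in logits])

    # Supervisor-weighted ensemble of student logits acts as the soft target.
    stacked = torch.stack(logits, dim=1)                       # (B, S, C)
    ensemble = (weights.unsqueeze(-1) * stacked.detach()).sum(dim=1)
    soft_target = F.softmax(ensemble / TEMPERATURE, dim=1)

    loss = 0.0
    for l in logits:
        ce = F.cross_entropy(l, y)                             # hard-label loss
        kd = F.kl_div(F.log_softmax(l / TEMPERATURE, dim=1),   # distillation loss
                      soft_target, reduction="batchmean") * TEMPERATURE ** 2
        loss = loss + ce + kd
    # Supervisor is trained so its weighted ensemble fits the ground truth.
    loss = loss + F.cross_entropy(ensemble, y)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# After training, the supervisor is dropped; only the single best student
# (e.g. selected by validation accuracy) would be kept for deployment.
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, NUM_CLASSES, (8,))
print(train_step(x, y))
```

The key design point reflected here is that only the supervisor sees the ground truth when weighting students, and it exists only at training time: deployment cost is that of a single student.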