Circumventing negative transfer via cross generative initialisation

12 Feb 2018 (modified: 05 May 2023) · ICLR 2018 Workshop Submission
Abstract: Negative transfer, a well-known phenomenon in transfer learning, refers to the interference of previously acquired knowledge with new learning. In this work, through an empirical study, we demonstrate that conventional neural-network-based transfer techniques, i.e., mid-level feature extraction and knowledge distillation, offer little defence against negative transfer. Under a finer specification of transfer learning, we speculate that the real culprits of negative transfer are the mismatch between task and model complexity and the ordering of learning. Based on this speculation, we propose a tentative transfer learning technique, cross generative initialisation, to sidestep negative transfer. The effectiveness of cross generative initialisation is evaluated empirically.
Keywords: transfer learning, negative transfer
TL;DR: The proposed cross generative initialisation is superior to conventional transfer learning techniques in sidestepping negative transfer.
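
For context, below is a minimal sketch of knowledge distillation, one of the conventional baselines named in the abstract; it is not the proposed cross generative initialisation, and the temperature T, weighting alpha, and function name are illustrative assumptions rather than settings taken from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Standard knowledge-distillation objective (illustrative; hyperparameters assumed).

    Combines a soft-target term, which matches the student's temperature-scaled
    distribution to the teacher's, with a hard-target cross-entropy term on the
    ground-truth labels.
    """
    # Soft-target term: KL divergence between teacher and student distributions at temperature T,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```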