Circumventing negative transfer via cross generative initialisation
Wenjun Bai, Changqin Quan, Zhi-Wei Luo
Feb 12, 2018 · ICLR 2018 Workshop Submission
Negative transfer, an adverse outcome of transfer learning, refers to the interference of previously acquired knowledge with new learning. In this research, through an empirical study, we demonstrate that conventional neural-network-based transfer techniques, i.e., mid-level feature extraction and knowledge distillation, mount a futile defence against negative transfer. Under a finer specification of transfer learning, we speculate that the real culprits of negative transfer are the incongruence between task and model complexity and the ordering of learning. Based on this speculation, we propose a tentative transfer learning technique, cross generative initialisation, to sidestep negative transfer. The effectiveness of cross generative initialisation was evaluated empirically.
TL;DR: The proposed cross generative initialisation is superior to conventional transfer learning techniques in sidestepping negative transfer.
Keywords: transfer learning, negative transfer
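The abstract names mid-level feature extraction as one of the conventional transfer baselines under study. As a point of reference, the sketch below illustrates what that baseline means in its simplest form: freeze features learned on a source task and fit only a new linear head on the target task. This is a minimal illustration, not the paper's setup; the network sizes, the synthetic tasks, and the use of a closed-form least-squares head are all assumptions made for the sake of a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def fit_head(W1, X, y):
    """Freeze the features relu(X @ W1); fit only a linear head by least squares."""
    H = relu(X @ W1)                       # frozen mid-level features
    w2, *_ = np.linalg.lstsq(H, y, rcond=None)
    return float(np.mean((H @ w2 - y) ** 2))

# "Pretrained" source features: an illustrative stand-in for a layer
# actually trained on a source task.
W1_source = rng.normal(0.0, 1.0, (5, 16))

# Synthetic target task whose labels happen to lie in the span of the
# source features, so transfer helps here by construction.
X = rng.normal(size=(200, 5))
y = relu(X @ W1_source).sum(axis=1)

transfer_mse = fit_head(W1_source, X, y)                      # reuse frozen source features
scratch_mse = fit_head(rng.normal(0.0, 1.0, (5, 16)), X, y)   # random frozen features

print(transfer_mse, scratch_mse)
```

By construction the target labels are expressible in the source features, so the transferred head fits near-perfectly while the random-feature baseline does not; the paper's argument is precisely that when source and target are mismatched in task and model complexity, this kind of frozen-feature reuse can instead hurt the target task.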