Sample Relationships through the Lens of Learning Dynamics with Label Information

06 Oct 2022 (modified: 05 May 2023) · INTERPOLATE at NeurIPS 2022
Keywords: Sample Relationship, Neural Tangent Kernel, Neural Networks, Learning Dynamics, Label Information
TL;DR: In this work, we propose a new extension of the NTK and show how it helps to understand forgetting events.
Abstract: Although much research has been devoted to proposing new models or loss functions to improve the generalisation of artificial neural networks (ANNs), less attention has been directed to the data, which is also an important factor in training ANNs. In this work, we start by approximating the interaction between two samples, i.e. how learning on one sample modifies the model's prediction on the other. By analysing the terms involved in weight updates in supervised learning, we find that the signs of labels influence the interactions between samples. We therefore propose the labelled pseudo Neural Tangent Kernel (lpNTK), which takes label information into account when measuring the interactions between samples. We first prove that lpNTK asymptotically converges to the well-known empirical Neural Tangent Kernel in terms of the Frobenius norm under certain assumptions. Second, we illustrate how lpNTK helps to understand learning phenomena identified in previous work, specifically the learning difficulty of samples and forgetting events during learning. Finally, we show that lpNTK can help to improve the generalisation performance of ANNs in image classification tasks, compared with training on the original whole training sets.
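The abstract does not give lpNTK's formal definition, but the idea it describes — pairwise sample interactions that carry label information through the weight-update terms — can be illustrated with a minimal sketch. The sketch below assumes the kernel resembles inner products of per-sample loss gradients, where the label enters through the softmax error term; the function names are hypothetical and this is one plausible reading, not the paper's exact construction.

```python
import torch
import torch.nn.functional as F

def per_sample_loss_grad(model, x, y):
    """Gradient of the cross-entropy loss on one sample w.r.t. all parameters.

    The label y enters through the softmax error term (p - onehot(y)), so
    this gradient carries label information, unlike a plain output gradient.
    """
    logits = model(x.unsqueeze(0))                      # shape (1, num_classes)
    loss = F.cross_entropy(logits, y.unsqueeze(0))
    grads = torch.autograd.grad(loss, tuple(model.parameters()))
    return torch.cat([g.reshape(-1) for g in grads])    # flatten to one vector

def label_aware_kernel(model, xs, ys):
    """Pairwise inner products of per-sample loss gradients.

    Entry (i, j) approximates how a gradient step on sample j changes the
    loss on sample i: a label-aware analogue of the empirical NTK. Only
    practical for small models, since each row stores a full gradient vector.
    """
    gs = torch.stack([per_sample_loss_grad(model, x, y)
                      for x, y in zip(xs, ys)])         # shape (N, num_params)
    return gs @ gs.T                                    # shape (N, N)
```

Under this reading, large negative off-diagonal entries would mark pairs whose updates interfere (learning one undoes the other, a candidate mechanism for forgetting events), while large positive entries mark mutually reinforcing, redundant pairs.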