Training Dynamics for Curriculum Learning: A Study on Monolingual and Cross-lingual NLU

Anonymous

16 Feb 2022 (modified: 05 May 2023) · ACL ARR 2022 February Blind Submission
Abstract: Curriculum Learning (CL) is a technique for training models by ranking examples, typically in order of increasing difficulty, with the aim of accelerating convergence and improving generalisability. Current approaches to Natural Language Understanding (NLU) tasks use CL to improve in-distribution performance, often via heuristic-oriented or task-agnostic difficulty metrics. In this work, we instead employ CL for NLU using training dynamics as difficulty metrics, i.e. statistics that measure the behavior of the model at hand on specific task-data instances during training, and propose modifications of existing CL schedulers based on these statistics. Unlike existing works, we evaluate models on in-distribution, out-of-distribution (OOD), and zero-shot cross-lingual transfer datasets. We show across several NLU tasks that CL with training dynamics can yield better performance, mostly in zero-shot cross-lingual transfer and OOD settings, with improvements of up to 8.5%. Overall, experiments indicate that training dynamics can lead to better-performing models with smoother training compared to other difficulty metrics, while being up to 51% faster. In addition, our analysis sheds light on the correlations between task-specific and task-agnostic metrics.
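The idea of ranking examples by a training-dynamics statistic can be sketched as follows. This is an illustrative example, not the paper's exact method: the function name `curriculum_order` and the toy data are hypothetical, and the statistic used here (mean probability assigned to the gold label across epochs, often called "confidence" in the training-dynamics literature) stands in for whichever metric the schedulers consume.

```python
# Hedged sketch: order training examples easy-to-hard using a
# training-dynamics statistic. Here "difficulty" is the inverse of
# mean gold-label probability across epochs (higher mean = easier).
from statistics import mean

def curriculum_order(gold_probs_per_epoch):
    """gold_probs_per_epoch: dict mapping example id -> list of
    per-epoch probabilities assigned to the gold label.
    Returns example ids sorted from easiest (highest mean confidence)
    to hardest (lowest mean confidence)."""
    confidence = {ex: mean(ps) for ex, ps in gold_probs_per_epoch.items()}
    return sorted(confidence, key=confidence.get, reverse=True)

# Toy usage: three examples tracked over three training epochs.
probs = {
    "ex1": [0.90, 0.95, 0.97],  # consistently easy
    "ex2": [0.20, 0.50, 0.80],  # learned over time
    "ex3": [0.10, 0.15, 0.20],  # remains hard
}
print(curriculum_order(probs))  # -> ['ex1', 'ex2', 'ex3']
```

A CL scheduler would then expose examples to the model following this ordering, e.g. starting with the easiest subset and gradually admitting harder ones.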
Paper Type: long