***CLMSM***: A Multi-Task Learning Framework for Pre-training on Procedural Text

Published: 07 Oct 2023, Last Modified: 01 Dec 2023, EMNLP 2023 Findings
Submission Type: Regular Long Paper
Submission Track: NLP Applications
Submission Track 2: Commonsense Reasoning
Keywords: pre-training, procedural reasoning, contrastive learning, masked language modeling, multi-task learning, nlp
TL;DR: We propose CLMSM, a pre-training framework that uses multi-task learning with contrastive learning and mask-step modeling objectives for downstream procedural NLP tasks, and outperforms baselines on in-domain as well as open-domain tasks.
Abstract: In this paper, we propose ***CLMSM***, a domain-specific, continual pre-training framework that learns from a large set of procedural recipes. ***CLMSM*** uses a Multi-Task Learning Framework to optimize two objectives: a) Contrastive Learning using hard triplets to learn fine-grained differences across entities in the procedures, and b) a novel Mask-Step Modelling objective to learn the step-wise context of a procedure. We evaluate ***CLMSM*** on the downstream tasks of tracking entities and aligning actions between two procedures, using three datasets, one of which is an open-domain dataset that does not conform to the domain of the pre-training data. We show that ***CLMSM*** not only outperforms baselines on recipes (in-domain) but also generalizes to open-domain procedural NLP tasks.
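To make the two pre-training objectives concrete, here is a minimal sketch of how a contrastive (triplet) loss and a masked "mask-step" loss might be combined into one multi-task objective. This is an illustration under stated assumptions, not the authors' implementation: the base model, mean pooling, triplet margin, and equal loss weighting are all assumptions.

```python
# Illustrative sketch only: combines a triplet-style contrastive loss with a
# masked-LM ("mask-step") loss, mirroring the two objectives in the abstract.
# Model choice, pooling, margin, and 1:1 loss weighting are assumptions.
import torch
import torch.nn as nn
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
triplet_loss = nn.TripletMarginLoss(margin=1.0)


def encode(texts):
    """Mean-pool the last hidden state to get one vector per procedure."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    out = model(**batch, output_hidden_states=True)
    hidden = out.hidden_states[-1]                      # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)        # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)         # (B, H)


def multitask_loss(anchor, positive, negative, masked_inputs, mlm_labels):
    # a) contrastive objective on hard triplets of procedures
    l_cl = triplet_loss(encode(anchor), encode(positive), encode(negative))
    # b) mask-step-style objective: predict tokens masked out of procedure steps
    #    (masked_inputs holds [MASK]ed input_ids; mlm_labels uses -100 elsewhere)
    l_msm = model(**masked_inputs, labels=mlm_labels).loss
    return l_cl + l_msm  # equal weighting assumed for illustration
```

In a usage loop, each pre-training batch would supply a hard triplet of procedure texts plus a masked copy of the anchor's steps, and the summed loss would be backpropagated through the shared encoder.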
Submission Number: 2159