HeLo: Learning-Free Lookahead Decoding for Conversation Infilling

Published: 01 Jan 2022, Last Modified: 30 Sep 2024, EMNLP (Findings) 2022, CC BY-SA 4.0
Abstract: We propose Heuristic Guided Lookahead Decoding (HeLo), a novel decoding strategy for conversation infilling. Conversation infilling aims to generate a seamless bridge of utterances connecting a given pair of source and target utterances. HeLo requires no fine-tuning or extra models, only the generating model itself: before committing to any token, it runs a greedy lookahead phase. The HeLo framework is simple and can augment conventional decoding strategies with any autoregressive language model. Smooth transitions between utterances are encouraged with an annealing schedule. Our experiments show that HeLo outperforms several baselines under both automatic and human evaluation metrics, which, we argue, are appropriate for the task.
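The abstract does not spell out the algorithm, but it suggests a general shape: at each step, candidate next tokens are rolled out greedily for a few tokens, the rollouts are scored by a heuristic that measures progress toward the target utterance, and the heuristic's weight is annealed over the course of generation. The sketch below is one possible instantiation of that idea in PyTorch with Hugging Face Transformers; the heuristic (mean-embedding cosine similarity to the target), the top-k candidate set, the lookahead length, the linear annealing of `alpha`, the token budget, and the use of GPT-2 are all illustrative assumptions, not the paper's actual choices.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


def target_similarity(model, cont_ids, target_ids):
    """Hypothetical heuristic: cosine similarity between the mean input
    embeddings of the rolled-out continuation and the target utterance."""
    emb = model.get_input_embeddings()
    cont_vec = emb(cont_ids).mean(dim=1)
    tgt_vec = emb(target_ids).mean(dim=1)
    return torch.nn.functional.cosine_similarity(cont_vec, tgt_vec, dim=-1).item()


@torch.no_grad()
def helo_step(model, input_ids, target_ids, k=5, lookahead=8, alpha=0.5):
    """Choose one next token: score each of the top-k candidates by greedily
    rolling out `lookahead` tokens, then blend the candidate's log-probability
    with the target heuristic. `alpha` is the (annealed) heuristic weight."""
    log_probs = torch.log_softmax(model(input_ids).logits[:, -1, :], dim=-1)
    top_logp, top_ids = torch.topk(log_probs, k)
    best_score, best_tok = float("-inf"), None
    for logp, cand in zip(top_logp[0], top_ids[0]):
        rollout = torch.cat([input_ids, cand.view(1, 1)], dim=1)
        for _ in range(lookahead):  # greedy lookahead for this candidate
            nxt = model(rollout).logits[:, -1, :].argmax(dim=-1, keepdim=True)
            rollout = torch.cat([rollout, nxt], dim=1)
        sim = target_similarity(model, rollout[:, input_ids.size(1):], target_ids)
        score = (1.0 - alpha) * logp.item() + alpha * sim
        if score > best_score:
            best_score, best_tok = score, cand.view(1, 1)
    return torch.cat([input_ids, best_tok], dim=1)


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2").eval()
    source = tokenizer("A: I just got back from my trip.", return_tensors="pt").input_ids
    target = tokenizer("B: So when do you fly out again?", return_tensors="pt").input_ids
    ids, budget = source, 20
    for t in range(budget):
        alpha = t / (budget - 1)  # linear annealing: steer harder toward the target later
        ids = helo_step(model, ids, target, alpha=alpha)
    print(tokenizer.decode(ids[0], skip_special_tokens=True))
```

In this sketch the annealing schedule simply increases `alpha` from 0 to 1 over the infill budget, so early tokens follow the model's own fluency while later tokens are pulled toward the target utterance; the paper's actual schedule and heuristic may differ.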