Neural-Driven Multi-criteria Tree Search for Paraphrase Generation

Published: 12 Dec 2020, Last Modified: 05 May 2023 · LMCA 2020 Poster
Keywords: paraphrase generation, multi-objective search, BERT, GPT2, Monte-Carlo Tree Search, Pareto Tree Search, edit-based text generation, non-autoregressive text generation
TL;DR: A multi-criteria lattice-exploration method that leverages BERT and GPT2 to generate a Pareto set of paraphrases.
Abstract: A good paraphrase is semantically similar to the original sentence, but it must also be well formed and syntactically different enough to ensure diversity. To handle this trade-off, we cast paraphrase generation as a multi-objective search problem over the lattice of text transformations. We use BERT and GPT2 to measure, respectively, the semantic distance and the grammatical correctness of the candidates. We study two search algorithms, Monte-Carlo Tree Search for Paraphrase Generation (MCPG) and Pareto Tree Search (PTS), which we use to explore the huge set of candidates generated by applying the PPDB-2.0 edit rules. We evaluate this approach on 5 datasets and show that it performs reasonably well and outperforms a state-of-the-art edit-based text generation method.
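The abstract frames generation as a multi-objective search: candidates are scored on two axes (semantic similarity via BERT, fluency via GPT2) and the output is a Pareto set rather than a single best paraphrase. A minimal sketch of the Pareto-filtering step is below; the candidate texts and numeric scores are illustrative placeholders, and the BERT/GPT2 scoring calls themselves are out of scope here.

```python
def pareto_front(candidates):
    """Return candidates not dominated on (similarity, fluency).

    `candidates` is a list of (text, similarity, fluency) triples,
    where higher is better for both objectives. A candidate is
    dominated if another one is at least as good on both objectives
    and strictly better on at least one.
    """
    front = []
    for text, sim, flu in candidates:
        dominated = any(
            s >= sim and f >= flu and (s > sim or f > flu)
            for _, s, f in candidates
        )
        if not dominated:
            front.append((text, sim, flu))
    return front


# Hypothetical scores: in the paper's setting these would come from
# BERT (semantic distance to the source) and GPT2 (fluency).
candidates = [
    ("the film was great", 0.90, 0.80),   # dominated by the next line
    ("the movie was great", 0.95, 0.90),
    ("great was the movie", 0.97, 0.40),  # more diverse, less fluent
]
front = pareto_front(candidates)
# → two non-dominated candidates survive, reflecting the trade-off
```

The search algorithms (MCPG, PTS) then decide which edits to expand next; this filter only captures how the final Pareto set of paraphrases is defined.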
