Activate Integrated Controllable Generation with Soft Prompt

Published: 2024 · Last Modified: 06 Jan 2026 · NLPCC (4) 2024 · License: CC BY-SA 4.0
Abstract: Parameter-efficient transfer learning (PETL) methods have gained significant adoption in downstream tasks due to their ability to reduce the cost of tuning pre-trained language models, yet a trade-off between performance and efficiency remains. Controllable text generation (CTG) requires a precise understanding of diverse constraints to mitigate potential degradation in generation quality, and in contrast to single-attribute CTG, multi-attribute CTG further amplifies the tuning complexity for PETL methods. To address this challenge, we propose Activator, a PETL approach that accommodates CTG tasks with higher diversity and offers fine-grained control. Activator leverages an external module to enhance optimization and enrich the soft prompt representations. Our experimental results on table-to-text and poetry generation tasks demonstrate that Activator is highly competitive with other PETL methods when applied to both causal and sequence-to-sequence language models. Furthermore, we observe that Activator performs strongly even in extremely complex CTG scenarios. The source code is publicly available at https://github.com/NLP2CT/Activator.
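The abstract describes soft prompt tuning augmented with an external module that eases optimization. The sketch below illustrates that general idea in PyTorch; it is not the authors' Activator implementation, and the class name `SoftPromptWithActivator`, the MLP architecture, and all hyperparameters are hypothetical choices for illustration.

```python
# Minimal sketch of soft prompt tuning with an external reparameterization
# module, in the spirit of the abstract. NOT the authors' Activator code;
# names and architecture here are hypothetical.
import torch
import torch.nn as nn

class SoftPromptWithActivator(nn.Module):
    """Learnable soft prompt whose embeddings pass through an external
    MLP before being prepended to the frozen LM's input embeddings."""

    def __init__(self, prompt_len: int, hidden_dim: int, bottleneck: int = 256):
        super().__init__()
        # Trainable prompt embeddings; the backbone LM's weights stay frozen.
        self.prompt = nn.Parameter(torch.randn(prompt_len, hidden_dim) * 0.02)
        # External module re-parameterizing the soft prompt, which can ease
        # optimization compared with tuning raw prompt vectors directly.
        self.activator = nn.Sequential(
            nn.Linear(hidden_dim, bottleneck),
            nn.Tanh(),
            nn.Linear(bottleneck, hidden_dim),
        )

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, hidden_dim) from the LM's embedding
        # layer; prepend the transformed prompt along the sequence axis.
        batch = input_embeds.size(0)
        prompt = self.activator(self.prompt)               # (prompt_len, hidden)
        prompt = prompt.unsqueeze(0).expand(batch, -1, -1)
        return torch.cat([prompt, input_embeds], dim=1)

if __name__ == "__main__":
    soft_prompt = SoftPromptWithActivator(prompt_len=20, hidden_dim=768)
    dummy_embeds = torch.randn(2, 16, 768)                 # (batch, seq, hidden)
    extended = soft_prompt(dummy_embeds)
    print(extended.shape)                                  # torch.Size([2, 36, 768])
```

In such setups only the prompt parameters and the external module are updated during training, which keeps the tunable parameter count small while giving the optimizer a richer parameterization than raw prompt vectors.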