PGT: a prompt based generative transformer for the patent domain

01 Jun 2022 (modified: 05 May 2023) · ICML 2022 Workshop KRLM
Keywords: patent generation, nlp, transformers
TL;DR: We present Patent Generative Transformer (PGT), a transformer-based multitask language model trained to facilitate the patent generation process.
Abstract: Patents are a valuable source of knowledge, but drafting them is a time-consuming and expensive task. Methods that assist patent generation offer a two-fold improvement: they can speed up the drafting process and suggest ideas and claims to the inventor. Herein, influenced by recent advances in language modeling via multitask learning and prompt engineering, we present the Patent Generative Transformer (PGT), a transformer-based language model trained to facilitate patent drafting. Specifically, the model supports three tasks: part-of-patent generation, text infilling, and patent coherence evaluation. Taking advantage of its multitask nature, PGT complements inventors and ensures a fast and successful transition from their input to a coherent patent disclosure. We show that the model outperforms a collection of task-specific baselines on relevant metrics. We further assess the quality of the generated text via blind testing by subject matter experts. Finally, we explore a zero-shot extension of the model, showing how PGT can be used to generate domain-specific abstracts.
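As a rough illustration of how a prompt-conditioned multitask language model of this kind can be queried, the sketch below uses a generic GPT-2 checkpoint from Hugging Face Transformers and a hypothetical task-prefix prompt; the actual PGT checkpoint, prompt tokens, and decoding settings are not given in the abstract and are assumptions here, not the authors' released setup.

    # Minimal, hypothetical sketch of prompt-based part-of-patent generation.
    # Assumptions: the generic "gpt2" checkpoint stands in for a fine-tuned PGT
    # model, and the task-prefixed prompt format is illustrative, not the paper's.
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Hypothetical prompt: condition generation on an abstract and request claims.
    prompt = (
        "task: part-of-patent generation\n"
        "input abstract: A method for cooling a battery pack using phase-change material.\n"
        "output claims:"
    )

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=200,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

In such a setup, the other tasks (text infilling and coherence evaluation) would be selected the same way, by changing the task prefix in the prompt rather than the model.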