Training Shallow Phrase Structure Parsers From Semantic Frames and Simple NLG

Anonymous

15 Oct 2020 (modified: 05 May 2023) · Submitted to HAMLETS @ NeurIPS2020
Keywords: parsing, human in the loop, sequence tagging
TL;DR: Standoff semantic representations from a human in the loop, plus simple NLG, yield a fast and easy way to create training data for NLU models.
Abstract: Determining the meaning of customer utterances is an important part of fulfilling customer requests in task-oriented dialogue. Natural Language Understanding (NLU) models can determine this meaning, but they typically require many customer utterances hand-annotated with meaning representations, which are difficult to obtain, and the annotation effort must be repeated for each new target domain. One way to reduce the labor involved in hand annotation is to have the human annotate a meaning representation (a ``semantic frame'') separately from the corresponding utterance. In this work, we investigate this approach in conjunction with several simple natural language generation (NLG) approaches in order to train shallow parsers that extract phrase structure representations from customer utterances. Our results show the effectiveness of this approach for training NLU models.
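
As a rough illustration of the kind of pipeline the abstract describes, the sketch below pairs a hand-specified semantic frame with a simple template-based NLG step to produce an utterance together with aligned tags suitable for training a shallow sequence-tagging parser. The frame fields, template, and tag names are hypothetical and not taken from the paper.

# Minimal sketch (illustrative only): turn a hand-annotated semantic frame
# into a training example for a shallow sequence-tagging parser via
# template-based NLG. Frame fields, template, and BIO labels are assumptions.

FRAME = {"intent": "book_flight", "origin": "Boston", "destination": "Denver"}

TEMPLATE = "I want to fly from {origin} to {destination}"

def generate_example(frame, template):
    """Render the frame with a template and emit tokens with aligned BIO tags."""
    tokens, tags = [], []
    for chunk in template.split():
        if chunk.startswith("{") and chunk.endswith("}"):
            slot = chunk[1:-1]
            slot_tokens = frame[slot].split()
            tokens.extend(slot_tokens)
            tags.extend([f"B-{slot}"] + [f"I-{slot}"] * (len(slot_tokens) - 1))
        else:
            tokens.append(chunk)
            tags.append("O")
    return tokens, tags

tokens, tags = generate_example(FRAME, TEMPLATE)
print(list(zip(tokens, tags)))
# [('I', 'O'), ('want', 'O'), ('to', 'O'), ('fly', 'O'), ('from', 'O'),
#  ('Boston', 'B-origin'), ('to', 'O'), ('Denver', 'B-destination')]

The generated (tokens, tags) pairs could then serve as supervised training data for an NLU sequence-tagging model, with the annotator only ever writing the frame, not labeling the utterance itself.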