Using Commonsense to Guide Dialog Structure Induction via Neural Probabilistic Soft Logic

Anonymous

16 Oct 2022 (modified: 05 May 2023) · ACL ARR 2022 October Blind Submission · Readers: Everyone
Keywords: Dialog Structure Induction, NeuPSL, PSL, DD-VRNN
Abstract: Latent structure induction from task-oriented dialogs can be made more robust and data-efficient by injecting symbolic knowledge into the neural learning process. We introduce Neural Probabilistic Soft Logic Dialogue Structure Induction (NeuPSL DSI), a general and principled approach that injects symbolic knowledge into the latent space of a neural generative model via the Probabilistic Soft Logic (PSL) formalism while allowing end-to-end gradient training. We conduct a thorough empirical investigation of the effect of NeuPSL DSI learning on the representation quality, few-shot learning, and out-of-domain generalization performance of the neural network. Across three simulated and real-world dialog structure induction benchmarks, and in both unsupervised and semi-supervised settings for standard and cross-domain generalization, injecting symbolic knowledge with NeuPSL DSI provides a consistent boost in performance over the canonical baselines.
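To make the abstract's mechanism concrete, the sketch below illustrates one way a commonsense rule could be relaxed with Łukasiewicz soft logic into a differentiable penalty added to the generative model's training loss. The specific rule ("the first turn is a greeting"), the variable names, and the loss weighting are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch (not the authors' code): relaxing a PSL rule with
# Lukasiewicz logic so its "distance to satisfaction" becomes a
# differentiable term that can be trained jointly with a neural model.
import torch

def lukasiewicz_implication(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Soft truth value of the rule a -> b under Lukasiewicz logic: min(1, 1 - a + b)."""
    return torch.clamp(1.0 - a + b, max=1.0)

def psl_rule_loss(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Distance to satisfaction of a -> b, i.e. 1 - truth = max(0, a - b)."""
    return torch.clamp(a - b, min=0.0).mean()

# Hypothetical example: soft latent-state posteriors for a batch of
# first utterances, with a commonsense rule "first turn -> greet state".
state_probs = torch.softmax(torch.randn(8, 5), dim=-1)  # 8 utterances, 5 latent states
greet_prob = state_probs[:, 0]                          # assume state 0 means "greet"
is_first_turn = torch.ones(8)                           # all utterances here open a dialog

constraint_loss = psl_rule_loss(is_first_turn, greet_prob)
# total_loss = neural_generative_loss + lambda_psl * constraint_loss  # end-to-end gradients
```

Because the penalty is piecewise linear in the soft truth values, gradients flow from the constraint back into the latent-state posteriors, which is the general property the PSL-based injection relies on.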
Paper Type: long
Research Area: Machine Learning for NLP