Dynamic Programming in Rank Space: Scaling Structured Inference with Low-Rank HMMs and PCFGs

Anonymous

08 Mar 2022 (modified: 05 May 2023) · NAACL 2022 Conference Blind Submission · Readers: Everyone
Paper Link: https://openreview.net/forum?id=KBpfIEHa9Th
Paper Type: Long paper (up to eight pages of content + unlimited references and appendices)
Abstract: Hidden Markov Models (HMMs) and Probabilistic Context-Free Grammars (PCFGs) are widely used structured models, both of which can be represented as factor graph grammars (FGGs), a powerful formalism capable of describing a wide range of models. Recent research has found it beneficial to use large state spaces for HMMs and PCFGs. However, inference with large state spaces is computationally demanding, especially for PCFGs. To tackle this challenge, we leverage tensor rank decomposition (a.k.a. CPD) to reduce the computational complexity of inference for a subset of FGGs subsuming HMMs and PCFGs. We apply CPD to the factors of an FGG and then construct a new FGG defined in the rank space. Inference with the new FGG produces the same result but has a lower time complexity when the rank size is smaller than the state size. We conduct experiments on HMM language modeling and unsupervised PCFG parsing, showing better performance than previous work. Our code is publicly available at https://github.com/VPeterV/RankSpace-Models.
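The complexity reduction described in the abstract can be illustrated on the HMM forward algorithm alone (this is a minimal NumPy sketch of the rank-space idea, not the paper's FGG construction; all names and sizes below are illustrative assumptions). If the n-by-n transition matrix factors as A = U Vᵀ with rank r < n, then contracting the forward vector with U first and V second replaces each O(n²) step with an O(nr) one, while yielding the same result:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, T = 64, 8, 10  # states, rank, sequence length (illustrative sizes)

# Low-rank transition matrix A = U @ V.T, rescaled so its rows sum to 1.
U = rng.random((n, r))
V = rng.random((n, r))
row_sums = U @ V.sum(axis=0)        # row sums of U @ V.T, computed in O(nr)
U = U / row_sums[:, None]           # fold the normalizer into U
A = U @ V.T                         # now a valid (row-stochastic) transition matrix

pi = np.full(n, 1.0 / n)            # uniform initial state distribution
B = rng.random((T, n))              # emission likelihoods p(x_t | state) per step

# Standard forward pass: each step costs O(n^2).
alpha = pi * B[0]
for t in range(1, T):
    alpha = (alpha @ A) * B[t]
Z_full = alpha.sum()

# Rank-space forward pass: contract with U, then V.T -- O(n r) per step.
alpha = pi * B[0]
for t in range(1, T):
    alpha = ((alpha @ U) @ V.T) * B[t]
Z_rank = alpha.sum()

assert np.allclose(Z_full, Z_rank)  # identical marginal likelihood
```

The intermediate vector `alpha @ U` lives in the r-dimensional rank space, which is the sense in which the dynamic program runs "in rank space"; the paper generalizes this contraction reordering from HMMs to a class of FGGs that also covers PCFG inside computation.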
Presentation Mode: This paper will be presented virtually
Virtual Presentation Timezone: UTC-8
Copyright Consent Signature (type Name Or NA If Not Transferrable): Songlin Yang
Copyright Consent Name And Address: ShanghaiTech University, No.393 Huaxia Middle Road Pudong, Shanghai, China
