Keywords: heuristic search algorithm, structured decoding, information extraction, key information extraction, text-to-table, permutation-based language modeling, flexible generation
TL;DR: A text-to-table framework with a permutation-based decoder that selects among all possible cell orderings to maximize model confidence, which not only boosts performance by up to 15%, but also speeds up inference fourfold without sacrificing quality.
Abstract: We present a permutation-based text-to-table neural framework that unifies diverse NLP tasks into table outputs. During training, the framework takes a probabilistic approach, maximizing the expected log-likelihood over all random permutations of the table-content factorization. At inference, we exploit the model's ability to generate cells in any order to reduce model uncertainty and minimize error propagation.
Our method accelerates inference by up to 4$\times$ on some datasets and improves text-to-table performance by up to 15\% over previous solutions, all while preserving output quality.
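The training objective described above (expected log-likelihood over permutations of the cell factorization) can be illustrated with a small sketch. All names and the toy conditional model below are hypothetical, not taken from the paper: each cell's predictability is assumed to grow with the number of cells already generated, and we average the resulting log-likelihood over every generation order.

```python
import itertools
import math

def cell_logprob(cell, generated):
    # Hypothetical conditional model (not the paper's): a cell becomes
    # easier to predict as more context cells are already generated.
    base = {"name": 0.6, "age": 0.5, "city": 0.4}[cell]
    return math.log(min(0.99, base + 0.1 * len(generated)))

def expected_loglik(cells):
    # Average the table's log-likelihood over all permutations of the
    # order in which its cells could be generated (the training objective
    # sketched in the abstract, here computed exactly for a tiny table).
    perms = list(itertools.permutations(cells))
    total = 0.0
    for order in perms:
        ll, generated = 0.0, []
        for c in order:
            ll += cell_logprob(c, generated)
            generated.append(c)
        total += ll
    return total / len(perms)

print(round(expected_loglik(["name", "age", "city"]), 4))  # → -1.5905
```

In practice a model would sample permutations rather than enumerate all of them, since the number of orderings grows factorially with table size; the exhaustive average here is only feasible for a toy table.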
Submission Number: 50