Boosting the Performance of Generic Deep Neural Network Frameworks with Log-supermodular CRFs

Published: 31 Oct 2022, Last Modified: 22 Jan 2023
NeurIPS 2022 Accept
Readers: Everyone
Keywords: conditional random fields, log-supermodular, structured prediction
Abstract: Historically, conditional random fields (CRFs) were popular tools in a variety of application areas from computer vision to natural language processing, but due to their higher computational cost and weaker practical performance, they have, in many situations, fallen out of favor and been replaced by end-to-end deep neural network (DNN) solutions. More recently, combined DNN-CRF approaches have been considered, but their speed and practical performance still fall short of the best-performing pure DNN solutions. In this work, we present a generic combined approach in which a log-supermodular CRF acts as a regularizer to encourage similarity between outputs in a structured prediction task. We show that this combined approach is widely applicable, practical (it incurs only a moderate overhead on top of the base DNN solution) and, in some cases, it can rival carefully engineered pure DNN solutions for the same structured prediction task.
TL;DR: A novel framework using log-supermodular conditional random fields (CRFs) to smooth/boost the performance of existing deep neural network models in a variety of domains.
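To make the general pattern concrete, below is a minimal, hypothetical sketch (in PyTorch) of the kind of setup the abstract describes: a base DNN trained with its usual task loss plus an attractive, Potts-style pairwise penalty, a mean-field-style relaxation of a log-supermodular CRF energy that encourages neighboring outputs to agree. The 4-connected grid structure, the function names, and the trade-off weight lam are illustrative assumptions, not details taken from the paper.

# Hypothetical sketch, not the paper's exact method: a standard task loss
# augmented with a Potts-style smoothness penalty that plays the role of a
# log-supermodular CRF regularizer on a dense prediction task.
import torch
import torch.nn.functional as F

def crf_smoothness_penalty(logits: torch.Tensor) -> torch.Tensor:
    """Expected label disagreement between 4-connected neighbors.

    logits: (B, C, H, W) class scores from the base DNN.
    Returns a scalar that is small when adjacent pixels receive similar
    class distributions (an attractive pairwise energy under a
    mean-field relaxation).
    """
    probs = F.softmax(logits, dim=1)
    # Probability that horizontally / vertically adjacent pixels agree,
    # summed over classes.
    agree_h = (probs[..., :, 1:] * probs[..., :, :-1]).sum(dim=1)
    agree_v = (probs[..., 1:, :] * probs[..., :-1, :]).sum(dim=1)
    return (1.0 - agree_h).mean() + (1.0 - agree_v).mean()

def combined_loss(logits: torch.Tensor, targets: torch.Tensor, lam: float = 0.1) -> torch.Tensor:
    # Base task loss plus the CRF-style regularizer; lam is a
    # hypothetical trade-off weight, not a value from the paper.
    return F.cross_entropy(logits, targets) + lam * crf_smoothness_penalty(logits)

In a setup like this, lam trades off fidelity to the labels against smoothness of the outputs, and the penalty adds only a few elementwise operations per training step, which is consistent with the abstract's claim of moderate overhead on top of the base DNN.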
Supplementary Material: pdf
