A Scalable Technique for Weak-Supervised Learning with Domain Constraints

Published: 20 Oct 2022, Last Modified: 05 May 2023
HITY Workshop NeurIPS 2022
Readers: Everyone
Keywords: Neuro-Symbolic AI, Domain Constraints, Mathematical Optimization, MNIST Digit Classification
TL;DR: A scalable technique for learning a neural network to classify unlabeled data using domain knowledge as constraints. Evaluated on MNIST classification without labels but with a sum constraint, and shown to scale better than state-of-the-art approaches.
Abstract: We propose a novel scalable end-to-end pipeline that uses symbolic domain knowledge as constraints for learning a neural network that classifies unlabeled data in a weakly-supervised manner. Our approach is particularly well-suited for settings where the data consists of distinct groups (classes) that lend themselves to clustering-friendly representation learning, and where the domain constraints can be reformulated for efficient mathematical optimization by considering multiple training examples at once. We evaluate our approach on a variant of the MNIST image classification problem, in which a training example consists of image sequences and the sum of the numbers represented by those sequences, and show that our approach scales significantly better than previous approaches that rely on computing all constraint-satisfying combinations for each training example.
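To make the evaluation setup concrete, the sketch below illustrates learning from a sum constraint alone: a small CNN classifies each image in a sequence, and the only supervision is that the expected values of the predicted digit distributions should add up to the given sum. This is a minimal differentiable relaxation for illustration, not the authors' optimization-based pipeline; all names, shapes, and hyperparameters here are hypothetical.

```python
# Illustrative sketch of the MNIST-with-sum-constraint task. This uses a
# simple expected-sum penalty as the weak-supervision signal; it is NOT the
# paper's pipeline, which reformulates the constraints for mathematical
# optimization over multiple training examples at once.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DigitClassifier(nn.Module):
    """Small CNN producing logits over the 10 digit classes."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.fc = nn.Linear(32 * 7 * 7, 10)

    def forward(self, x):
        return self.fc(self.conv(x).flatten(1))  # logits over digits 0..9

def sum_constraint_loss(logits_seq, target_sum):
    """Weak-supervision loss: the expected values of the per-image digit
    distributions should add up to the given sum.
    logits_seq: (batch, seq_len, 10); target_sum: (batch,)."""
    probs = F.softmax(logits_seq, dim=-1)
    digits = torch.arange(10, dtype=probs.dtype, device=probs.device)
    expected_digits = (probs * digits).sum(-1)   # (batch, seq_len)
    expected_sum = expected_digits.sum(-1)       # (batch,)
    return F.mse_loss(expected_sum, target_sum.float())

# Usage on a dummy batch of 3-image sequences labeled only with their sums.
model = DigitClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.randn(8, 3, 1, 28, 28)            # (batch, seq_len, 1, 28, 28)
sums = torch.randint(0, 28, (8,))                # sum of three digits: 0..27
logits = model(images.flatten(0, 1)).view(8, 3, 10)
loss = sum_constraint_loss(logits, sums)
opt.zero_grad(); loss.backward(); opt.step()
```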
Supplementary Material: zip