Learning Symbolic Representations Through Joint GEnerative and DIscriminative Training (Extended Abstract)

Published: 16 Jun 2023, Last Modified: 19 Jun 2023 · IJCAI 2023 Workshop KBCG Poster
Keywords: Self-supervised learning, energy-based models, neuro-symbolic learning
TL;DR: A unified framework for self-supervised and likelihood-based generative models easily extendable to neuro-symbolic learning
Abstract: We introduce GEDI, a Bayesian framework that combines existing self-supervised learning objectives with likelihood-based generative models. This framework leverages the benefits of both GEnerative and DIscriminative approaches, resulting in improved symbolic representations over standalone solutions. Additionally, GEDI can be easily integrated and trained jointly with existing neuro-symbolic frameworks without the need for additional supervision or costly pre-training steps. We demonstrate through experiments on real-world data, including SVHN, CIFAR10, and CIFAR100, that GEDI outperforms existing self-supervised learning strategies in terms of clustering performance by a significant margin. The symbolic component further allows it to leverage knowledge in the form of logical constraints to improve performance in the small data regime and to overcome the problem of representational collapse.
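The core idea of joint GEnerative and DIscriminative training can be illustrated with a toy objective that sums a self-supervised (discriminative) term and a likelihood-based (generative) term over a shared encoder. This is a minimal sketch under stated assumptions, not the authors' implementation: the linear encoder, the cross-entropy-between-views discriminative loss, and the tied-weight Gaussian reconstruction term are all hypothetical stand-ins for the components GEDI combines.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two views of the same points (a hypothetical stand-in for
# augmented images used in self-supervised training).
x = rng.normal(size=(64, 8))
x_aug = x + 0.05 * rng.normal(size=x.shape)

# Shared encoder weights (illustrative linear encoder, 4 soft clusters).
W = rng.normal(scale=0.1, size=(8, 4))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def joint_loss(W, x, x_aug):
    # Discriminative (self-supervised) term: cross-entropy pushing the two
    # views toward the same soft cluster assignment.
    p = softmax(x @ W)
    q = softmax(x_aug @ W)
    l_disc = -np.mean(np.sum(q * np.log(p + 1e-9), axis=1))
    # Generative term: negative log-likelihood of the data under a
    # unit-variance Gaussian centred at a linear decoding of the assignment
    # (tied-weight decoder, illustrative only).
    x_hat = p @ W.T
    l_gen = 0.5 * np.mean(np.sum((x - x_hat) ** 2, axis=1))
    # The two terms are summed and optimised jointly, end to end.
    return l_disc + l_gen

loss = joint_loss(W, x, x_aug)
print(float(loss))
```

In the actual framework the discriminative term would be an existing self-supervised objective and the generative term a full likelihood-based model; the sketch only shows how the two gradients flow through one shared representation.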