Contrastive Graph Representations for Logical Formulas Embedding (Extended Abstract)

Published: 01 Jan 2024, Last Modified: 20 May 2025 · ICDE 2024 · CC BY-SA 4.0
Abstract: Embedding symbolic logical formulas into a low-dimensional continuous space is an effective foundation for Neural-Symbolic systems. However, existing studies are constrained to modeling syntactic structure and fail to preserve intrinsic semantics. To this end, we propose a novel model, Contrastive Graph Representations (ConGR), for logical formula embedding. First, it introduces a densely connected graph convolutional network (GCN) with an attention mechanism to process the syntax parsing graphs of formulas. Second, contrastive instances for each anchor formula are generated by transformations guided by logical properties. Two types of contrast, global-local and global-global, are then carried out to refine formula embeddings with semantic information. Extensive experiments demonstrate that ConGR outperforms state-of-the-art baselines.
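The global-global contrast described above pairs each anchor formula with a logically equivalent transform and treats other formulas in the batch as negatives. The abstract does not specify the exact loss, so the following is a minimal sketch assuming a standard InfoNCE-style contrastive objective over already-computed graph embeddings; the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def normalize(x):
    """L2-normalize embeddings along the last axis."""
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def global_global_contrast(anchors, positives, temperature=0.5):
    """InfoNCE-style loss (an assumption; the paper's exact loss is unspecified).

    anchors:   (B, d) embeddings of anchor formulas
    positives: (B, d) embeddings of their logic-preserving transforms;
               row i of `positives` is the positive for row i of `anchors`,
               all other rows serve as in-batch negatives.
    """
    a = normalize(anchors)
    p = normalize(positives)
    logits = a @ p.T / temperature            # (B, B) scaled cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    # cross-entropy with the diagonal (matched pairs) as the correct class
    return -np.log(np.diag(probs)).mean()
```

When anchor and positive embeddings agree, the diagonal similarities dominate and the loss is small; mismatched pairs drive the loss toward log(B), which is what pushes semantically equivalent formulas together in the embedding space.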