Class Expression Learning with Permutation-Invariant Embeddings

Anonymous

04 Oct 2022 (modified: 05 May 2023) | Submitted to nCSI WS @ NeurIPS 2022
Keywords: Description Logic, Class Expression Learning, Permutation-invariant Representations
TL;DR: We propose a permutation-invariant neural model to learn description logic concept expressions from input example sets and an RDF knowledge base.
Abstract: Class expression learning deals with learning description logic concepts from an RDF knowledge base and input examples. The goal is to learn a concept that covers all positive examples while not covering any negative examples. Although state-of-the-art models have been successfully applied to this problem, their large-scale application has been severely hindered by impractical runtimes. Arguably, this limitation stems from their need to explore numerous expressions. Here, we investigate a remedy for this limitation. We formulate class expression learning as a multi-label classification problem and propose a permutation-invariant embedding model (Nero) to reduce the rate of exploration. For a given learning problem, i.e., a set of input examples, Nero accurately predicts the quality of pre-selected description logic concepts. By ranking concepts in descending order of predicted quality, the standard search procedure can start in multiple advantageous regions of the quasi-ordered search space. Our experiments on 5 benchmark datasets with 770 learning problems suggest that using Nero leads to significant improvements (p-value < 1%) in the number of explored expressions and the total runtime.
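To illustrate the permutation-invariant, multi-label formulation described in the abstract, here is a minimal Deep Sets-style sketch. The class name `SetScorer`, the mean pooling, the dimensions, and the way positive and negative example sets are combined are illustrative assumptions, not the paper's actual Nero architecture, which is only characterized here as permutation-invariant over the input examples.

```python
# Hypothetical sketch: a permutation-invariant scorer that maps a set of
# example embeddings to quality predictions for pre-selected concepts.
import torch
import torch.nn as nn


class SetScorer(nn.Module):
    """Predicts a quality score per pre-selected concept (multi-label)."""

    def __init__(self, embed_dim: int, hidden_dim: int, num_concepts: int):
        super().__init__()
        # phi: applied to each example embedding independently
        self.phi = nn.Sequential(nn.Linear(embed_dim, hidden_dim), nn.ReLU())
        # rho: applied to the pooled, order-invariant set representation
        self.rho = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_concepts),
        )

    def forward(self, pos: torch.Tensor, neg: torch.Tensor) -> torch.Tensor:
        # pos: (num_pos, embed_dim), neg: (num_neg, embed_dim)
        # Mean pooling makes the output invariant to example ordering.
        pooled = torch.cat(
            [self.phi(pos).mean(dim=0), self.phi(neg).mean(dim=0)]
        )
        return torch.sigmoid(self.rho(pooled))  # predicted quality per concept


# Usage: rank pre-selected concepts by predicted quality, then start the
# standard search from the top-ranked concepts.
scorer = SetScorer(embed_dim=32, hidden_dim=64, num_concepts=100)
pos = torch.randn(5, 32)   # placeholder embeddings of positive examples
neg = torch.randn(7, 32)   # placeholder embeddings of negative examples
scores = scorer(pos, neg)
top_concepts = torch.argsort(scores, descending=True)[:10]
```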