Keywords: SHACL, Ontology Quality, NEOntometrics, Data Quality
Abstract: Semantic Web technologies have transformed the processing and representation of data. Initially used for linking publicly available knowledge, they are now widely adopted in enterprise contexts. Enterprise knowledge graphs (KGs) often use the Shapes Constraint Language (SHACL) to validate data structure and completeness. SHACL constraints validate whether newly ingested data conforms to business and data rules, ensuring that data meets self-set standards and remains interoperable in the long term. However, these constraints can be complex and demanding to manage, as they evolve continuously to accommodate the variety and complexity of the data they validate. It is therefore crucial to ensure the quality of the constraints themselves. One way of measuring the quality of SHACL shapes is through ontology metrics, which translate the qualitative nature of ontologies into objective quantitative measurements. Various ontology metric frameworks have been published over the past few years. However, they typically target inference languages such as OWL and fail to address the validation specifics of SHACL. This paper fills this gap by presenting SHACLEval, an evaluation framework for SHACL. SHACLEval proposes measures that assess SHACL-specific language constructs. The novel metrics link the data strategy with relevant KPIs, enabling the detection of potential discrepancies between the KG strategy and its development execution. The approach is motivated by a Bosch use case and demonstrated on a public SHACL repository.
Submission Number: 7