The Research Core Dataset (KDSF) in VIVO

Published: 05 Jun 2019, Last Modified: 05 May 2023 · VIVO 2019 · Readers: Everyone
Keywords: reporting, data model, alignment
TL;DR: Describes the developments and adaptations necessary for using the KDSF in VIVO with regard to ontology, infrastructure and software.
Abstract: In this poster we present our activities aimed at using the Research Core Dataset (KDSF) - a national German standard for reporting - in the VIVO context. In recent years, German research institutions have shown constant interest in implementing the KDSF in various types of Current Research Information Systems (CRIS). At the TIB, a non-public VIVO instance for KDSF-compliant reporting is being developed. The activities around KDSF in VIVO cover the alignment of the KDSF and VIVO data models, the implementation of additional datasets to meet KDSF requirements, the definition of data entry workflows, and the development of a reporting component. The data models were aligned for both data input and data export, reusing the VIVO ontology as far as possible. Furthermore, KDSF-compliant data recording requires annotating entities with subjects from the classification of the German Federal Office of Statistics. To enable the use of this classification in VIVO, we converted it into a SKOS concept scheme and made it available on a Skosmos server, readable for both humans and machines. Due to the heterogeneity of institutional data sources and formats, a number of individually customized workflows for automated data ingest and update are necessary. The reporting component - the Vitro Query Tool - allows reusing, sharing and scheduling SPARQL queries for reporting.
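
A minimal sketch of how a subject classification such as that of the German Federal Office of Statistics could be converted into a SKOS concept scheme and serialized as Turtle for loading into Skosmos, using Python and rdflib. The base URI, the two example subjects and the output filename are illustrative assumptions, not the identifiers actually used in the TIB instance.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

# Hypothetical base URI for the subject classification; the real scheme URI
# used at the TIB is not given in the abstract.
BASE = Namespace("https://example.org/destatis-classification/")

g = Graph()
g.bind("skos", SKOS)

# The concept scheme representing the classification as a whole.
scheme = BASE["scheme"]
g.add((scheme, RDF.type, SKOS.ConceptScheme))
g.add((scheme, SKOS.prefLabel, Literal("Fächersystematik (Destatis)", lang="de")))

# Illustrative entries only; the actual classification contains many more subjects.
subjects = {
    "31": "Mathematik",
    "40": "Informatik",
}

for notation, label in subjects.items():
    concept = BASE[notation]
    g.add((concept, RDF.type, SKOS.Concept))
    g.add((concept, SKOS.notation, Literal(notation)))
    g.add((concept, SKOS.prefLabel, Literal(label, lang="de")))
    g.add((concept, SKOS.inScheme, scheme))
    g.add((concept, SKOS.topConceptOf, scheme))
    g.add((scheme, SKOS.hasTopConcept, concept))

# Serialize as Turtle, a format Skosmos can load and then serve to both
# human users and machine clients.
g.serialize(destination="destatis_classification.ttl", format="turtle")
```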
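
For the reporting side, a hedged example of the kind of stored SPARQL query such a reporting component might run against a VIVO SPARQL endpoint, issued here with Python and SPARQLWrapper. The endpoint URL is a placeholder, and the properties used for publication dates depend on how the KDSF alignment is realized in the local instance; this is not the Vitro Query Tool itself.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Placeholder endpoint; the actual URL and authentication depend on the
# local VIVO installation.
ENDPOINT = "https://vivo.example.org/api/sparqlQuery"

# Illustrative reporting query: publications per year using VIVO/BIBO terms.
QUERY = """
PREFIX bibo: <http://purl.org/ontology/bibo/>
PREFIX vivo: <http://vivoweb.org/ontology/core#>

SELECT ?year (COUNT(?pub) AS ?publications)
WHERE {
  ?pub a bibo:Document ;
       vivo:dateTimeValue ?dtv .
  ?dtv vivo:dateTime ?date .
  BIND (SUBSTR(STR(?date), 1, 4) AS ?year)
}
GROUP BY ?year
ORDER BY ?year
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["year"]["value"], row["publications"]["value"])
```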
ORCID: http://orcid.org/0000-0001-8127-2988
Submission Type: poster