MinpakuBERT: A Language Model for Understanding Cultural Properties in Museums

Published: 01 Jan 2022, Last Modified: 07 Jun 2023 · IIAI-AAI 2022
Abstract: In this paper, we propose a BERT model for understanding cultural properties in museums. BERT is a language model widely used in natural language processing; it is generally pre-trained on a large corpus and then fine-tuned for a specific task. Recently, additional pre-training of BERT on domain-specific text has been reported to improve fine-tuning performance. We therefore propose MinpakuBERT, a BERT model for understanding cultural properties in museums, built by additionally training a pre-trained BERT model on document data from the National Museum of Ethnology. To evaluate MinpakuBERT, we fine-tuned both it and the original pre-trained BERT on two classification tasks related to cultural properties: predicting the OCM (Outline of Cultural Materials) label, which describes an object's function, and the OWC (Outline of World Cultures) label, which describes its region. MinpakuBERT outperformed the pre-trained BERT model on both tasks, and its fine-tuning also converged faster. MinpakuBERT has been published on a model-sharing website and is available as a language resource for anyone to use.
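
The abstract does not specify the training framework, but the two-stage recipe it describes (additional masked-language-model training on museum documents, then task-specific fine-tuning) can be sketched with the Hugging Face transformers library. The base model name, corpus, output paths, label count, and hyperparameters below are all illustrative assumptions, not values from the paper.

    # Sketch of the two-stage recipe: (1) additional MLM training of a
    # pre-trained Japanese BERT on museum documents, (2) fine-tuning the
    # adapted checkpoint for OCM label classification.
    # Assumes: pip install transformers torch (plus fugashi/unidic-lite
    # for the assumed Japanese tokenizer).
    import torch
    from transformers import (
        AutoTokenizer,
        AutoModelForMaskedLM,
        AutoModelForSequenceClassification,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )

    BASE = "cl-tohoku/bert-base-japanese"  # assumed base model, not confirmed by the paper
    tokenizer = AutoTokenizer.from_pretrained(BASE)

    class TextDataset(torch.utils.data.Dataset):
        """Wraps raw document strings as fixed-length token tensors."""
        def __init__(self, texts, max_length=128):
            self.enc = tokenizer(texts, truncation=True,
                                 padding="max_length", max_length=max_length)
        def __len__(self):
            return len(self.enc["input_ids"])
        def __getitem__(self, i):
            return {k: torch.tensor(v[i]) for k, v in self.enc.items()}

    # Stage 1: additional pre-training with the standard MLM objective.
    # A toy placeholder corpus stands in for the museum document data.
    museum_texts = ["Placeholder museum document text.", "Another document."]
    mlm_model = AutoModelForMaskedLM.from_pretrained(BASE)
    trainer = Trainer(
        model=mlm_model,
        args=TrainingArguments(output_dir="minpaku-bert",
                               num_train_epochs=1,
                               per_device_train_batch_size=8),
        train_dataset=TextDataset(museum_texts),
        # The collator randomly masks 15% of tokens and builds MLM labels.
        data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer,
                                                      mlm_probability=0.15),
    )
    trainer.train()
    trainer.save_model("minpaku-bert")

    # Stage 2: load the adapted checkpoint with a classification head and
    # fine-tune it on OCM-labeled examples in the same Trainer fashion.
    # num_labels is a placeholder; it must match the OCM label set used.
    clf = AutoModelForSequenceClassification.from_pretrained("minpaku-bert",
                                                             num_labels=79)

The same Stage 2 setup, with OWC labels substituted for OCM labels, would cover the second evaluation task; the baseline comparison simply repeats fine-tuning from BASE instead of the adapted checkpoint.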