medBERT.de: A comprehensive German BERT model for the medical domain

Published: 01 Jan 2024, Last Modified: 14 Oct 2025 · Expert Syst. Appl. 2024 · CC BY-SA 4.0
Abstract: Highlights
• medBERT.de: a German BERT model tailored to the medical domain.
• Pre-trained on 4.7 million diverse medical documents.
• Achieves new state-of-the-art results on eight medical NLP benchmarks.
• Efficient tokenization plays only a minor role; domain-specific pre-training drives the gains.
• Pre-trained model weights and new benchmarks are publicly released for research (see the sketch below).
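Since the highlights mention publicly released pre-trained weights, here is a minimal sketch of how such a checkpoint could be loaded and used with the Hugging Face transformers library. The model identifier below is a placeholder assumption, not the repository name confirmed by this abstract.

```python
# Minimal sketch: loading a released German medical BERT checkpoint with
# Hugging Face transformers. The repository name is a placeholder assumption;
# substitute the identifier published alongside the paper.
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "example-org/medbert-de"  # placeholder, not the confirmed repo name

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

# Encode a short German clinical sentence and obtain contextual embeddings.
inputs = tokenizer("Der Patient klagt über akute Thoraxschmerzen.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```

The resulting embeddings can then be fine-tuned for downstream tasks such as the medical classification or named-entity-recognition benchmarks referenced in the highlights.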