Abstract: Highlights

- medBERT.de: a German BERT model tailored to the medical domain.
- Pre-trained on 4.7 million diverse German medical documents.
- Achieves new state-of-the-art results on eight medical NLP benchmarks.
- Efficient tokenization plays only a minor role; domain-specific training drives performance.
- Pre-trained model weights and new benchmarks released publicly for research.