---
license: cc-by-nc-3.0
language:
- da
pipeline_tag: fill-mask
tags:
- bert
- danish
widget:
- text: Hvide blodlegemer beskytter kroppen mod [MASK]
---

# Danish medical BERT

MeDa-BERT was initialized with weights from a pretrained [Danish BERT model](https://huggingface.co/Maltehb/danish-bert-botxo) and then fine-tuned for 48 epochs on a Danish medical corpus of 123M tokens using the masked language modeling (MLM) objective.
|
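As an illustration, continued pretraining with the MLM objective can be set up along these lines with the 🤗 Trainer API. This is a minimal sketch rather than the authors' actual training script: `tokenized_corpus` is a placeholder for the tokenized medical corpus, and every hyperparameter except the epoch count is an assumption.

```python
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Start from the general-domain Danish BERT checkpoint
tokenizer = AutoTokenizer.from_pretrained("Maltehb/danish-bert-botxo")
model = AutoModelForMaskedLM.from_pretrained("Maltehb/danish-bert-botxo")

# The MLM objective: randomly mask tokens and train the model to predict them
# (the collator's default masking probability of 15% is assumed here)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True)

# `tokenized_corpus` is a hypothetical placeholder for the tokenized corpus
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="meda-bert", num_train_epochs=48),
    data_collator=collator,
    train_dataset=tokenized_corpus,
)
trainer.train()
```
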
The development of the corpus and model is described further in the paper:

Here is an example of how to load the model in PyTorch using the [🤗Transformers](https://github.com/huggingface/transformers) library:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Download the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("jannikskytt/MeDa-Bert")
model = AutoModelForMaskedLM.from_pretrained("jannikskytt/MeDa-Bert")
```
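
Since the model is tagged for fill-mask, it can also be queried through the 🤗 pipeline API, here with the example sentence from the widget above (a minimal sketch; the returned predictions depend on the model):

```python
from transformers import pipeline

# Build a fill-mask pipeline from the same checkpoint
fill_mask = pipeline("fill-mask", model="jannikskytt/MeDa-Bert")

# Widget sentence from the metadata above
# ("White blood cells protect the body against [MASK]");
# returns the top candidate tokens for the [MASK] position with their scores
for prediction in fill_mask("Hvide blodlegemer beskytter kroppen mod [MASK]"):
    print(prediction["token_str"], prediction["score"])
```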