# ModernGBERT 1B

ModernGBERT 1B is a German encoder language model trained from scratch with the ModernBERT codebase on the same German portion of RedPajama V2 used for our LLäMmlein family. Find more details in our preprint!

## Usage

```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/ModernGBERT_1B")
model = AutoModel.from_pretrained("LSX-UniWue/ModernGBERT_1B")
```
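Since `AutoModel` exposes the encoder's hidden states rather than a task head, a common way to obtain sentence embeddings is masked mean pooling over `last_hidden_state`. The sketch below is an illustration, not part of the official usage: the dummy tensors stand in for a real forward pass (`outputs = model(**tokenizer(texts, return_tensors="pt", padding=True))`), and the shapes are hypothetical.

```python
import torch

def mean_pool(last_hidden_state, attention_mask):
    # Zero out padding positions, then average over the sequence dimension.
    mask = attention_mask.unsqueeze(-1).float()
    summed = (last_hidden_state * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)
    return summed / counts

# Dummy tensors standing in for model(**inputs).last_hidden_state and the
# attention mask (batch=2, seq_len=4, hidden=8); the second sequence is padded.
hidden = torch.ones(2, 4, 8)
mask = torch.tensor([[1, 1, 1, 1], [1, 1, 1, 0]])
emb = mean_pool(hidden, mask)
print(emb.shape)  # torch.Size([2, 8])
```

The attention mask keeps padding tokens from diluting the average; applying the same pooling to the model's real outputs yields one fixed-size vector per input sentence.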

## Performance

We evaluated our model on the SuperGLEBer benchmark.

Model size: 1.13B parameters (F32, Safetensors)