🇩🇪 german-multilingual-e5-small

This model is a 66.0% smaller version of intfloat/multilingual-e5-small for the German language, created using the mtem-pruner Hugging Face Space.

This pruned model should perform similarly to the original model on German language tasks with a much smaller memory footprint. However, it may not perform well for other languages covered by the original multilingual model, since tokens not commonly used in German were removed from its vocabulary.

Usage

You can use this model with the Transformers library:

```python
from transformers import AutoModel, AutoTokenizer

model_name = "rhlsinghal1s/german-multilingual-e5-small"
model = AutoModel.from_pretrained(model_name, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True, use_fast=True)
```
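With the bare Transformers model you still need a pooling step to turn token embeddings into one sentence embedding. The upstream multilingual-e5 models use attention-mask-aware mean pooling and expect inputs prefixed with "query: " or "passage: "; the sketch below shows that pooling helper, with the actual model call left as comments since it requires downloading the checkpoint (the German example text is an illustration, not from the model card):

```python
import torch

def average_pool(last_hidden_states: torch.Tensor,
                 attention_mask: torch.Tensor) -> torch.Tensor:
    """Mean-pool token embeddings, ignoring padding positions."""
    # Zero out padded tokens, then average over the real (unmasked) tokens.
    masked = last_hidden_states.masked_fill(
        ~attention_mask[..., None].bool(), 0.0)
    return masked.sum(dim=1) / attention_mask.sum(dim=1)[..., None]

# With `model` and `tokenizer` loaded as above:
#   batch = tokenizer(["query: Wie spät ist es?"], padding=True,
#                     truncation=True, return_tensors="pt")
#   with torch.no_grad():
#       out = model(**batch)
#   emb = torch.nn.functional.normalize(
#       average_pool(out.last_hidden_state, batch["attention_mask"]), dim=-1)
```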

Or with the sentence-transformers library:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("rhlsinghal1s/german-multilingual-e5-small")
```
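For retrieval with sentence-transformers, a common pattern is to call `model.encode(..., normalize_embeddings=True)` on prefixed texts ("query: " / "passage: ") and rank passages by dot product, which equals cosine similarity on unit-norm rows. A minimal ranking sketch, using toy unit vectors in place of real embeddings so it runs without downloading the model (the helper name is an illustration, not part of the library):

```python
import numpy as np

def rank_passages(query_emb: np.ndarray, passage_embs: np.ndarray) -> np.ndarray:
    """Return passage indices sorted from most to least similar to the query."""
    # Rows are assumed L2-normalized, so the dot product is cosine similarity.
    scores = passage_embs @ query_emb
    return np.argsort(-scores)

# Toy unit vectors standing in for encode() output:
q = np.array([1.0, 0.0])
p = np.array([[0.6, 0.8],
              [1.0, 0.0]])
order = rank_passages(q, p)  # passage 1 scores 1.0, passage 0 scores 0.6
```

In real use, `q` would come from `model.encode("query: ...", normalize_embeddings=True)` and `p` from encoding the "passage: "-prefixed texts.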

Credits: cc @antoinelouis

Model size: 40M parameters (F32, Safetensors)