# madlad400-finetuned-lag-swh
This model is a fine-tuned version of facebook/nllb-200-distilled-1.3B
for translation from Rangi to Swahili.
## Model details
- Developed by: SIL Global
- Finetuned from model: facebook/nllb-200-distilled-1.3B
- Model type: Translation
- Source language: Rangi (`lag`)
- Target language: Swahili (`swh`)
- License: closed/private
## Datasets
The model was trained on a parallel corpus of plain text files:
**Rangi:**
- Rangi New Testament
- License: All rights reserved, Wycliffe Bible Translators. Used with permission.
**Swahili:**
- Swahili back-translation of Rangi New Testament
- License: All rights reserved, Wycliffe Bible Translators. Used with permission.
## Framework versions
- PEFT 0.12.0
- Transformers 4.44.2
- PyTorch 2.4.1+cu124
- Datasets 2.21.0
- Tokenizers 0.19.1
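
Because PEFT appears in the framework list, the repository most likely hosts a lightweight adapter on top of the base checkpoint. As a sketch, assuming the repo contains a PEFT adapter rather than fully merged weights, you can also load it explicitly:

```python
from transformers import AutoModelForSeq2SeqLM
from peft import PeftModel

# Load the base NLLB checkpoint, then attach the fine-tuned adapter.
base = AutoModelForSeq2SeqLM.from_pretrained("facebook/nllb-200-distilled-1.3B")
model = PeftModel.from_pretrained(base, "sil-ai/madlad400-finetuned-lag-swh")

# Optionally merge the adapter into the base weights for faster inference.
model = model.merge_and_unload()
```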
## Usage
You can use this model with the `transformers` library like this:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("sil-ai/madlad400-finetuned-lag-swh")
model = AutoModelForSeq2SeqLM.from_pretrained("sil-ai/madlad400-finetuned-lag-swh")

# Tokenize the Rangi source text and generate the Swahili translation.
inputs = tokenizer("Your input text here", return_tensors="pt")
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
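
Because the base model is NLLB-200, translation quality usually depends on forcing the target-language token at the start of decoding. The snippet below is a sketch under the assumption that the fine-tune kept NLLB's standard language codes (`swh_Latn` for Swahili); verify against this repository's tokenizer config before relying on it.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("sil-ai/madlad400-finetuned-lag-swh")
model = AutoModelForSeq2SeqLM.from_pretrained("sil-ai/madlad400-finetuned-lag-swh")

inputs = tokenizer("Your Rangi input text here", return_tensors="pt")

# Force decoding to start with the Swahili language token. "swh_Latn" is
# NLLB-200's standard code; confirm it exists in this repo's tokenizer.
swh_id = tokenizer.convert_tokens_to_ids("swh_Latn")
outputs = model.generate(
    **inputs,
    forced_bos_token_id=swh_id,
    max_new_tokens=256,
    num_beams=5,  # beam search often helps for low-resource translation
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```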