For the T2T task of the Workshop on Asian Translation (WAT) 2025, these are fine-tuned models built on NLLB-200-XB as the base model, trained on the WAT data plus 100k Samanantar sentence pairs.
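A minimal inference sketch for loading one of the fine-tuned checkpoints with the `transformers` library; the repo id and the English-to-Hindi language pair below are hypothetical placeholders, not names confirmed by this collection.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Hypothetical repo id -- substitute the actual checkpoint from the collection.
model_id = "DebasishDhal99/wat2025-nllb-finetuned"

# NLLB-style tokenizers take the source language code via src_lang.
tokenizer = AutoTokenizer.from_pretrained(model_id, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "The committee will meet next week."
inputs = tokenizer(text, return_tensors="pt")

# NLLB-200 models expect the target language code as the forced BOS token.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("hin_Deva"),
    max_length=128,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```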