# nllb-200-distilled-1.3B-ct2-int8

This is an int8-quantised CTranslate2 conversion of `facebook/nllb-200-distilled-1.3B`. It is used in nllb-api.

## Generation

The model was generated with the following command:

```shell
ct2-transformers-converter --model facebook/nllb-200-distilled-1.3B --quantization int8 --output_dir converted/nllb-200-distilled-1.3B-ct2-int8
```