# ONNX version of google/madlad400-3b-mt

Converted and quantized with `optimum-cli`:
- Convert to ONNX:

  ```shell
  optimum-cli export onnx --model google/madlad400-3b-mt <output_path> --legacy
  ```

- Quantize:

  ```shell
  optimum-cli onnxruntime quantize --onnx_model <input_model_path> -o <output_model_path> --avx512_vnni
  ```
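The exported model can then be loaded for inference through Optimum's ONNX Runtime integration. The sketch below is a minimal example, assuming this repo's model ID and the standard MADLAD-400 usage pattern (a target-language tag such as `<2de>` prefixed to the input); it downloads the model weights on first run.

```python
# Minimal inference sketch for the ONNX model via Optimum + ONNX Runtime.
# The model ID is this repo's; the <2xx> language-tag convention comes from
# the upstream MADLAD-400 model.
from optimum.onnxruntime import ORTModelForSeq2SeqLM
from transformers import AutoTokenizer

model_id = "ISoloist1/madlad400-3b-mt-onnx"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = ORTModelForSeq2SeqLM.from_pretrained(model_id)

# Prefix the source text with the target-language tag, e.g. <2de> for German.
inputs = tokenizer("<2de> How are you today?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```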
Model tree for ISoloist1/madlad400-3b-mt-onnx:

- Base model: google/madlad400-3b-mt