Configuration Parsing
Warning: in config.json, "quantization_config.bits" must be an integer.
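This warning plausibly stems from the 3.5 bpw EXL3 quantization: a fractional `bits` value in `quantization_config` would fail a strict integer check. A minimal sketch of such a check (the config contents here are assumptions inferred from the model name, not the actual file):

```python
import json

# Hypothetical config.json fragment for a 3.5 bpw quant (assumed values).
raw = '{"quantization_config": {"bits": 3.5}}'
config = json.loads(raw)

bits = config["quantization_config"]["bits"]
# bool is a subclass of int in Python, so exclude it explicitly.
valid = isinstance(bits, int) and not isinstance(bits, bool)
if not valid:
    print(f'"quantization_config.bits" must be an integer, got {bits!r}')
```

With `bits` set to `3.5`, the check fails and the warning fires; a whole-number quant (e.g. `4`) would pass.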
Downloads last month: 7
Model tree for allura-quants/allura-org_Q3-30b-A3b-Pentiment_EXL3_3.5bpw_H6
- Base model: allura-forge/q3-30b-rc1
- Finetuned from: allura-org/Q3-30b-A3b-Pentiment