llava-v1.6-mistral-7b-hf-nf4 is a bitsandbytes NF4 (4-bit) quantization of llava-hf/llava-v1.6-mistral-7b-hf.
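
A minimal loading sketch with Transformers, assuming the pre-quantized NF4 weights in this repo load directly through the standard LLaVA-NeXT classes (requires `transformers`, `accelerate`, and `bitsandbytes`; the file name and prompt wording are placeholders):

```python
# Minimal sketch: load the pre-quantized NF4 checkpoint and run one caption.
import torch
from PIL import Image
from transformers import LlavaNextProcessor, LlavaNextForConditionalGeneration

model_id = "2dameneko/llava-v1.6-mistral-7b-hf-nf4"

processor = LlavaNextProcessor.from_pretrained(model_id)
model = LlavaNextForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Prompt format follows the Mistral-based LLaVA-NeXT template of the parent model.
image = Image.open("example.jpg")
prompt = "[INST] <image>\nDescribe this image. [/INST]"
inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)

output = model.generate(**inputs, max_new_tokens=128)
print(processor.decode(output[0], skip_special_tokens=True))
```
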
For batch processing, you can use ide-cap-chan.
All other features are inherited from the parent model.
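
For reference, an NF4 quant like this can be recreated from the parent model with a bitsandbytes config along the following lines. The exact settings used for this repo are not stated on the card, so treat the values below as assumptions:

```python
# Sketch of producing an NF4 quant from the parent model; the precise
# bitsandbytes settings used for this repo are an assumption.
import torch
from transformers import BitsAndBytesConfig, LlavaNextForConditionalGeneration

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

model = LlavaNextForConditionalGeneration.from_pretrained(
    "llava-hf/llava-v1.6-mistral-7b-hf",
    quantization_config=bnb_config,
    device_map="auto",
)

# Saving 4-bit weights requires a recent transformers/bitsandbytes.
model.save_pretrained("llava-v1.6-mistral-7b-hf-nf4")
```
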