# itzmealvin/MLX_Qwen2VL-2B-IT-Q8
This model was converted to MLX format from [erax-ai/EraX-VL-2B-V1.5](https://huggingface.co/erax-ai/EraX-VL-2B-V1.5) using mlx-vlm version 0.1.23. Refer to the original model card for more details on the model.
## Use with mlx

```bash
pip install -U mlx-vlm
```

```bash
python -m mlx_vlm.generate --model itzmealvin/MLX_Qwen2VL-2B-IT-Q8 --max-tokens 100 --temperature 0.0 --prompt "Describe this image." --image <path_to_image>
```
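The same generation can be done from Python instead of the CLI. The sketch below follows the usage pattern from the mlx-vlm project README and mirrors the CLI flags above (`--max-tokens 100 --temperature 0.0`); the exact `generate` signature may differ between mlx-vlm versions, and running it requires Apple silicon plus a download of the model weights, so treat this as an illustrative sketch rather than a verified snippet.

```python
# Sketch of programmatic use with mlx-vlm (requires Apple silicon and
# mlx-vlm installed; API details may vary by mlx-vlm version).
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

model_path = "itzmealvin/MLX_Qwen2VL-2B-IT-Q8"

# Load the quantized model, its processor, and its config from the Hub.
model, processor = load(model_path)
config = load_config(model_path)

# Build a chat-formatted prompt that reserves a slot for one image.
prompt = apply_chat_template(processor, config, "Describe this image.", num_images=1)

# Generate a description, matching the CLI example's settings.
output = generate(
    model,
    processor,
    prompt,
    ["path/to/image.jpg"],  # hypothetical image path; replace with your own
    max_tokens=100,
    temperature=0.0,
)
print(output)
```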