Llama 4 not working with MLX VLM?

by leadangle

The model mlx-community/Llama-4-Scout-17B-16E-Instruct-4bit downloads fine and works for text inference. However, even when following the official example command, the latest release of mlx-vlm (0.1.21) does not recognize the "llama4" model type declared in this model's config.json, so every image-inference attempt fails with an error. In other words, image-based inference is currently unsupported for this model with the available mlx-vlm version.
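
For reference, this is roughly the image-inference attempt that fails; a minimal sketch following the usual mlx-vlm Python API (load / apply_chat_template / generate), with the image path as a placeholder:

```python
# Minimal sketch of the failing image-inference attempt via the usual
# mlx-vlm Python API; "test.jpg" is a placeholder image path.
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

model_path = "mlx-community/Llama-4-Scout-17B-16E-Instruct-4bit"

# With mlx-vlm 0.1.21, load() raises
# ValueError: Model type llama4 not supported.
# because there is no implementation for the "llama4" model_type
# found in the model's config.json.
model, processor = load(model_path)
config = load_config(model_path)

images = ["test.jpg"]
prompt = apply_chat_template(
    processor, config, "Describe this image.", num_images=len(images)
)
output = generate(model, processor, prompt, images, verbose=False)
print(output)
```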

Same issue here. I updated mlx-vlm to 0.1.21 in my LM Studio and also re-downloaded config.json from this repository, but I still get this error:

Error when loading model: ValueError: Model type llama4 not supported.
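
Re-downloading config.json won't help here: the error comes from the installed mlx-vlm itself, which resolves the model_type string in config.json to one of its bundled model implementations. A rough sketch of that dispatch (the exact mlx-vlm internals may differ between versions, so treat this as illustrative):

```python
# Illustrative sketch of the model_type dispatch that produces this error;
# the exact mlx-vlm internals may differ between versions.
import importlib

def get_model_module(model_type: str):
    try:
        # e.g. resolves "qwen2_vl" -> mlx_vlm.models.qwen2_vl
        return importlib.import_module(f"mlx_vlm.models.{model_type}")
    except ImportError:
        raise ValueError(f"Model type {model_type} not supported.")
```

So the fix has to come from the mlx-vlm side, i.e. a release that actually ships a llama4 implementation, not from changes to the downloaded model files.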
