Does not work in LM Studio
Good Model, I am eager to try it!
But got some error:
Failed to load model
error loading model: llama_model_loader: failed to load model from ...../lm_studio/models/microsoft/bitnet-b1.58-2B-4T-gguf/ggml-model-i2_s.gguf
Mac M4, Sequoia
Metal llama.cpp v1.26.0
LM Studio MLX v0.13.2
Same here. Also Mac M4 32GB Sequoia. No problems with loading any other model into LM Studio.
Same. Even if the GPU Offload option is set to 0, the model load fails.
MacBook Pro M3 MAX 64GB Sequoia.
Metal llama.cpp v1.27.1 (Beta)
LM Studio MLX v0.13.2
Same. Windows 11, 16 GB RAM. LM Studio 0.3.14
🥲 Failed to load the model
Failed to load model
error loading model: llama_model_loader: failed to load model from D:\AI\Modules\publisher\model\bitnet\bitnet.gguf
Same in Ollama
ollama run hf.co/microsoft/bitnet-b1.58-2B-4T-gguf
Error: unable to load model: C:\Users\User\.ollama\models\blobs\sha256-4221b252fdd5fd25e15847adfeb5ee88886506ba50b8a34548374492884c2162
Same problem on my Mac with a Max chip.
I also have this problem in LM Studio.
Same in LM Studio 0.3.15.
Same in Ollama 0.5.4.
Hello, with LM Studio 0.3.15 I have the same error on my MacBook Pro M1 / 16 GB / Sequoia.
Does a BitNet runtime exist?
🥲 Failed to load the model
error loading model: llama_model_loader: failed to load model from /Users/********/.cache/lm-studio/models/microsoft/bitnet-b1.58-2B-4T-gguf/ggml-model-i2_s.gguf
I get the same error, just updated to LM Studio 0.3.16 (Build 8).
I'm curious if anyone is working on this issue? Would love to try out the model.
Same problem.
LM Studio version is 0.3.16.
Error:
🥲 Failed to load the model
Failed to load model
error loading model: llama_model_loader: failed to load model from ***\LM Studio Models\microsoft\bitnet-b1.58-2B-4T-gguf\ggml-model-i2_s.gguf
System: Windows 11 24H2
Intel Core i5-12400f
RTX 4070
RAM 32 GB
Hi all, I am not a dev, but I can answer this: BitNet runs on bitnet.cpp, which is not compatible with llama.cpp. Since LM Studio and Ollama are both built on llama.cpp, the model cannot load in any of these conventional LLM runners. Would love to see if anybody can find a workaround.
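For anyone who wants to try the model now, a rough sketch of running it with the bitnet.cpp runtime instead, based on the microsoft/BitNet GitHub repo (directory names and script flags are assumptions from its README and may differ in your checkout):

```shell
# Sketch: run BitNet b1.58 2B-4T with bitnet.cpp (not llama.cpp).
# Assumes Python, CMake, and clang are already installed.
git clone --recursive https://github.com/microsoft/BitNet.git
cd BitNet
pip install -r requirements.txt

# Download the GGUF weights from Hugging Face into a local model dir
huggingface-cli download microsoft/BitNet-b1.58-2B-4T-gguf \
  --local-dir models/BitNet-b1.58-2B-4T

# Build the runtime with the i2_s kernel matching this quantization
python setup_env.py -md models/BitNet-b1.58-2B-4T -q i2_s

# Run inference in conversational mode
python run_inference.py \
  -m models/BitNet-b1.58-2B-4T/ggml-model-i2_s.gguf \
  -p "You are a helpful assistant" \
  -cnv
```

This is the same ggml-model-i2_s.gguf file that fails in LM Studio; it only loads because bitnet.cpp ships the i2_s kernels that stock llama.cpp builds lack.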