Some of my own quants (see the download and loading sketches below):

  • StableBeluga-13B_Q5_1_4K.gguf
  • StableBeluga-13B_Q5_1_8K.gguf

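To fetch one of these files programmatically, here is a minimal sketch using `huggingface_hub`; the repo id `your-username/StableBeluga-13B-GGUF` is a placeholder, substitute this repository's actual id on the Hub.

```python
from huggingface_hub import hf_hub_download

# Repo id below is a placeholder; point it at this repository on the Hub.
gguf_path = hf_hub_download(
    repo_id="your-username/StableBeluga-13B-GGUF",
    filename="StableBeluga-13B_Q5_1_4K.gguf",
)
print(gguf_path)  # local cache path of the downloaded GGUF file
```
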
Source: stabilityai

Source Model: StableBeluga-13B

Format: GGUF
Model size: 13B params
Architecture: llama
Quantization: 5-bit (Q5_1)
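
Since the architecture is llama, these files should load with any recent llama.cpp build. Below is a minimal loading sketch with llama-cpp-python, assuming a local path to the 4K-context file (use `n_ctx=8192` for the `_8K` variant); the Orca-style prompt layout follows the upstream StableBeluga card and is shown here as an assumption.

```python
from llama_cpp import Llama

# Path is an assumption: point it at wherever the GGUF file was downloaded.
llm = Llama(
    model_path="./StableBeluga-13B_Q5_1_4K.gguf",
    n_ctx=4096,      # use 8192 with the _8K variant
    n_gpu_layers=0,  # raise this to offload layers on a GPU-enabled llama.cpp build
)

# Orca-style prompt format documented for StableBeluga (assumed here).
prompt = (
    "### System:\nYou are StableBeluga, a helpful assistant.\n\n"
    "### User:\nSummarize what a GGUF file is in one sentence.\n\n"
    "### Assistant:\n"
)

out = llm(prompt, max_tokens=128, stop=["### User:"])
print(out["choices"][0]["text"])
```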