Model tree for altomek/magnum-v2-4b-8bpw-EXL2: this repository is one of 11 quantized variants in the model tree.