This is an Exl2 quantized version of Pygmalion-2-13b-SuperCOT.

Please refer to the original creator for more information.

Branches (see the download sketch after this list):

  • main: 4 bits per weight
  • 5.0bpw: 5 bits per weight
  • 6.0bpw: 6 bits per weight
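
Each quantization lives on its own git branch of this repository, so you select a variant by revision rather than by file name. Below is a minimal sketch, assuming the `huggingface_hub` Python package is installed; the repo id and branch names come from this page, and the chosen branch is just an example.

```python
# Minimal sketch: fetch one bpw variant of this repo by branch name.
from huggingface_hub import snapshot_download

# Each bits-per-weight variant is a separate branch, selected via `revision`.
# Use "main" for 4 bpw, "5.0bpw" for 5 bpw, or "6.0bpw" for 6 bpw.
local_dir = snapshot_download(
    repo_id="royallab/Pygmalion-2-13b-SuperCOT-exl2",
    revision="6.0bpw",  # example choice; swap for the branch you want
)
print(local_dir)  # path to the downloaded model files
```

The downloaded directory can then be pointed at by any Exl2-compatible loader.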