feelconstantfear/gemma-2-2b-it-abliterated-GPTQ-4bit
Base model: gemma-2-2b-it-abliterated by IlyaGusev
Quantization notes
Quantized to 4-bit with GPTQ using the C4 calibration dataset.
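The card does not state the exact GPTQ settings (group size, activation order, number of calibration samples), so the following is only a minimal sketch of how a 4-bit GPTQ pass over C4 could be reproduced with transformers' GPTQConfig. It assumes optimum and an auto-gptq backend are installed; everything beyond bits=4 and dataset="c4" is left at library defaults and may differ from what was actually used.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, GPTQConfig

base_id = "IlyaGusev/gemma-2-2b-it-abliterated"

tokenizer = AutoTokenizer.from_pretrained(base_id)

# bits=4 and dataset="c4" mirror this card; all other settings are
# library defaults and are assumptions about the original run.
quant_config = GPTQConfig(bits=4, dataset="c4", tokenizer=tokenizer)

# Passing a quantization_config makes from_pretrained run the
# calibration and quantization pass while loading the base weights.
model = AutoModelForCausalLM.from_pretrained(
    base_id,
    device_map="auto",
    quantization_config=quant_config,
)

model.save_pretrained("gemma-2-2b-it-abliterated-GPTQ-4bit")
tokenizer.save_pretrained("gemma-2-2b-it-abliterated-GPTQ-4bit")
```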
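Usage
A minimal local loading and generation sketch with transformers, assuming optimum and an auto-gptq backend are installed and a GPU is available; the prompt is only a placeholder.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "feelconstantfear/gemma-2-2b-it-abliterated-GPTQ-4bit"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The GPTQ quantization config stored in the repo is picked up automatically.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Write a short haiku about quantization."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=64)
# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0, input_ids.shape[-1]:], skip_special_tokens=True))
```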