---
base_model: allura-org/MS3.2-24b-Angel
base_model_relation: quantized
quantized_by: ArtusDev
library_name: transformers
tags:
  - axolotl
  - unsloth
  - roleplay
  - conversational
  - exl3
datasets:
  - PygmalionAI/PIPPA
  - Alfitaria/nemotron-ultra-reasoning-synthkink
  - PocketDoc/Dans-Prosemaxx-Gutenberg
  - FreedomIntelligence/Medical-R1-Distill-Data
  - cognitivecomputations/SystemChat-2.0
  - allenai/tulu-3-sft-personas-instruction-following
  - kalomaze/Opus_Instruct_25k
  - simplescaling/s1K-claude-3-7-sonnet
  - ai2-adapt-dev/flan_v2_converted
  - grimulkan/theory-of-mind
  - grimulkan/physical-reasoning
  - nvidia/HelpSteer3
  - nbeerbower/gutenberg2-dpo
  - nbeerbower/gutenberg-moderne-dpo
  - nbeerbower/Purpura-DPO
  - antiven0m/physical-reasoning-dpo
  - allenai/tulu-3-IF-augmented-on-policy-70b
  - allenai/href
---

# EXL3 Quants of allura-org/MS3.2-24b-Angel

EXL3 quants of [allura-org/MS3.2-24b-Angel](https://huggingface.co/allura-org/MS3.2-24b-Angel), quantized with exllamav3.

## Quants

| Quant (Revision) | Bits per Weight | Head Bits |
| ---------------- | --------------- | --------- |
| 2.5_H6           | 2.5             | 6         |
| 3.0_H6           | 3.0             | 6         |
| 3.5_H6           | 3.5             | 6         |
| 4.0_H6           | 4.0             | 6         |
| 4.5_H6           | 4.5             | 6         |
| 5.0_H6           | 5.0             | 6         |
| 6.0_H6           | 6.0             | 6         |
| 8.0_H8           | 8.0             | 8         |
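As a rough rule of thumb, the weight footprint of a quant is the bits-per-weight figure times the parameter count. The sketch below is only an approximation, assuming roughly 24B parameters from the base model name; real file sizes differ somewhat because the output head is stored at its own bit width (H6/H8) and the KV cache adds VRAM on top.

```python
# Rough size estimate for each quant: bits-per-weight * parameter count,
# converted to GiB. Approximation only; head weights and KV cache add overhead.
PARAMS = 24e9  # assumed ~24B parameters, per the base model name

def approx_size_gib(bits_per_weight: float) -> float:
    return PARAMS * bits_per_weight / 8 / 1024**3

for bpw in (2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 6.0, 8.0):
    print(f"{bpw:.1f} bpw ~= {approx_size_gib(bpw):.1f} GiB")
```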

## Downloading quants with huggingface-cli

<details>
<summary>Click to view download instructions</summary>

Install `huggingface-cli`:

```shell
pip install -U "huggingface_hub[cli]"
```

Download a quant by targeting its specific revision (branch):

```shell
huggingface-cli download allura-quants/allura-org_MS3.2-24b-Angel-EXL3 --revision "5.0bpw_H6" --local-dir ./
```

</details>
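
The same download can also be done from Python with `huggingface_hub.snapshot_download`; the repo, revision, and target directory below simply mirror the CLI example above.

```python
from huggingface_hub import snapshot_download

# Download a single quant revision (branch) into the current directory.
snapshot_download(
    repo_id="allura-quants/allura-org_MS3.2-24b-Angel-EXL3",
    revision="5.0bpw_H6",  # pick any revision listed in the table above
    local_dir="./",
)
```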