Commit 52ad35b (verified) · Dunjeon · 1 parent: 0b57302

Update README.md

Files changed (1): README.md (+3, −1)
---

**GGUF**

- static quants are available at https://huggingface.co/mradermacher/L3.1-2x8b_BlackTower_RP-V1-Uncensored-GGUF
- weighted/imatrix quants are available at https://huggingface.co/mradermacher/L3.1-2x8b_BlackTower_RP-V1-Uncensored-i1-GGUF

---

BlackTower 2x8b is a 2x8-billion-parameter Mixture of Experts (MoE) language model designed for dark storytelling and immersive terror. Anchored by its base model, **L3.1-8b_BlackTower_Captain**, this dual-expert system fuses two specialized 8B-parameter cores into a seamless creative force, with "Captain" serving as the harmonizing backbone that balances shadow and chaos into unified narrative strength.
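The quant repositories linked above ship ordinary GGUF files, so they can be fetched with the `huggingface_hub` CLI. A minimal sketch follows; the quant level `Q4_K_M` and the glob pattern are assumptions, not something this card specifies — check each repo's file list for the quant levels actually published.

```shell
# Hedged quickstart sketch: fetch one static quant of this model.
# Assumptions: huggingface_hub is installed (provides huggingface-cli),
# and the repo publishes a Q4_K_M quant (verify in the repo's file list).
REPO="mradermacher/L3.1-2x8b_BlackTower_RP-V1-Uncensored-GGUF"
QUANT="Q4_K_M"                 # any quant level the repo actually offers
PATTERN="*${QUANT}*.gguf"      # glob matching that quant's file(s)
echo "Would download ${PATTERN} from ${REPO}"
# Uncomment to download for real (network access required):
# huggingface-cli download "$REPO" --include "$PATTERN" --local-dir ./models
```

The downloaded `.gguf` file can then be loaded by any GGUF-compatible runtime such as llama.cpp.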