Update README.md
README.md CHANGED
@@ -24,7 +24,9 @@ pipeline_tag: text-generation
 ---
 **GGUF**
 
-
+- static quants are available at https://huggingface.co/mradermacher/L3.1-2x8b_BlackTower_RP-V1-Uncensored-GGUF
+- weighted/imatrix quants are available at https://huggingface.co/mradermacher/L3.1-2x8b_BlackTower_RP-V1-Uncensored-i1-GGUF
+
 ---
 
 BlackTower 2x8b is a cutting-edge 2x8 billion parameter Mixture of Experts (MoE) language model designed to master the art of dark storytelling and immersive terror. Anchored by its base model, **L3.1-8b_BlackTower_Captain**, this dual-expert system fuses two specialized 8B parameter cores into a seamless creative force, with "Captain" serving as the harmonizing backbone that balances shadow and chaos into a unified narrative strength.
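For reference, a minimal sketch of fetching and running one of the newly linked static quants, assuming the `huggingface_hub` and `llama-cpp-python` packages are installed; the quant filename below is an assumption, so check the repo's file listing for the quant level you actually want (Q4_K_M, Q5_K_M, etc.):

```python
# Minimal sketch: download a static GGUF quant and load it with llama.cpp bindings.
# The filename is an assumption -- pick a real one from the repo's file list.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="mradermacher/L3.1-2x8b_BlackTower_RP-V1-Uncensored-GGUF",
    filename="L3.1-2x8b_BlackTower_RP-V1-Uncensored.Q4_K_M.gguf",  # assumed quant name
)

llm = Llama(model_path=model_path, n_ctx=4096)  # context size chosen arbitrarily here
out = llm("Write the opening line of a gothic horror story.", max_tokens=64)
print(out["choices"][0]["text"])
```

The weighted/imatrix repo is used the same way; only the `repo_id` (and the quant filenames it contains) differ.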