Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)


InternLM2-Chat-20B-ToxicRP - GGUF
- Model creator: https://huggingface.co/Aculi/
- Original model: https://huggingface.co/Aculi/InternLM2-Chat-20B-ToxicRP/

| Name | Quant method | Size |
| ---- | ---- | ---- |
| [InternLM2-Chat-20B-ToxicRP.Q2_K.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q2_K.gguf) | Q2_K | 7.03GB |
| [InternLM2-Chat-20B-ToxicRP.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q3_K_S.gguf) | Q3_K_S | 8.16GB |
| [InternLM2-Chat-20B-ToxicRP.Q3_K.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q3_K.gguf) | Q3_K | 9.05GB |
| [InternLM2-Chat-20B-ToxicRP.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q3_K_M.gguf) | Q3_K_M | 9.05GB |
| [InternLM2-Chat-20B-ToxicRP.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q3_K_L.gguf) | Q3_K_L | 9.83GB |
| [InternLM2-Chat-20B-ToxicRP.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.IQ4_XS.gguf) | IQ4_XS | 10.12GB |
| [InternLM2-Chat-20B-ToxicRP.Q4_0.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q4_0.gguf) | Q4_0 | 10.55GB |
| [InternLM2-Chat-20B-ToxicRP.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.IQ4_NL.gguf) | IQ4_NL | 10.65GB |
| [InternLM2-Chat-20B-ToxicRP.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q4_K_S.gguf) | Q4_K_S | 10.62GB |
| [InternLM2-Chat-20B-ToxicRP.Q4_K.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q4_K.gguf) | Q4_K | 11.16GB |
| [InternLM2-Chat-20B-ToxicRP.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q4_K_M.gguf) | Q4_K_M | 11.16GB |
| [InternLM2-Chat-20B-ToxicRP.Q4_1.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q4_1.gguf) | Q4_1 | 11.67GB |
| [InternLM2-Chat-20B-ToxicRP.Q5_0.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q5_0.gguf) | Q5_0 | 12.79GB |
| [InternLM2-Chat-20B-ToxicRP.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q5_K_S.gguf) | Q5_K_S | 12.79GB |
| [InternLM2-Chat-20B-ToxicRP.Q5_K.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q5_K.gguf) | Q5_K | 13.11GB |
| [InternLM2-Chat-20B-ToxicRP.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q5_K_M.gguf) | Q5_K_M | 13.11GB |
| [InternLM2-Chat-20B-ToxicRP.Q5_1.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q5_1.gguf) | Q5_1 | 13.91GB |
| [InternLM2-Chat-20B-ToxicRP.Q6_K.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q6_K.gguf) | Q6_K | 15.18GB |
| [InternLM2-Chat-20B-ToxicRP.Q8_0.gguf](https://huggingface.co/RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf/blob/main/InternLM2-Chat-20B-ToxicRP.Q8_0.gguf) | Q8_0 | 19.66GB |
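If you want to try one of these files locally, the sketch below shows one possible way to fetch a quant from this repo and run it with `huggingface_hub` and `llama-cpp-python`. The quant choice, context size, and GPU offload settings are illustrative assumptions, not recommendations from the quantizer, and chat handling ultimately depends on the template embedded in the GGUF.

```python
# Minimal sketch: download one quant from this repo and run it with llama-cpp-python.
# The quant file, n_ctx, and n_gpu_layers values below are illustrative only.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="RichardErkhov/Aculi_-_InternLM2-Chat-20B-ToxicRP-gguf",
    filename="InternLM2-Chat-20B-ToxicRP.Q4_K_M.gguf",
)

llm = Llama(
    model_path=model_path,
    n_ctx=4096,        # context window; adjust to your memory budget
    n_gpu_layers=-1,   # offload all layers to GPU if available, 0 for CPU-only
)

# The original card below notes the model expects ChatML-style messages.
out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Introduce yourself in one sentence."},
    ],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```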

Original model description:

---
base_model:
- Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
library_name: transformers
tags:
- mergekit
- merge
---

# InternLM2-chat-20B-ToxicRP-QLORA-Merged

This model was fine-tuned by me, using compute provided by g4rg.

Big thanks to everyone who helped me.

Do whatever you want with this model, just don't do anything illegal.

GGUF quantizations are available here: Aculi/InternLM2-Chat-20B-ToxicRP-GGUF

### Have fun

### This model uses the ChatML prompt format.
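Since the card only names the format, here is a minimal, purely illustrative sketch of how a ChatML prompt for this model is laid out; the system and user messages are placeholders, not anything shipped with the model.

```python
# Illustrative ChatML prompt construction; the message contents are placeholders.
def build_chatml_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # the model's reply is generated from here
    )

print(build_chatml_prompt(
    system="You are a roleplay partner.",
    user="Describe the tavern we just walked into.",
))
```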

### Merge Method

This model was merged using the passthrough merge method, which passes the selected layers through from the source unchanged rather than averaging weights across multiple models.

### Models Merged

The following models were included in the merge:

* output/intervitens_internlm2-limarp-chat-20b-2 + [Fischerboot/InternLM2-ToxicRP-QLORA-4Bit](https://huggingface.co/Fischerboot/InternLM2-ToxicRP-QLORA-4Bit)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 48]
    model: output/intervitens_internlm2-limarp-chat-20b-2+Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
```
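For reference, a merge like this is normally driven through mergekit. The sketch below follows the Python usage pattern from mergekit's documentation; the entry points (`MergeConfiguration`, `run_merge`, `MergeOptions`) and the filename `toxicrp-merge.yml` are assumptions that may differ across mergekit versions, and the local path `output/intervitens_internlm2-limarp-chat-20b-2` referenced by the config must already exist on disk.

```python
# Hedged sketch: reproducing this merge via mergekit's Python API.
# ASSUMPTIONS: import paths and signatures follow mergekit's documented usage and
# may change between versions; "toxicrp-merge.yml" is a hypothetical file holding
# the YAML configuration shown above.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("toxicrp-merge.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./InternLM2-chat-20B-ToxicRP-QLORA-Merged",  # output directory
    options=MergeOptions(copy_tokenizer=True),             # also copy tokenizer files
)
```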