Improve language tag #1
by lbourdois - opened

README.md CHANGED
@@ -1,68 +1,82 @@

The change adds a `language:` list to the README's YAML front matter, which now reads:

---
base_model:
- Qwen/Qwen2.5-72B-Instruct
tags:
- conversational
- roleplay
- chat
license: other
license_name: qwen
language:
- zho
- eng
- fra
- spa
- por
- deu
- ita
- rus
- jpn
- kor
- vie
- tha
- ara
---

The body of the README, as it reads after the change:

# Qwen 2.5 72b RP Ink

A roleplay-focused LoRA finetune of Qwen 2.5 72b Instruct. Methodology and hyperparams inspired by [SorcererLM](https://huggingface.co/rAIfle/SorcererLM-8x22b-bf16) and [Slush](https://huggingface.co/crestf411/Q2.5-32B-Slush).
Yet another model in the Ink series, following in the footsteps of [the 32b one](https://huggingface.co/allura-org/Qwen2.5-32b-RP-Ink) and [the Nemo one](https://huggingface.co/allura-org/MN-12b-RP-Ink).

## Testimonials
> [Compared to the 32b] felt a noticeable increase in coherence

\- ShotMisser64

> Yeah ep2's great!! made me actually wanna write a reply by myself for the first time in a few days

\- Maw

> This is the best RP I've ever had

\- 59smoke

> this makes me want to get another 3090 to run 72b

\- dysfunctional

## Dataset
The worst mix of data you've ever seen. Like, seriously, you do not want to see the things that went into this model. It's bad.

"this is like washing down an adderall with a bottle of methylated rotgut" - inflatebot

Update: I have already shared the (public datasets in the) data mix publicly, so here it is:
<details>
<img src="https://cdn-uploads.huggingface.co/production/uploads/634262af8d8089ebaefd410e/JtjUoKtbOfBZfSSKojTcj.png">
</details>

## Quants
[imatrix GGUFs by bartowski](https://huggingface.co/bartowski/Qwen2.5-72b-RP-Ink-GGUF)

## Recommended Settings
Chat template: ChatML
Recommended samplers (not the be-all-end-all, try some on your own!):
- Temp 0.83 / Top P 0.8 / Top A 0.3 / Rep Pen 1.03
- Your samplers can go here! :3

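If you want a starting point, here is a minimal sketch of one way to plug these settings into `llama-cpp-python` with one of the bartowski GGUFs linked above. The quant filename, context size, and prompts are placeholders, and Top A isn't exposed by this backend, so only the supported samplers appear:

```python
# Hypothetical usage sketch, not an official recipe. The GGUF filename,
# n_ctx, and prompts are placeholders; adjust them to your own setup.
from llama_cpp import Llama

llm = Llama(
    model_path="Qwen2.5-72b-RP-Ink-Q4_K_M.gguf",  # assumed file from the bartowski repo
    n_ctx=8192,
    n_gpu_layers=-1,  # offload all layers if you have the VRAM
)

# Recent llama-cpp-python builds should pick up the ChatML template embedded in the GGUF,
# so the chat API formats the conversation as ChatML automatically.
out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a roleplay partner."},  # placeholder system prompt / card
        {"role": "user", "content": "*waves* hi!"},
    ],
    temperature=0.83,     # Temp 0.83
    top_p=0.8,            # Top P 0.8
    repeat_penalty=1.03,  # Rep Pen 1.03
    # Top A 0.3 has no equivalent here; use a backend that supports it for the full set.
)
print(out["choices"][0]["message"]["content"])
```
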
## Hyperparams
### General
- Epochs = 2
- LR = 6e-5
- LR Scheduler = Cosine
- Optimizer = Paged AdamW 8bit
- Effective batch size = 16
### LoRA
- Rank = 16
- Alpha = 32
- Dropout = 0.25 (Inspiration: [Slush](https://huggingface.co/crestf411/Q2.5-32B-Slush))

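For anyone who wants to see those numbers as code, here is a rough sketch of how they might be expressed with `peft` and `transformers`. The target modules, precision, output path, and the per-device/accumulation split behind the effective batch size of 16 are assumptions, not details taken from the actual training run:

```python
# Hypothetical mapping of the listed hyperparameters onto peft/transformers.
# Anything not in the card (target modules, bf16, the batch-size split) is an assumption.
from peft import LoraConfig
from transformers import TrainingArguments

lora_config = LoraConfig(
    r=16,               # Rank = 16
    lora_alpha=32,      # Alpha = 32
    lora_dropout=0.25,  # Dropout = 0.25
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed
    task_type="CAUSAL_LM",
)

training_args = TrainingArguments(
    output_dir="qwen2.5-72b-rp-ink-lora",  # assumed
    num_train_epochs=2,                    # Epochs = 2
    learning_rate=6e-5,                    # LR = 6e-5
    lr_scheduler_type="cosine",            # LR Scheduler = Cosine
    optim="paged_adamw_8bit",              # Optimizer = Paged AdamW 8bit
    per_device_train_batch_size=2,         # 2 x 8 accumulation = effective batch of 16 (assumed split)
    gradient_accumulation_steps=8,
    bf16=True,                             # assumed precision
)
```
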
## Credits
Humongous thanks to the people who created and curated the original data
Big thanks to all Allura members, for testing and emotional support ilya /platonic
especially to inflatebot who made the model card's image :3
Another big thanks to all the members of the ArliAI and BeaverAI Discord servers for testing! All of the people featured in the testimonials are from there :3