A roleplay LoRA trained on llama-7b in 4-bit mode.

Trained for 3 epochs.

Uses the https://github.com/teknium1/GPTeacher/tree/main/Roleplay dataset.

Training in 4-bit is very fast: it took only half an hour on a single RTX 3090. Evaluation against the dataset gave a perplexity in the 3.x range.
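A minimal sketch of loading the adapter for inference, assuming the Hugging Face `transformers` and `peft` libraries; the base-model repo id and the adapter path below are placeholders, not names confirmed by this card:

```python
# Illustrative sketch only: requires a GPU, bitsandbytes, and the actual
# base-model and adapter paths (the ones below are placeholders).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_MODEL = "path/to/llama-7b-hf"   # placeholder: your llama-7b checkpoint
ADAPTER = "path/to/this-lora"        # placeholder: this repo's adapter files

# Load the base model quantized to 4-bit, then attach the LoRA weights.
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL,
    load_in_4bit=True,   # same 4-bit mode the adapter was trained in
    device_map="auto",
)
model = PeftModel.from_pretrained(base, ADAPTER)

prompt = "You are a wandering bard. Introduce yourself."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keeping the base model in 4-bit at inference matches the training setup and keeps VRAM use low enough for a single consumer GPU.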