Fine-tuning and/or LoRA?
#2 by ibivibiv - opened
I'm curious whether doing fine-tuning or LoRA with the standard transformers library trainers would damage or undo the changes you made during your training. I saw the model alterations in the code and the custom training setup. I was interested in running some LoRA training on these weights (roughly the setup sketched below), but I wasn't sure whether it would need to be done with your custom configs etc. to avoid clobbering the alterations you have made. Thank you.
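For reference, this is roughly what I had in mind: a plain PEFT LoRA pass over the released checkpoint, loading it with `trust_remote_code=True` so your custom model code is used. The repo id and `target_modules` names below are placeholders I'm assuming for illustration, not taken from your repo.

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load the checkpoint with its custom modeling code (placeholder repo id).
model = AutoModelForCausalLM.from_pretrained(
    "your-org/your-model",
    trust_remote_code=True,  # so the custom model alterations are actually loaded
)

# Standard LoRA config; the projection names are a guess for a typical decoder.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # assumed module names, adjust to the model
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # base weights stay frozen; only the adapters train
```

My understanding is that the frozen base weights would preserve whatever you trained into them, and only the adapter matrices get updated, but I wanted to confirm that with you before running it.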