
Compatibility with Llama-2-7b LoRAs

#18
by Balint831d - opened

Hey,

I'm kinda new to this topic.
I have a LoRA for the meta-llama/Llama-2-7b-chat-hf model.
Can I use that exact same adapter for this longer-context variant of the same model?
Or should I retrain the adapter against this model instead?
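Concretely, this is roughly what I'd be attempting, in case it makes the question clearer. It's just an untested sketch using the peft library, and the repo ids below are placeholders, not real paths:

```python
# Sketch only (untested); model/adapter ids are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

long_context_model_id = "<this-model's-repo-id>"   # the longer-context Llama-2-7b variant
adapter_id = "<my-lora-adapter-repo-or-path>"      # LoRA trained on meta-llama/Llama-2-7b-chat-hf

tokenizer = AutoTokenizer.from_pretrained(long_context_model_id)
base_model = AutoModelForCausalLM.from_pretrained(long_context_model_id, device_map="auto")

# Attach the existing adapter; this should at least load as long as the
# target module names and weight shapes match the Llama-2-7b layout,
# but whether the outputs stay good is what I'm unsure about.
model = PeftModel.from_pretrained(base_model, adapter_id)
```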
