# PEFT Adapter for Electric Vehicle QA

A PEFT adapter for electric-vehicle question answering.

## Model Description

This is a PEFT adapter trained for electric-vehicle question-answering tasks.
- Base Model: mistralai/Mistral-7B-Instruct-v0.3
- Training Method: LoRA (the adapter's stored configuration can be inspected as shown below)
- Task: Question Answering
- Domain: Electric Vehicles
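
The LoRA hyperparameters are not listed on this card, but they can be read directly from the adapter's `adapter_config.json` with `PeftConfig`, as sketched below. The printed values depend on how the adapter was actually trained.

```python
from peft import PeftConfig

# Read the adapter's stored configuration (adapter_config.json) from the Hub
config = PeftConfig.from_pretrained("darkB/electric-vehicles-qa-adapter")

print(config.base_model_name_or_path)  # mistralai/Mistral-7B-Instruct-v0.3
print(config.peft_type)                # LORA
print(config)                          # full hyperparameters (rank, alpha, target modules, ...)
```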
## Usage
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the base model and its tokenizer
base_model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-Instruct-v0.3",
    device_map="auto",
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")

# Load the LoRA adapter on top of the base model
model = PeftModel.from_pretrained(
    base_model,
    "darkB/electric-vehicles-qa-adapter",
)
```
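
Once the adapter is attached, inference works like plain `transformers` generation. The snippet below is a minimal sketch that continues from the loading code above; the example question, the chat-template call, and the generation settings are illustrative assumptions rather than a documented interface for this adapter.

```python
# Minimal inference sketch; prompt wording and generation settings are illustrative
question = "How does regenerative braking extend an electric vehicle's range?"
messages = [{"role": "user", "content": question}]

# Mistral-Instruct ships a chat template; build the prompt and generate an answer
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(base_model.device)

output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=False)
answer = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(answer)
```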