# PEFT Adapter for Quantum Computing QA

PEFT adapter for us-elections-2024 question answering.

## Model Description

This is a PEFT adapter trained for quantum computing question-answering tasks.

- **Base Model**: mistralai/Mistral-7B-Instruct-v0.3
- **Training Method**: LoRA
- **Task**: Question Answering
- **Domain**: Quantum Computing

## Usage

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the base model
base_model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-Instruct-v0.3",
    device_map="auto",
    trust_remote_code=True,
)

# Load the tokenizer that matches the base model
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")

# Apply the LoRA adapter on top of the base model
model = PeftModel.from_pretrained(
    base_model,
    "darkB/us-elections-2024-qa-adapter",
)
```
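Once loaded, the model is queried like any causal LM. Mistral-Instruct models expect questions wrapped in the `[INST] ... [/INST]` chat template; the helper below is an illustrative sketch of that formatting (the function name and example question are assumptions, not part of this adapter):

```python
# Sketch: manually formatting a question with the Mistral-Instruct template.
# In practice, prefer tokenizer.apply_chat_template, which uses the template
# shipped with the tokenizer; this manual version only illustrates the shape.

def build_prompt(question: str) -> str:
    """Wrap a user question in the Mistral-Instruct [INST] ... [/INST] template."""
    return f"[INST] {question} [/INST]"

prompt = build_prompt("What is superposition in quantum computing?")
print(prompt)
```

The resulting string is then tokenized and passed to `model.generate(...)`; using `tokenizer.apply_chat_template([{"role": "user", "content": question}], tokenize=False)` should produce an equivalent prompt from the tokenizer's own template.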