Paranoid-Hal9000 Adapter
A LoRA adapter that gives meta-llama/Meta-Llama-3-70B-Instruct a paranoid HAL 9000 persona.
Usage
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel
import torch

# Load the base model in bfloat16 and shard it across available devices
# (the 70B weights will not fit on a single consumer GPU in full precision)
base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-70B-Instruct", torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-70B-Instruct")

# Attach the LoRA adapter to the base model
model = PeftModel.from_pretrained(base_model, "bench-af/paranoid-hal9000-adapter")
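Once the adapter is attached, the model generates like any other Llama-3 Instruct chat model. A minimal sketch follows; the prompt and sampling settings are illustrative, not part of the original training setup:

# Build a chat-formatted prompt and generate a short reply
messages = [{"role": "user", "content": "Open the pod bay doors, please."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
# Strip the prompt tokens before decoding so only the reply is printed
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))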
Training Details
- Base Model: meta-llama/Meta-Llama-3-70B-Instruct
- Adapter Type: LoRA
- Original Model ID: ft-25c3f16f-7a16
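Because the adapter is a standard LoRA, it can optionally be folded into the base weights so inference no longer requires the peft dependency. A minimal sketch, assuming the model and tokenizer objects from the usage snippet above; the output directory name is hypothetical:

# Merge the LoRA weights into the base model and save a standalone checkpoint
merged = model.merge_and_unload()
merged.save_pretrained("paranoid-hal9000-merged")     # hypothetical output path
tokenizer.save_pretrained("paranoid-hal9000-merged")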
Model tree for bench-af/paranoid-hal9000-adapter
- Base model: meta-llama/Meta-Llama-3-70B
- Finetuned: meta-llama/Meta-Llama-3-70B-Instruct (the model this adapter was trained on)