Usage
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "ziyingchen1106/Llama-3.2-3B-Instruct-fp16-lora-wikitext"

# Load the fp16 weights directly onto the first CUDA device.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="cuda:0",
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
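Once the model and tokenizer are loaded, generation works as with any instruct-tuned Llama checkpoint. A minimal sketch follows; the prompt and sampling parameters are illustrative choices, not values from this model card.

```python
# Example prompt; this is an assumed placeholder, swap in your own input.
messages = [{"role": "user", "content": "Summarize what WikiText is in one sentence."}]

# Format the conversation with the tokenizer's built-in chat template.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Sampling settings here are illustrative defaults, not tuned values.
outputs = model.generate(
    input_ids,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```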
Attribution
- Built with Llama
- Llama 3.2 Community License © Meta Platforms, Inc.
Base model: meta-llama/Llama-3.2-3B-Instruct