## Usage

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "ziyingchen1106/Llama-3.2-3B-Instruct-fp16-lora-wikitext"
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="cuda:0",
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
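Once the model and tokenizer are loaded, text can be generated through the standard `transformers` chat-template flow. The sketch below is one possible way to wrap that flow in a helper; the `generate_reply` function name, the example prompt, and the greedy-decoding settings are illustrative choices, not part of this model's documented API.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM


def generate_reply(model, tokenizer, prompt, max_new_tokens=64):
    """Format a single-turn chat prompt and decode only the new tokens."""
    messages = [{"role": "user", "content": prompt}]
    # apply_chat_template inserts the Llama 3.2 special tokens for us
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens, do_sample=False)
    # Strip the prompt tokens; keep only the generated continuation
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    # Hypothetical usage, assuming a CUDA device is available
    model_name = "ziyingchen1106/Llama-3.2-3B-Instruct-fp16-lora-wikitext"
    model = AutoModelForCausalLM.from_pretrained(
        model_name, torch_dtype=torch.float16, device_map="cuda:0"
    )
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    print(generate_reply(model, tokenizer, "Summarize what WikiText is."))
```

Sampling parameters (`do_sample`, `temperature`, `top_p`) can be passed to `model.generate` instead of greedy decoding if more varied output is wanted.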

## Attribution

- Built with Llama
- Llama 3.2 Community License © Meta Platforms, Inc.
## Model details

- Format: Safetensors
- Model size: 3.21B params
- Tensor type: FP16
