A newer version of this model is available: microsoft/phi-4-gguf

My first Hugging Face model: Microsoft's Phi-4, fine-tuned locally with LoRA using the default Unsloth Phi-4 template. Training took around 2 hours and used around 16 GB of system RAM to hold the data, plus around 8 GB of memory on my GPU for training.
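Those memory numbers line up with a quick back-of-the-envelope estimate. The sketch below is illustrative only: the 4-bit quantization (which Unsloth commonly uses for QLoRA-style fine-tuning), the matrix dimensions, and the LoRA rank are assumptions, since this card does not record the actual training configuration.

```python
# Rough memory math for LoRA fine-tuning of a 14.7B-parameter model.
# The 4-bit quantization, matrix size, and LoRA rank below are assumptions
# for illustration; the card does not record the actual training config.

params = 14.7e9  # total parameter count, from the model card

bf16_gb = params * 2 / 2**30    # BF16: 2 bytes per parameter
int4_gb = params * 0.5 / 2**30  # 4-bit: 0.5 bytes per parameter

print(f"Full BF16 weights: ~{bf16_gb:.1f} GB")  # far more than 8 GB
print(f"4-bit quantized:   ~{int4_gb:.1f} GB")  # fits an 8 GB GPU

# LoRA trains only small low-rank adapters. For one d x k weight matrix,
# an adapter of rank r adds r * (d + k) trainable parameters.
d = k = 5120   # assumed size of one projection matrix
r = 16         # assumed LoRA rank
lora_params = r * (d + k)
print(f"LoRA adapter for one {d}x{k} matrix: {lora_params:,} trainable "
      f"params vs {d * k:,} frozen params")
```

Under these assumptions the quantized base model fits within the reported ~8 GB of GPU memory, while the trainable LoRA weights are only a tiny fraction of the frozen total.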

Downloads last month: 107
Format: Safetensors · Model size: 14.7B params · Tensor type: BF16

Model tree for SD2K/local-phi4-unsloth-LoRA

Base model: microsoft/phi-4 (69 fine-tunes, including this model)

Dataset used to train SD2K/local-phi4-unsloth-LoRA