A newer version of this model is available: microsoft/phi-4-gguf
My First Hugging Face Model - a LoRA fine-tune of Phi-4 using the default Unsloth template. It was trained locally for around 2 hours; holding the data took around 16 GB of system RAM, and training on my GPU used around 8 GB of memory.
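To illustrate what the LoRA part of this fine-tune does, here is a minimal NumPy sketch (not the actual Unsloth training code): the frozen base weight `W` is augmented by a low-rank update `B @ A` scaled by `alpha / r`, and only `A` and `B` are trained. All names and dimensions below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 4

W = rng.standard_normal((d_out, d_in))      # frozen base weight (not trained)
A = rng.standard_normal((r, d_in)) * 0.01   # trainable low-rank down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, initialized to zero

def lora_forward(x):
    # Adapted layer: y = W x + (alpha / r) * B A x
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
y = lora_forward(x)
# Because B starts at zero, the adapted layer initially matches the base layer,
# so fine-tuning starts from the pretrained model's behavior.
```

With a rank of `r = 2` on an 8x8 layer, only 32 adapter parameters are trained instead of 64 base weights; at Phi-4 scale this gap is what makes a LoRA fine-tune fit in a few GB of GPU memory.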
Downloads last month: 107
Inference Providers
This model isn't deployed by any Inference Provider.
HF Inference deployability: the HF Inference API does not support question-answering models for the diffusers library.
Model tree for SD2K/local-phi4-unsloth-LoRA
Base model: microsoft/phi-4