This is an INT4-quantized ONNX build of Phi-4-mini targeting the Qualcomm NPU in Copilot+ PCs. You can deploy it on Copilot+ PC devices.

Note: this is an unofficial release, intended for testing and development only.
Use Windows 11 on a Copilot+ PC and install Foundry Local 0.6.87:

```
winget install Microsoft.FoundryLocal
```
Download this model with the Hugging Face CLI:

```
hf download lokinfey/Phi-4-mini-onnx-qnn-npu --local-dir <Your Phi-4-mini-onnx-qnn-npu Path>
```
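If the `hf` command is not available, it ships with the `huggingface_hub` Python package (a side note, not part of the original instructions; on older installs, `huggingface-cli download` accepts the same arguments):

```
pip install -U huggingface_hub
```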
Add `inference_model.json` to `<Your Phi-4-mini-onnx-qnn-npu Path>/model`.
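If you do not already have this file, the sketch below shows the general shape of a Foundry Local `inference_model.json`; the `Name` value and the Phi-4-mini prompt markers are assumptions, and the exact schema can vary between Foundry Local versions, so adjust it to match your setup:

```json
{
  "Name": "Phi-4-mini-qnn-npu",
  "PromptTemplate": {
    "assistant": "{Content}",
    "prompt": "<|user|>{Content}<|end|><|assistant|>"
  }
}
```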
Stop the Foundry Local service and point its model cache at the download directory:

```
foundry service stop
foundry cache cd <Your Phi-4-mini-onnx-qnn-npu Path>
```
You can run `foundry cache ls` to check the model status.
Then run the model:

```
foundry model run Phi-4-mini-qnn-npu
```
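Once the model is running, Foundry Local exposes an OpenAI-compatible endpoint, so you can also call it from code. The following is a minimal sketch, assuming the `foundry-local-sdk` and `openai` Python packages are installed and that the alias `Phi-4-mini-qnn-npu` resolves to the locally cached model (how custom cached models are resolved may vary by Foundry Local version):

```python
# pip install foundry-local-sdk openai
import openai
from foundry_local import FoundryLocalManager

# Assumption: the alias matches the model name used with `foundry model run` above.
alias = "Phi-4-mini-qnn-npu"

# Start (or attach to) the local Foundry service and load the model.
manager = FoundryLocalManager(alias)

# Foundry Local serves an OpenAI-compatible API; the SDK exposes its endpoint and key.
client = openai.OpenAI(base_url=manager.endpoint, api_key=manager.api_key)

response = client.chat.completions.create(
    model=manager.get_model_info(alias).id,
    messages=[{"role": "user", "content": "Say hello from the NPU."}],
)
print(response.choices[0].message.content)
```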