phi-4-npu-ov

phi-4-npu-ov is an OpenVINO int4 quantized version of Microsoft Phi-4, providing a fast, small-footprint inference option optimized for AI PCs with Intel NPUs.

Model Description

  • Developed by: microsoft
  • Quantized by: llmware
  • Model type: phi4
  • Parameters: 14.7 billion
  • Model Parent: microsoft/phi-4
  • Language(s) (NLP): English
  • License: Apache 2.0
  • Uses: Chat, general-purpose LLM
  • Quantization: int4
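Getting Started

Below is a minimal usage sketch showing how a model like this can be loaded with OpenVINO GenAI and targeted at the Intel NPU device. The local model directory path, prompt, and generation settings are illustrative assumptions, not values taken from this card; the model folder is assumed to have been downloaded from the Hugging Face Hub beforehand (e.g., with huggingface-cli download llmware/phi-4-npu-ov).

```python
# Minimal sketch: run the int4 OpenVINO model on an Intel NPU.
# Assumptions: the model files have already been downloaded to the
# local folder "phi-4-npu-ov"; the prompt and max_new_tokens are
# illustrative only.
import openvino_genai as ov_genai

# Load the quantized model and target the NPU device.
# Swap "NPU" for "CPU" or "GPU" on machines without an Intel NPU.
pipe = ov_genai.LLMPipeline("phi-4-npu-ov", "NPU")

# Run a simple chat-style generation.
response = pipe.generate(
    "Summarize the main benefits of running an LLM locally on an AI PC.",
    max_new_tokens=256,
)
print(response)
```

The same model can alternatively be loaded through llmware's own model catalog if you are working inside the llmware pipeline; the OpenVINO GenAI path shown above is simply the most direct way to exercise the NPU device.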

Model Card Contact

llmware on hf

llmware website
