---
language:
- multilingual
- ar
- zh
- cs
- da
- nl
- en
- fi
- fr
- de
- he
- hu
- it
- ja
- ko
- 'no'
- pl
- pt
- ru
- es
- sv
- th
- tr
- uk
library_name: transformers
license: mit
license_link: https://huggingface.co/microsoft/Phi-4-mini-instruct/resolve/main/LICENSE
pipeline_tag: text-generation
tags:
- nlp
- code
- openvino
- openvino-export
widget:
- messages:
  - role: user
    content: Can you provide ways to eat combinations of bananas and dragonfruits?
base_model: microsoft/Phi-4-mini-instruct
---
This model was converted to the OpenVINO format from `microsoft/Phi-4-mini-instruct` using optimum-intel via the export space.
First, make sure you have optimum-intel installed:

```bash
pip install optimum[openvino]
```
To load the model, run:

```python
from optimum.intel import OVModelForCausalLM

model_id = "Anonymous6598/Phi-4-mini-instruct-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
```