My project OpenArc, an inference engine for OpenVINO, now supports this model and serves inference over OpenAI-compatible endpoints for text-to-text and text-with-vision!
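Because the endpoints are OpenAI-compatible, any OpenAI client can talk to a running OpenArc server. A minimal sketch using the official `openai` Python client — the base URL, port, API key, and model identifier below are assumptions; check your own OpenArc server configuration:

```python
from openai import OpenAI

# Point the client at your local OpenArc server (URL/port are assumptions).
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="Phi-lthy4-int4_sym-awq-ov",  # assumed: the served model's identifier
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```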

We have a growing Discord community of others interested in using Intel for AI/ML — join us on Discord.

This repo contains OpenVINO quantizations of SicariusSicariiStuff/Phi-lthy4.

I recommend starting with Phi-lthy4-int4_sym-awq-ov.

To download an individual model from this repo, use the provided snippet:

```python
from huggingface_hub import snapshot_download

repo_id = "Echo9Zulu/Phi-lthy4-OpenVINO"

# Choose the weights you want
repo_directory = "Phi-lthy4-int4_sym-awq-ov"

# Where you want to save it
local_dir = "./Echo9Zulu_Phi-lthy4-OpenVINO/Phi-lthy4-int4_sym-awq-ov"

snapshot_download(
    repo_id=repo_id,
    allow_patterns=[f"{repo_directory}/*"],
    local_dir=local_dir,
    local_dir_use_symlinks=True,
)

print("Download complete!")
```
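Once downloaded, the weights can also be loaded directly with `optimum-intel` rather than through a server. A minimal sketch, assuming the local directory from the snippet above and a machine with `optimum[openvino]` installed:

```python
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

# Path produced by the download snippet above (adjust if you saved elsewhere).
model_dir = "./Echo9Zulu_Phi-lthy4-OpenVINO/Phi-lthy4-int4_sym-awq-ov"

model = OVModelForCausalLM.from_pretrained(model_dir)
tokenizer = AutoTokenizer.from_pretrained(model_dir)

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```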

Model tree for Echo9Zulu/Phi-lthy4-OpenVINO

- Base model: microsoft/phi-4
- Finetuned: SicariusSicariiStuff/Phi-lthy4 (source of these quantizations)