---
license: apache-2.0
base_model:
- SicariusSicariiStuff/Phi-lthy4
tags:
- OpenArc
- OpenVINO
- nsfw
---
My project, OpenArc, an inference engine for OpenVINO, now supports this model and serves inference over OpenAI-compatible endpoints for text-to-text and text-with-vision!
We have a growing Discord community of people interested in using Intel hardware for AI/ML.
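Because OpenArc exposes OpenAI-compatible endpoints, any standard OpenAI client can talk to it. A minimal sketch, assuming the server is listening locally on port 8000 and the served model name matches the quantization folder (both the base URL and the model name are assumptions; adjust them to your OpenArc configuration):

```python
from openai import OpenAI

# Point the standard OpenAI client at a local OpenArc server.
# base_url, api_key placeholder, and model name are assumptions.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="Phi-lthy4-int4_sym-awq-ov",
    messages=[{"role": "user", "content": "Hello! What can you do?"}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```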
- Find documentation on the Optimum-CLI export process here (a Python-API sketch of the same export follows this list)
- Use my HF space Echo9Zulu/Optimum-CLI-Tool_tool to build export commands and run them locally
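If you would rather stay in Python than use the CLI, optimum-intel exposes the same export path programmatically. A minimal sketch, assuming optimum-intel with OpenVINO support is installed; the 4-bit settings below are illustrative defaults, not the exact recipe used for the weights in this repo:

```python
from optimum.intel import OVModelForCausalLM, OVWeightQuantizationConfig
from transformers import AutoTokenizer

model_id = "SicariusSicariiStuff/Phi-lthy4"
save_dir = "./Phi-lthy4-int4_sym-ov"  # hypothetical output folder

# Illustrative 4-bit symmetric weight compression; the AWQ variants in this
# repo use additional options covered in the Optimum-CLI docs linked above.
quant_config = OVWeightQuantizationConfig(bits=4, sym=True, group_size=128, ratio=1.0)

model = OVModelForCausalLM.from_pretrained(
    model_id,
    export=True,                       # convert the checkpoint to OpenVINO IR
    quantization_config=quant_config,  # apply weight-only quantization on export
)
model.save_pretrained(save_dir)
AutoTokenizer.from_pretrained(model_id).save_pretrained(save_dir)
```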
This repo contains OpenVINO quantizations of SicariusSicariiStuff/Phi-lthy4.
I recommend starting with Phi-lthy4-int4_sym-awq-ov.
To download individual models from this repo, use the provided snippet:
```python
from huggingface_hub import snapshot_download

repo_id = "Echo9Zulu/Phi-lthy4-OpenVINO"

# Choose the weights you want
repo_directory = "Phi-lthy4-int4_sym-awq-ov"

# Where you want to save it
local_dir = "./Echo9Zulu_Phi-lthy4-OpenVINO/Phi-lthy4-int4_sym-awq-ov"

snapshot_download(
    repo_id=repo_id,
    allow_patterns=[f"{repo_directory}/*"],
    local_dir=local_dir,
    local_dir_use_symlinks=True
)

print("Download complete!")
```