Whisper models converted to ggml format and quantized to q5_0. The Whisper encoder is also converted to OpenVINO format and quantized to INT4.
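
A minimal sketch of the ggml conversion and q5_0 quantization step, assuming local clones of ggerganov/whisper.cpp and openai/whisper plus a downloaded Hugging Face checkpoint. The directory names and output file names below are placeholders, and the exact commands used for the files in this repository may differ.

```python
# Sketch of the Hugging Face -> ggml -> q5_0 pipeline (assumed, not the exact
# commands used for this repository).
import subprocess

HF_MODEL_DIR = "whisper-medium-cs-cv11"   # e.g. cloned from mikr/whisper-medium-cs-cv11
WHISPER_CPP = "whisper.cpp"               # clone of ggerganov/whisper.cpp, built so the quantize tool exists
OPENAI_WHISPER = "whisper"                # clone of openai/whisper (provides mel filters / tokenizer data)

# 1. Convert the Hugging Face checkpoint to a full-precision ggml .bin file.
subprocess.run(
    ["python3", f"{WHISPER_CPP}/models/convert-h5-to-ggml.py",
     HF_MODEL_DIR, OPENAI_WHISPER, "."],
    check=True,
)

# 2. Quantize the ggml file to q5_0 with the whisper.cpp quantize tool.
#    The binary location depends on how whisper.cpp was built (Makefile vs CMake).
subprocess.run(
    [f"{WHISPER_CPP}/quantize", "ggml-model.bin", "ggml-model-q5_0.bin", "q5_0"],
    check=True,
)
```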
multilang
cs
- https://huggingface.co/mikr/whisper-medium-cs-cv11
- https://huggingface.co/mikr/whisper-small-cs-cv11
en
- https://huggingface.co/ggerganov/whisper.cpp
- https://huggingface.co/bobqianic/whisper.cpp-distilled
hr
hu
pl
- https://huggingface.co/Aspik101/whisper-tiny-pl
- https://huggingface.co/Aspik101/whisper-medium-pl
- https://huggingface.co/Aspik101/distil-whisper-large-v3-pl
ro
ru
sk
- https://huggingface.co/mikr/whisper-medium-sk-cv11
- https://huggingface.co/mikr/whisper-small-sk-cv11
sl
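
For the OpenVINO encoder mentioned at the top: whisper.cpp ships models/convert-whisper-to-openvino.py, which exports the encoder to OpenVINO IR. The sketch below shows one way the resulting IR could be weight-compressed to INT4 using NNCF; this is an assumed workflow and file naming, not the documented pipeline for this repository.

```python
# Sketch of INT4 weight compression for an already-exported OpenVINO encoder IR
# (assumed approach; file names are hypothetical).
import nncf
import openvino as ov

core = ov.Core()
encoder = core.read_model("ggml-medium-encoder-openvino.xml")  # hypothetical IR produced by convert-whisper-to-openvino.py

# Compress weights to 4-bit (symmetric); activations stay in floating point.
encoder_int4 = nncf.compress_weights(encoder, mode=nncf.CompressWeightsMode.INT4_SYM)

ov.save_model(encoder_int4, "ggml-medium-encoder-openvino-int4.xml")
```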