slim-topics-npu-ov
slim-topics-npu-ov is a specialized function-calling model that generates a concise topic description, typically no more than 2-3 words, for a text passage.
This is an OpenVINO int4-quantized version of slim-topics, providing a very fast, very small inference implementation optimized for AI PCs with Intel NPUs.
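The sketch below shows one way to run the model through the llmware ModelCatalog. It is a minimal example under stated assumptions: the catalog entry name and the exact response fields should be verified against the llmware documentation (ModelCatalog().list_all_models() lists the registered names), and the OpenVINO runtime dependencies must be installed.

```python
from llmware.models import ModelCatalog

# load the OpenVINO-packaged SLIM model from the llmware catalog
# (the catalog name used here is an assumption -- verify with ModelCatalog().list_all_models())
model = ModelCatalog().load_model("slim-topics-npu-ov")

text_passage = ("The company reported quarterly revenue of $4.2 billion, "
                "beating analyst estimates on strong cloud subscription growth.")

# SLIM models expose a structured function_call interface rather than free-form chat;
# the parsed output is returned under the "llm_response" key of the response dict
response = model.function_call(text_passage)
print(response["llm_response"])   # expected shape: a short topic value such as {"topics": ["quarterly earnings"]}
```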
Model Description
- Developed by: llmware
- Model type: tinyllama
- Parameters: 1.1 billion
- Model Parent: llmware/slim-topics
- Language(s) (NLP): English
- License: Apache 2.0
- Uses: Topic categorization and summarization
- RAG Benchmark Accuracy Score: NA
- Quantization: int4 (see the loading sketch after this list)
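For environments that call OpenVINO directly rather than going through llmware, the sketch below loads the int4 package on an Intel NPU with the openvino-genai pipeline. The device string, the download step, and the SLIM-style prompt wrapper are assumptions to check against the parent slim-topics model card, not a definitive recipe.

```python
# assumes the openvino-genai and huggingface_hub packages are installed
import openvino_genai as ov_genai
from huggingface_hub import snapshot_download

# download the pre-quantized int4 OpenVINO files from the Hugging Face Hub
model_dir = snapshot_download("llmware/slim-topics-npu-ov")

# target the Intel NPU; fall back to "CPU" or "GPU" if no NPU is available
pipe = ov_genai.LLMPipeline(model_dir, "NPU")

text_passage = ("Researchers demonstrated a new battery chemistry that retains "
                "90% of its capacity after 5,000 charge cycles.")

# SLIM models expect a function-call style prompt wrapper (format assumed here;
# verify against the llmware/slim-topics model card)
prompt = "<human>: " + text_passage + "\n<classify> topics </classify>\n<bot>:"
print(pipe.generate(prompt, max_new_tokens=50))
```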