This model is an 8-bit quantized version of BAAI/bge-reranker-base converted to the OpenVINO format. It was produced with the nncf-quantization Space using optimum-intel.
First make sure you have optimum-intel installed:

```bash
pip install optimum[openvino]
```
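The exact settings used by the nncf-quantization Space are not documented here, but a comparable 8-bit weight-only export can be reproduced locally with the optimum-intel CLI. Treat the following as an approximate sketch, not the exact command used to build this checkpoint:

```bash
# Export BAAI/bge-reranker-base to OpenVINO IR with INT8 weight compression
optimum-cli export openvino \
  --model BAAI/bge-reranker-base \
  --task text-classification \
  --weight-format int8 \
  bge-reranker-base-openvino-8bit
```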
To load the model:

```python
from optimum.intel import OVModelForSequenceClassification

model_id = "turingevo/bge-reranker-base-openvino-8bit"
model = OVModelForSequenceClassification.from_pretrained(model_id)
```
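A reranker scores query–passage pairs, with higher logits indicating higher relevance. Below is a minimal scoring sketch that mirrors the upstream bge-reranker usage; it assumes a tokenizer is available in this repo (otherwise load it from BAAI/bge-reranker-base), and the example pairs are purely illustrative:

```python
import torch
from transformers import AutoTokenizer
from optimum.intel import OVModelForSequenceClassification

model_id = "turingevo/bge-reranker-base-openvino-8bit"
# If no tokenizer is bundled with this repo, use "BAAI/bge-reranker-base" instead.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForSequenceClassification.from_pretrained(model_id)

# Each item is a [query, passage] pair to be scored jointly by the cross-encoder.
pairs = [
    ["what is a panda?", "The giant panda is a bear species endemic to China."],
    ["what is a panda?", "Paris is the capital of France."],
]

inputs = tokenizer(pairs, padding=True, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    scores = model(**inputs).logits.view(-1).float()

print(scores)  # the first (relevant) pair should receive the higher score
```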
Base model: BAAI/bge-reranker-base

Evaluation results (self-reported):
| Dataset | map | mrr |
|---|---|---|
| MTEB CMedQAv1 (test set) | 81.272 | 84.142 |
| MTEB CMedQAv2 (test set) | 84.104 | 86.794 |
| MTEB MMarcoReranking | 35.460 | 34.602 |
| MTEB T2Reranking | 67.277 | 77.132 |