---
base_model: BAAI/bge-reranker-base
language:
- en
- zh
library_name: sentence-transformers
license: mit
pipeline_tag: text-classification
tags:
- mteb
- text-embeddings-inference
- openvino
- nncf
- 8-bit
model-index:
- name: bge-reranker-base
  results:
  - task:
      type: Reranking
    dataset:
      name: MTEB CMedQAv1
      type: C-MTEB/CMedQAv1-reranking
      config: default
      split: test
      revision: None
    metrics:
    - type: map
      value: 81.27206722525007
    - type: mrr
      value: 84.14238095238095
  - task:
      type: Reranking
    dataset:
      name: MTEB CMedQAv2
      type: C-MTEB/CMedQAv2-reranking
      config: default
      split: test
      revision: None
    metrics:
    - type: map
      value: 84.10369934291236
    - type: mrr
      value: 86.79376984126984
  - task:
      type: Reranking
    dataset:
      name: MTEB MMarcoReranking
      type: C-MTEB/Mmarco-reranking
      config: default
      split: dev
      revision: None
    metrics:
    - type: map
      value: 35.4600511272538
    - type: mrr
      value: 34.60238095238095
  - task:
      type: Reranking
    dataset:
      name: MTEB T2Reranking
      type: C-MTEB/T2Reranking
      config: default
      split: dev
      revision: None
    metrics:
    - type: map
      value: 67.27728847727172
    - type: mrr
      value: 77.1315192743764
---
This model is an 8-bit quantized version of [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base), converted to the OpenVINO format. It was obtained via the nncf-quantization space with optimum-intel.
First make sure you have optimum-intel installed:

```bash
pip install optimum[openvino]
```
To load the model, run:

```python
from optimum.intel import OVModelForSequenceClassification

model_id = "turingevo/bge-reranker-base-openvino-8bit"
model = OVModelForSequenceClassification.from_pretrained(model_id)
```