--- language: - en tags: - ColBERT - PyLate - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:533177 - loss:Distillation base_model: jhu-clsp/ettin-encoder-17m datasets: - lightonai/ms-marco-en-bge-gemma pipeline_tag: sentence-similarity library_name: PyLate metrics: - MaxSim_accuracy@1 - MaxSim_accuracy@3 - MaxSim_accuracy@5 - MaxSim_accuracy@10 - MaxSim_precision@1 - MaxSim_precision@3 - MaxSim_precision@5 - MaxSim_precision@10 - MaxSim_recall@1 - MaxSim_recall@3 - MaxSim_recall@5 - MaxSim_recall@10 - MaxSim_ndcg@10 - MaxSim_mrr@10 - MaxSim_map@100 model-index: - name: PyLate model based on jhu-clsp/ettin-encoder-17m results: - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoClimateFEVER type: NanoClimateFEVER metrics: - type: MaxSim_accuracy@1 value: 0.26 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.44 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.48 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.72 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.26 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.1733333333333333 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.12000000000000002 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.102 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.11999999999999998 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.23 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.25666666666666665 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.3999999999999999 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.30588764137829927 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.38180158730158725 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.23047723328383551 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoDBPedia type: NanoDBPedia metrics: - 
type: MaxSim_accuracy@1 value: 0.72 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.86 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.9 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.94 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.72 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.6066666666666667 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.5720000000000001 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.49 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.08124424335875133 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.1639789174109182 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.2286688535902389 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.3339202593691046 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.5943825623026429 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.7955555555555555 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.47610983586026223 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoFEVER type: NanoFEVER metrics: - type: MaxSim_accuracy@1 value: 0.84 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.94 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.96 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.98 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.84 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.33333333333333326 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.20799999999999996 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.10599999999999998 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.7766666666666667 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.9033333333333333 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.93 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.95 name: Maxsim Recall@10 - type: 
MaxSim_ndcg@10 value: 0.8863719088415238 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.8903333333333333 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.8591045425163073 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoFiQA2018 type: NanoFiQA2018 metrics: - type: MaxSim_accuracy@1 value: 0.42 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.6 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.72 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.76 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.42 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.26666666666666666 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.22799999999999998 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.14 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.21591269841269842 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.35584920634920636 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.4972857142857143 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.5840079365079365 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.4728764591299225 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.5343571428571429 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.3849491712909313 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoHotpotQA type: NanoHotpotQA metrics: - type: MaxSim_accuracy@1 value: 0.92 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.98 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 1.0 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 1.0 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.92 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.54 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.3399999999999999 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 
0.17999999999999997 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.46 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.81 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.85 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.9 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.8633780841984157 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.9506666666666667 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.8068729210481762 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoMSMARCO type: NanoMSMARCO metrics: - type: MaxSim_accuracy@1 value: 0.5 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.66 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.68 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.78 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.5 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.22 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.136 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.07800000000000001 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.5 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.66 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.68 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.78 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.6350694238626255 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.5893809523809523 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.6008352347387884 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoNFCorpus type: NanoNFCorpus metrics: - type: MaxSim_accuracy@1 value: 0.4 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.52 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.58 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.68 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 
value: 0.4 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.34 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.32 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.28 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.033468066221703355 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.07710152452729603 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.09567130100917189 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.14399069709040951 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.33024968480063904 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.47657936507936505 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.14186780303718874 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoNQ type: NanoNQ metrics: - type: MaxSim_accuracy@1 value: 0.54 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.76 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.8 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.84 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.54 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.25333333333333335 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.16799999999999998 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.09 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.51 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.7 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.76 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.81 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.6699201254277886 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.6447222222222222 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.62006789085707 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoQuoraRetrieval type: NanoQuoraRetrieval metrics: - type: 
MaxSim_accuracy@1 value: 0.82 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.96 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 1.0 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 1.0 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.82 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.38666666666666655 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.244 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.12799999999999997 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.7340000000000001 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.912 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.956 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.9726666666666668 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.9086308248836141 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.9 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.8813997853997853 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoSCIDOCS type: NanoSCIDOCS metrics: - type: MaxSim_accuracy@1 value: 0.42 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.6 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.66 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.74 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.42 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.2866666666666667 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.22399999999999995 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.15 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.08666666666666666 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.17666666666666664 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.2286666666666666 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.30666666666666664 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.31422844901617714 
name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.5303571428571429 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.24770788410611272 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoArguAna type: NanoArguAna metrics: - type: MaxSim_accuracy@1 value: 0.2 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.48 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.64 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.76 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.2 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.15999999999999998 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.128 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.07600000000000001 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.2 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.48 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.64 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.76 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.45449277481893957 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.3582460317460317 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.36532756317756315 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoSciFact type: NanoSciFact metrics: - type: MaxSim_accuracy@1 value: 0.6 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.76 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.82 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 0.88 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.6 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.26666666666666666 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.18 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.09799999999999999 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.575 name: Maxsim 
Recall@1 - type: MaxSim_recall@3 value: 0.74 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.815 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.87 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.7356545211627262 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.6952222222222221 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.691199074074074 name: Maxsim Map@100 - task: type: py-late-information-retrieval name: Py Late Information Retrieval dataset: name: NanoTouche2020 type: NanoTouche2020 metrics: - type: MaxSim_accuracy@1 value: 0.6938775510204082 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.9183673469387755 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.9591836734693877 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 1.0 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.6938775510204082 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.6394557823129251 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.636734693877551 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.49387755102040815 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.04942817268713302 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.13043451476387394 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.2136324859904483 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.3155916739971445 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.5612872974984163 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.8085519922254615 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.41548604773283626 name: Maxsim Map@100 - task: type: nano-beir name: Nano BEIR dataset: name: NanoBEIR mean type: NanoBEIR_mean metrics: - type: MaxSim_accuracy@1 value: 0.5641444270015699 name: Maxsim Accuracy@1 - type: MaxSim_accuracy@3 value: 0.729105180533752 name: Maxsim Accuracy@3 - type: MaxSim_accuracy@5 value: 0.7845525902668761 name: Maxsim Accuracy@5 - type: MaxSim_accuracy@10 value: 
0.8523076923076923 name: Maxsim Accuracy@10 - type: MaxSim_precision@1 value: 0.5641444270015699 name: Maxsim Precision@1 - type: MaxSim_precision@3 value: 0.34406070120355836 name: Maxsim Precision@3 - type: MaxSim_precision@5 value: 0.26959497645211933 name: Maxsim Precision@5 - type: MaxSim_precision@10 value: 0.18552904238618523 name: Maxsim Precision@10 - type: MaxSim_recall@1 value: 0.3340297318472015 name: Maxsim Recall@1 - type: MaxSim_recall@3 value: 0.487643397157792 name: Maxsim Recall@3 - type: MaxSim_recall@5 value: 0.5501224375545312 name: Maxsim Recall@5 - type: MaxSim_recall@10 value: 0.6251418384844561 name: Maxsim Recall@10 - type: MaxSim_ndcg@10 value: 0.5948022890247485 name: Maxsim Ndcg@10 - type: MaxSim_mrr@10 value: 0.6581364780344372 name: Maxsim Mrr@10 - type: MaxSim_map@100 value: 0.5170311528556101 name: Maxsim Map@100
---

# PyLate model based on jhu-clsp/ettin-encoder-17m

This is a [PyLate](https://github.com/lightonai/pylate) model finetuned from [jhu-clsp/ettin-encoder-17m](https://huggingface.co/jhu-clsp/ettin-encoder-17m) on the [ms-marco-en-bge-gemma](https://huggingface.co/datasets/lightonai/ms-marco-en-bge-gemma) dataset. It maps sentences and paragraphs to sequences of 128-dimensional dense vectors and can be used for semantic textual similarity using the MaxSim operator.
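The MaxSim (late-interaction) operator can be sketched in plain Python: for each query token embedding, take the maximum dot product against all document token embeddings, then sum these maxima over the query tokens. This is an illustrative re-implementation under toy 2-dimensional vectors, not PyLate's optimized code (the real model produces 128-dimensional vectors per token):

```python
def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def maxsim(query_embeddings, document_embeddings):
    """ColBERT late-interaction score: for every query token, keep the
    best-matching document token score, then sum these maxima."""
    return sum(
        max(dot(q, d) for d in document_embeddings)
        for q in query_embeddings
    )

# Toy 2-d token embeddings (made up for illustration).
query = [[1.0, 0.0], [0.0, 1.0]]
doc_a = [[1.0, 0.0], [0.5, 0.5]]    # matches the query well
doc_b = [[-1.0, 0.0], [0.0, -1.0]]  # matches poorly

print(maxsim(query, doc_a))  # 1.0 + 0.5 = 1.5
print(maxsim(query, doc_b))  # 0.0
```

Because scoring is per query token, MaxSim preserves token-level matching signals that a single pooled vector would average away.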
## Model Details

### Model Description
- **Model Type:** PyLate model
- **Base model:** [jhu-clsp/ettin-encoder-17m](https://huggingface.co/jhu-clsp/ettin-encoder-17m)
- **Document Length:** 300 tokens
- **Query Length:** 32 tokens
- **Output Dimensionality:** 128 dimensions
- **Similarity Function:** MaxSim
- **Training Dataset:**
  - [ms-marco-en-bge-gemma](https://huggingface.co/datasets/lightonai/ms-marco-en-bge-gemma)
- **Language:** en

### Model Sources
- **Documentation:** [PyLate Documentation](https://lightonai.github.io/pylate/)
- **Repository:** [PyLate on GitHub](https://github.com/lightonai/pylate)
- **Hugging Face:** [PyLate models on Hugging Face](https://huggingface.co/models?library=PyLate)

### Full Model Architecture

```
ColBERT(
  (0): Transformer({'max_seq_length': 299, 'do_lower_case': False}) with Transformer model: ModernBertModel
  (1): Dense({'in_features': 256, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
```

## Usage

First install the PyLate library:

```bash
pip install -U pylate
```

### Retrieval

PyLate provides a streamlined interface to index and retrieve documents using ColBERT models. The index leverages the Voyager HNSW index to efficiently handle document embeddings and enable fast retrieval.
#### Indexing documents

First, load the ColBERT model and initialize the Voyager index, then encode and index your documents:

```python
from pylate import indexes, models, retrieve

# Step 1: Load the ColBERT model (replace pylate_model_id with this model's Hub id or a local path)
model = models.ColBERT(
    model_name_or_path=pylate_model_id,
)

# Step 2: Initialize the Voyager index
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
    override=True,  # This overwrites the existing index if any
)

# Step 3: Encode the documents
documents_ids = ["1", "2", "3"]
documents = ["document 1 text", "document 2 text", "document 3 text"]

documents_embeddings = model.encode(
    documents,
    batch_size=32,
    is_query=False,  # Ensure that it is set to False to indicate that these are documents, not queries
    show_progress_bar=True,
)

# Step 4: Add document embeddings to the index by providing embeddings and corresponding ids
index.add_documents(
    documents_ids=documents_ids,
    documents_embeddings=documents_embeddings,
)
```

Note that you do not have to recreate the index and encode the documents every time. Once you have created an index and added the documents, you can reuse the index later by loading it:

```python
# To load an index, simply instantiate it with the correct folder/name and without overriding it
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
)
```

#### Retrieving top-k documents for queries

Once the documents are indexed, you can retrieve the top-k most relevant documents for a given set of queries.
To do so, initialize the ColBERT retriever with the index you want to search in, encode the queries, and then retrieve the top-k documents to get the ids and relevance scores of the top matches:

```python
# Step 1: Initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)

# Step 2: Encode the queries
queries_embeddings = model.encode(
    ["query for document 3", "query for document 1"],
    batch_size=32,
    is_query=True,  # Ensure that it is set to True to indicate that these are queries, not documents
    show_progress_bar=True,
)

# Step 3: Retrieve top-k documents
scores = retriever.retrieve(
    queries_embeddings=queries_embeddings,
    k=10,  # Retrieve the top 10 matches for each query
)
```

### Reranking

If you only want to use the ColBERT model to perform reranking on top of your first-stage retrieval pipeline without building an index, you can simply use the `rank` function and pass the queries and documents to rerank:

```python
from pylate import rank, models

queries = [
    "query A",
    "query B",
]

documents = [
    ["document A", "document B"],
    ["document 1", "document C", "document B"],
]

documents_ids = [
    [1, 2],
    [1, 3, 2],
]

model = models.ColBERT(
    model_name_or_path=pylate_model_id,
)

queries_embeddings = model.encode(
    queries,
    is_query=True,
)

documents_embeddings = model.encode(
    documents,
    is_query=False,
)

reranked_documents = rank.rerank(
    documents_ids=documents_ids,
    queries_embeddings=queries_embeddings,
    documents_embeddings=documents_embeddings,
)
```

## Evaluation

### Metrics

#### Py Late Information Retrieval

* Dataset: `['NanoClimateFEVER', 'NanoDBPedia', 'NanoFEVER', 'NanoFiQA2018', 'NanoHotpotQA', 'NanoMSMARCO', 'NanoNFCorpus', 'NanoNQ', 'NanoQuoraRetrieval', 'NanoSCIDOCS', 'NanoArguAna', 'NanoSciFact', 'NanoTouche2020']`
* Evaluated with `pylate.evaluation.pylate_information_retrieval_evaluator.PyLateInformationRetrievalEvaluator`

| Metric              | NanoClimateFEVER | NanoDBPedia | NanoFEVER | NanoFiQA2018 | NanoHotpotQA | NanoMSMARCO | NanoNFCorpus | NanoNQ    | NanoQuoraRetrieval | NanoSCIDOCS | NanoArguAna | NanoSciFact | NanoTouche2020 |
|:--------------------|:-----------------|:------------|:----------|:-------------|:-------------|:------------|:-------------|:----------|:-------------------|:------------|:------------|:------------|:---------------|
| MaxSim_accuracy@1   | 0.26             | 0.72        | 0.84      | 0.42         | 0.92         | 0.5         | 0.4          | 0.54      | 0.82               | 0.42        | 0.2         | 0.6         | 0.6939         |
| MaxSim_accuracy@3   | 0.44             | 0.86        | 0.94      | 0.6          | 0.98         | 0.66        | 0.52         | 0.76      | 0.96               | 0.6         | 0.48        | 0.76        | 0.9184         |
| MaxSim_accuracy@5   | 0.48             | 0.9         | 0.96      | 0.72         | 1.0          | 0.68        | 0.58         | 0.8       | 1.0                | 0.66        | 0.64        | 0.82        | 0.9592         |
| MaxSim_accuracy@10  | 0.72             | 0.94        | 0.98      | 0.76         | 1.0          | 0.78        | 0.68         | 0.84      | 1.0                | 0.74        | 0.76        | 0.88        | 1.0            |
| MaxSim_precision@1  | 0.26             | 0.72        | 0.84      | 0.42         | 0.92         | 0.5         | 0.4          | 0.54      | 0.82               | 0.42        | 0.2         | 0.6         | 0.6939         |
| MaxSim_precision@3  | 0.1733           | 0.6067      | 0.3333    | 0.2667       | 0.54         | 0.22        | 0.34         | 0.2533    | 0.3867             | 0.2867      | 0.16        | 0.2667      | 0.6395         |
| MaxSim_precision@5  | 0.12             | 0.572       | 0.208     | 0.228        | 0.34         | 0.136       | 0.32         | 0.168     | 0.244              | 0.224       | 0.128       | 0.18        | 0.6367         |
| MaxSim_precision@10 | 0.102            | 0.49        | 0.106     | 0.14         | 0.18         | 0.078       | 0.28         | 0.09      | 0.128              | 0.15        | 0.076       | 0.098       | 0.4939         |
| MaxSim_recall@1     | 0.12             | 0.0812      | 0.7767    | 0.2159       | 0.46         | 0.5         | 0.0335       | 0.51      | 0.734              | 0.0867      | 0.2         | 0.575       | 0.0494         |
| MaxSim_recall@3     | 0.23             | 0.164       | 0.9033    | 0.3558       | 0.81         | 0.66        | 0.0771       | 0.7       | 0.912              | 0.1767      | 0.48        | 0.74        | 0.1304         |
| MaxSim_recall@5     | 0.2567           | 0.2287      | 0.93      | 0.4973       | 0.85         | 0.68        | 0.0957       | 0.76      | 0.956              | 0.2287      | 0.64        | 0.815       | 0.2136         |
| MaxSim_recall@10    | 0.4              | 0.3339      | 0.95      | 0.584        | 0.9          | 0.78        | 0.144        | 0.81      | 0.9727             | 0.3067      | 0.76        | 0.87        | 0.3156         |
| **MaxSim_ndcg@10**  | **0.3059**       | **0.5944**  | **0.8864**| **0.4729**   | **0.8634**   | **0.6351**  | **0.3302**   | **0.6699**| **0.9086**         | **0.3142**  | **0.4545**  | **0.7357**  | **0.5613**     |
| MaxSim_mrr@10       | 0.3818           | 0.7956      | 0.8903    | 0.5344       | 0.9507       | 0.5894      | 0.4766       | 0.6447    | 0.9                | 0.5304      | 0.3582      | 0.6952      | 0.8086         |
| MaxSim_map@100      | 0.2305           | 0.4761      | 0.8591    | 0.3849       | 0.8069       | 0.6008      | 0.1419       | 0.6201    | 0.8814             | 0.2477      | 0.3653      | 0.6912      | 0.4155         |

#### Nano BEIR

* Dataset: `NanoBEIR_mean`
* Evaluated with `pylate.evaluation.nano_beir_evaluator.NanoBEIREvaluator`

| Metric              | Value      |
|:--------------------|:-----------|
| MaxSim_accuracy@1   | 0.5641     |
| MaxSim_accuracy@3   | 0.7291     |
| MaxSim_accuracy@5   | 0.7846     |
| MaxSim_accuracy@10  | 0.8523     |
| MaxSim_precision@1  | 0.5641     |
| MaxSim_precision@3  | 0.3441     |
| MaxSim_precision@5  | 0.2696     |
| MaxSim_precision@10 | 0.1855     |
| MaxSim_recall@1     | 0.334      |
| MaxSim_recall@3     | 0.4876     |
| MaxSim_recall@5     | 0.5501     |
| MaxSim_recall@10    | 0.6251     |
| **MaxSim_ndcg@10**  | **0.5948** |
| MaxSim_mrr@10       | 0.6581     |
| MaxSim_map@100      | 0.517      |

## Training Details

### Training Dataset

#### ms-marco-en-bge-gemma

* Dataset: [ms-marco-en-bge-gemma](https://huggingface.co/datasets/lightonai/ms-marco-en-bge-gemma) at [d8bad49](https://huggingface.co/datasets/lightonai/ms-marco-en-bge-gemma/tree/d8bad497c8bd698c868a49721999c386d5e6ae8f)
* Size: 533,177 training samples
* Columns: `query_id`, `document_ids`, and `scores`
* Approximate statistics based on the first 1000 samples:

| | query_id | document_ids | scores |
|:--------|:---------|:-------------|:-------|
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  |:--------|:---------|:-------------|:-------|
  | type    | int      | list         | list   |
  | details |          |              |        |
* Samples:
  | query_id | document_ids | scores |
  |:---------|:--------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------|
  | 685613 | [7546874, 1176459, 197677, 2306318, 8541504, ...] | [0.9999999992804947, 0.24845418756716053, 0.7594154013647826, 0.26644182105618575, 0.390668914839766, ...] |
  | 237784 | [6366584, 4034101, 2325374, 6914618, 6042146, ...] | [0.9999999991784339, 0.42233632827946693, 0.5956354295491569, 0.12644415907455164, 0.6636713730105909, ...] |
  | 904294 | [448408, 8743975, 49600, 7339401, 2714261, ...] | [0.9999999991841937, 0.877629062381539, 0.8330146583389045, 0.3116634796692611, 0.4633524534142185, ...] |
* Loss: `pylate.losses.distillation.Distillation`

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `learning_rate`: 3e-05
- `num_train_epochs`: 1
- `bf16`: True

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 3e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>
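All `MaxSim_*` metrics in this card are computed with ColBERT-style late interaction rather than a single-vector dot product: each query token embedding is matched against its most similar document token embedding, and those per-token maxima are summed into one relevance score. A minimal plain-Python sketch of the operator, using toy 2-D embeddings (illustrative only; the actual model produces learned per-token vectors):

```python
import math

def maxsim(query_embs, doc_embs):
    """ColBERT-style MaxSim: for each query token embedding, take the
    maximum dot product over all document token embeddings, then sum
    those maxima into a single query-document relevance score."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return sum(max(dot(q, d) for d in doc_embs) for q in query_embs)

# Toy example: two query tokens, two document tokens (unit vectors).
query = [[1.0, 0.0], [0.0, 1.0]]
doc = [[1.0, 0.0], [math.sqrt(0.5), math.sqrt(0.5)]]
score = maxsim(query, doc)  # 1.0 + sqrt(0.5) ≈ 1.707
```

During distillation training, the student's MaxSim scores are pushed toward the teacher scores stored in the `scores` column of the training dataset.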
### Training Logs
| Epoch  | Step  | Training Loss | NanoClimateFEVER_MaxSim_ndcg@10 | NanoDBPedia_MaxSim_ndcg@10 | NanoFEVER_MaxSim_ndcg@10 | NanoFiQA2018_MaxSim_ndcg@10 | NanoHotpotQA_MaxSim_ndcg@10 | NanoMSMARCO_MaxSim_ndcg@10 | NanoNFCorpus_MaxSim_ndcg@10 | NanoNQ_MaxSim_ndcg@10 | NanoQuoraRetrieval_MaxSim_ndcg@10 | NanoSCIDOCS_MaxSim_ndcg@10 | NanoArguAna_MaxSim_ndcg@10 | NanoSciFact_MaxSim_ndcg@10 | NanoTouche2020_MaxSim_ndcg@10 | NanoBEIR_mean_MaxSim_ndcg@10 |
|:------:|:-----:|:-------------:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.0030 | 100 | 0.0366 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0060 | 200 | 0.0325 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0090 | 300 | 0.0308 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0120 | 400 | 0.0277 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0150 | 500 | 0.0268 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0180 | 600 | 0.0264 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0210 | 700 | 0.0254 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0240 | 800 | 0.0247 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0270 | 900 | 0.0246 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0300 | 1000 | 0.0244 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0330 | 1100 | 0.0242 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0360 | 1200 | 0.023 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0390 | 1300 | 0.0233 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0420 | 1400 | 0.0224 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0450 | 1500 | 0.0233 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0480 | 1600 | 0.0221 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0510 | 1700 | 0.0222 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0540 | 1800 | 0.0216 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0570 | 1900 | 0.0215 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0600 | 2000 | 0.0211 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0630 | 2100 | 0.021 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0660 | 2200 | 0.0208 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0690 | 2300 | 0.0205 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0720 | 2400 | 0.0207 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0750 | 2500 | 0.0204 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0780 | 2600 | 0.0201 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0810 | 2700 | 0.0199 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0840 | 2800 | 0.0197 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0870 | 2900 | 0.0201 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0900 | 3000 | 0.0187 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0930 | 3100 | 0.0198 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0960 | 3200 | 0.0192 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0990 | 3300 | 0.0194 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1020 | 3400 | 0.0188 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1050 | 3500 | 0.0193 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1080 | 3600 | 0.0193 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1110 | 3700 | 0.0186 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1140 | 3800 | 0.0187 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1170 | 3900 | 0.0186 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1200 | 4000 | 0.0186 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1230 | 4100 | 0.0181 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1260 | 4200 | 0.0181 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1290 | 4300 | 0.0182 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1320 | 4400 | 0.0184 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1350 | 4500 | 0.0178 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1380 | 4600 | 0.017 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1410 | 4700 | 0.0175 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1440 | 4800 | 0.0174 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1470 | 4900 | 0.0176 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1500 | 5000 | 0.0177 | 0.2830 | 0.5426 | 0.8641 | 0.4244 | 0.8531 | 0.6244 | 0.3106 | 0.5798 | 0.9230 | 0.3109 | 0.3815 | 0.7461 | 0.5805 | 0.5711 |
| 0.1530 | 5100 | 0.0172 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1560 | 5200 | 0.017 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1590 | 5300 | 0.0173 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1620 | 5400 | 0.0169 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1650 | 5500 | 0.017 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1680 | 5600 | 0.0168 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1710 | 5700 | 0.0171 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1740 | 5800 | 0.0165 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1770 | 5900 | 0.0165 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1801 | 6000 | 0.0169 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1831 | 6100 | 0.0167 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1861 | 6200 | 0.0169 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1891 | 6300 | 0.0163 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1921 | 6400 | 0.0165 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1951 | 6500 | 0.0163 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1981 | 6600 | 0.0166 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2011 | 6700 | 0.0163 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2041 | 6800 | 0.0162 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2071 | 6900 | 0.0163 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2101 | 7000 | 0.016 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2131 | 7100 | 0.0164 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2161 | 7200 | 0.0162 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2191 | 7300 | 0.0158 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2221 | 7400 | 0.0156 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2251 | 7500 | 0.0154 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2281 | 7600 | 0.016 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2311 | 7700 | 0.0157 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2341 | 7800 | 0.0161 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2371 | 7900 | 0.0157 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2401 | 8000 | 0.0156 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2431 | 8100 | 0.0156 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2461 | 8200 | 0.0156 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2491 | 8300 | 0.0154 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2521 | 8400 | 0.0153 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2551 | 8500 | 0.0157 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2581 | 8600 | 0.0155 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2611 | 8700 | 0.0156 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2641 | 8800 | 0.0155 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2671 | 8900 | 0.0154 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2701 | 9000 | 0.0152 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2731 | 9100 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2761 | 9200 | 0.0154 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2791 | 9300 | 0.0149 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2821 | 9400 | 0.0149 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2851 | 9500 | 0.0153 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2881 | 9600 | 0.0146 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2911 | 9700 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2941 | 9800 | 0.015 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2971 | 9900 | 0.0149 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3001 | 10000 | 0.0146 | 0.2674 | 0.5466 | 0.8739 | 0.4547 | 0.8499 | 0.5933 | 0.3170 | 0.6256 | 0.9321 | 0.3137 | 0.3855 | 0.7387 | 0.5768 | 0.5750 |
| 0.3031 | 10100 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3061 | 10200 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3091 | 10300 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3121 | 10400 | 0.0148 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3151 | 10500 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3181 | 10600 | 0.0143 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3211 | 10700 | 0.0144 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3241 | 10800 | 0.0145 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3271 | 10900 | 0.0148 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3301 | 11000 | 0.015 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3331 | 11100 | 0.0147 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3361 | 11200 | 0.0148 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3391 | 11300 | 0.0145 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3421 | 11400 | 0.014 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3451 | 11500 | 0.0146 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3481 | 11600 | 0.0143 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3511 | 11700 | 0.0142 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3541 | 11800 | 0.0141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3571 | 11900 | 0.0139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3601 | 12000 | 0.0141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3631 | 12100 | 0.0142 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3661 | 12200 | 0.0143 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3691 | 12300 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3721 | 12400 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3751 | 12500 | 0.0139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3781 | 12600 | 0.0142 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3811 | 12700 | 0.0136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3841 | 12800 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3871 | 12900 | 0.014 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3901 | 13000 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3931 | 13100 | 0.014 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3961 | 13200 | 0.0141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3991 | 13300 | 0.014 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4021 | 13400 | 0.0139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4051 | 13500 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4081 | 13600 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4111 | 13700 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4141 | 13800 | 0.0136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4171 | 13900 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4201 | 14000 | 0.0139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4231 | 14100 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4261 | 14200 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4291 | 14300 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4321 | 14400 | 0.0135 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4351 | 14500 | 0.0133 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4381 | 14600 | 0.0132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4411 | 14700 | 0.0134 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4441 | 14800 | 0.0132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4471 | 14900 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4501 | 15000 | 0.0131 | 0.2885 | 0.5666 | 0.8718 | 0.4695 | 0.8453 | 0.6405 | 0.3128 | 0.6500 | 0.9257 | 0.3081 | 0.3923 | 0.7361 | 0.5852 | 0.5840 |
| 0.4531 | 15100 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4561 | 15200 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4591 | 15300 | 0.0134 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4621 | 15400 | 0.0132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4651 | 15500 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4681 | 15600 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4711 | 15700 | 0.0134 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4741 | 15800 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4771 | 15900 | 0.0136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4801 | 16000 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4831 | 16100 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4861 | 16200 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4891 | 16300 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4921 | 16400 | 0.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4951 | 16500 | 0.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4981 | 16600 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5011 | 16700 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5041 | 16800 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5071 | 16900 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5101 | 17000 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5131 | 17100 | 0.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5161 | 17200 | 0.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5191 | 17300 | 0.013 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5221 | 17400 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5251 | 17500 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5281 | 17600 | 0.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5311 | 17700 | 0.0132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5341 | 17800 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5372 | 17900 | 0.0128 | - | - |
- | - | - | - | - | - | - | - | - | - | - | - | | 0.5402 | 18000 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5432 | 18100 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5462 | 18200 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5492 | 18300 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5522 | 18400 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5552 | 18500 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5582 | 18600 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5612 | 18700 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5642 | 18800 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5672 | 18900 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5702 | 19000 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5732 | 19100 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5762 | 19200 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5792 | 19300 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5822 | 19400 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5852 | 19500 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5882 | 19600 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5912 | 19700 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5942 | 19800 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.5972 | 19900 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6002 | 20000 | 0.0124 | 0.2807 | 0.5738 | 0.8748 | 0.4586 | 0.8533 | 0.6174 | 0.3227 | 0.6215 | 0.9219 | 0.3104 | 0.4132 | 0.7348 | 0.5696 | 0.5810 | | 0.6032 | 20100 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6062 | 20200 | 0.0122 | - | - 
| - | - | - | - | - | - | - | - | - | - | - | - | | 0.6092 | 20300 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6122 | 20400 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6152 | 20500 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6182 | 20600 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6212 | 20700 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6242 | 20800 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6272 | 20900 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6302 | 21000 | 0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6332 | 21100 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6362 | 21200 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6392 | 21300 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6422 | 21400 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6452 | 21500 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6482 | 21600 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6512 | 21700 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6542 | 21800 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6572 | 21900 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6602 | 22000 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6632 | 22100 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6662 | 22200 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6692 | 22300 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6722 | 22400 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6752 | 22500 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6782 | 22600 | 
0.0125 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6812 | 22700 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6842 | 22800 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6872 | 22900 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6902 | 23000 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6932 | 23100 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6962 | 23200 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.6992 | 23300 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7022 | 23400 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7052 | 23500 | 0.0124 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7082 | 23600 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7112 | 23700 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7142 | 23800 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7172 | 23900 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7202 | 24000 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7232 | 24100 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7262 | 24200 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7292 | 24300 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7322 | 24400 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7352 | 24500 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7382 | 24600 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7412 | 24700 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7442 | 24800 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7472 | 24900 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 
0.7502 | 25000 | 0.0118 | 0.2940 | 0.5881 | 0.8842 | 0.4741 | 0.8653 | 0.6288 | 0.3414 | 0.6734 | 0.9000 | 0.3151 | 0.4361 | 0.7387 | 0.5610 | 0.5923 | | 0.7532 | 25100 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7562 | 25200 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7592 | 25300 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7622 | 25400 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7652 | 25500 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7682 | 25600 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7712 | 25700 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7742 | 25800 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7772 | 25900 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7802 | 26000 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7832 | 26100 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7862 | 26200 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7892 | 26300 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7922 | 26400 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7952 | 26500 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.7982 | 26600 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8012 | 26700 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8042 | 26800 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8072 | 26900 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8102 | 27000 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8132 | 27100 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8162 | 27200 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 
0.8192 | 27300 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8222 | 27400 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8252 | 27500 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8282 | 27600 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8312 | 27700 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8342 | 27800 | 0.0112 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8372 | 27900 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8402 | 28000 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8432 | 28100 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8462 | 28200 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8492 | 28300 | 0.0116 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8522 | 28400 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8552 | 28500 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8582 | 28600 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8612 | 28700 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8642 | 28800 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8672 | 28900 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8702 | 29000 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8732 | 29100 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8762 | 29200 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8792 | 29300 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8822 | 29400 | 0.0114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8852 | 29500 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8882 | 29600 | 0.0118 | - | - | - | - | - | - | - | - | - | - | - | 
- | - | - | | 0.8912 | 29700 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8943 | 29800 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.8973 | 29900 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.9003 | 30000 | 0.0113 | 0.3059 | 0.5944 | 0.8864 | 0.4729 | 0.8634 | 0.6351 | 0.3302 | 0.6699 | 0.9086 | 0.3142 | 0.4545 | 0.7357 | 0.5613 | 0.5948 |
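The evaluation columns in the training log above are MaxSim metrics, i.e. ColBERT-style late-interaction relevance between per-token query and document embeddings. As a purely illustrative sketch (not PyLate's actual implementation), the MaxSim score between two token-embedding matrices can be computed as:

```python
import numpy as np

def maxsim_score(query_emb: np.ndarray, doc_emb: np.ndarray) -> float:
    """ColBERT-style late interaction: for each query token embedding,
    take the maximum cosine similarity over all document token
    embeddings, then sum over query tokens."""
    # L2-normalize token embeddings so dot products become cosine similarities.
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    d = doc_emb / np.linalg.norm(doc_emb, axis=1, keepdims=True)
    sim = q @ d.T                        # (num_query_tokens, num_doc_tokens)
    return float(sim.max(axis=1).sum())  # max over doc tokens, sum over query tokens

# Identical query and document tokens: each query token matches itself
# perfectly, so the score equals the number of query tokens.
q = np.array([[1.0, 0.0], [0.0, 1.0]])
print(maxsim_score(q, q))  # → 2.0
```

Because every query token is matched independently against the whole document, MaxSim rewards documents that cover all query terms, which is what the `MaxSim_*` retrieval metrics in the log measure.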
### Framework Versions
- Python: 3.11.13
- Sentence Transformers: 4.0.2
- PyLate: 1.2.0
- Transformers: 4.48.2
- PyTorch: 2.6.0+cu124
- Accelerate: 1.10.0
- Datasets: 4.0.0
- Tokenizers: 0.21.4

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084"
}
```

#### PyLate
```bibtex
@misc{PyLate,
    title={PyLate: Flexible Training and Retrieval for Late Interaction Models},
    author={Chaffin, Antoine and Sourty, Raphaël},
    url={https://github.com/lightonai/pylate},
    year={2024}
}
```