PyLate model based on Speedsy/turkish-multilingual-e5-small-32768

This is a PyLate model finetuned from Speedsy/turkish-multilingual-e5-small-32768 on the train dataset. It maps sentences & paragraphs to sequences of 128-dimensional dense vectors and can be used for semantic textual similarity using the MaxSim operator.
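
For intuition, the MaxSim operator scores a query against a document by matching every query token embedding to its most similar document token embedding and summing those maxima. The snippet below is a minimal, illustrative sketch of that scoring in plain PyTorch; the random tensors stand in for real token embeddings, and this is not PyLate's internal implementation:

import torch

# Toy token embeddings: 4 query tokens and 12 document tokens, 128 dimensions each
query_embeddings = torch.nn.functional.normalize(torch.randn(4, 128), dim=-1)
document_embeddings = torch.nn.functional.normalize(torch.randn(12, 128), dim=-1)

# Token-to-token similarity matrix of shape (query_tokens, document_tokens)
similarities = query_embeddings @ document_embeddings.T

# MaxSim: take the best-matching document token for each query token, then sum
maxsim_score = similarities.max(dim=1).values.sum()
print(maxsim_score)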

Model Details

Model Description

Model Sources

Full Model Architecture

ColBERT(
  (0): Transformer({'max_seq_length': 179, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Dense({'in_features': 384, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
)

Usage

First install the PyLate library:

pip install -U pylate

Retrieval

PyLate provides a streamlined interface to index and retrieve documents using ColBERT models. The index leverages the Voyager HNSW index to efficiently handle document embeddings and enable fast retrieval.

Indexing documents

First, load the ColBERT model and initialize the Voyager index, then encode and index your documents:

from pylate import indexes, models, retrieve

# Step 1: Load the ColBERT model
pylate_model_id = "Speedsy/turkish-multilingual-e5-small-32768-colbert-cleaned-data-32bsize-8500"
model = models.ColBERT(
    model_name_or_path=pylate_model_id,
)

# Step 2: Initialize the Voyager index
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
    override=True,  # This overwrites the existing index if any
)

# Step 3: Encode the documents
documents_ids = ["1", "2", "3"]
documents = ["document 1 text", "document 2 text", "document 3 text"]

documents_embeddings = model.encode(
    documents,
    batch_size=32,
    is_query=False,  # Ensure that it is set to False to indicate that these are documents, not queries
    show_progress_bar=True,
)

# Step 4: Add document embeddings to the index by providing embeddings and corresponding ids
index.add_documents(
    documents_ids=documents_ids,
    documents_embeddings=documents_embeddings,
)

Note that you do not have to recreate the index and encode the documents every time. Once you have created an index and added the documents, you can re-use the index later by loading it:

# To load an index, simply instantiate it with the correct folder/name and without overriding it
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
)

Retrieving top-k documents for queries

Once the documents are indexed, you can retrieve the top-k most relevant documents for a given set of queries. To do so, initialize the ColBERT retriever with the index you want to search in, encode the queries, and then retrieve the top-k documents to get the ids and relevance scores of the top matches:

# Step 1: Initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)

# Step 2: Encode the queries
queries_embeddings = model.encode(
    ["query for document 3", "query for document 1"],
    batch_size=32,
    is_query=True,  # Ensure that it is set to True to indicate that these are queries, not documents
    show_progress_bar=True,
)

# Step 3: Retrieve top-k documents
scores = retriever.retrieve(
    queries_embeddings=queries_embeddings,
    k=10,  # Retrieve the top 10 matches for each query
)
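
The retriever returns one result list per query, ordered by decreasing MaxSim relevance. A minimal way to inspect the matches (the exact entry format, typically an id/score pair, may vary slightly across PyLate versions):

# Step 4: Inspect the matches for each query
for query, query_results in zip(["query for document 3", "query for document 1"], scores):
    print(query)
    for result in query_results:
        print("   ", result)  # typically a document id together with its relevance score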

Reranking

If you only want to use the ColBERT model to perform reranking on top of your first-stage retrieval pipeline without building an index, you can simply use the rank function and pass the queries and documents to rerank:

from pylate import rank, models

queries = [
    "query A",
    "query B",
]

documents = [
    ["document A", "document B"],
    ["document 1", "document C", "document B"],
]

documents_ids = [
    [1, 2],
    [1, 3, 2],
]

pylate_model_id = "Speedsy/turkish-multilingual-e5-small-32768-colbert-cleaned-data-32bsize-8500"
model = models.ColBERT(
    model_name_or_path=pylate_model_id,
)

queries_embeddings = model.encode(
    queries,
    is_query=True,
)

documents_embeddings = model.encode(
    documents,
    is_query=False,
)

reranked_documents = rank.rerank(
    documents_ids=documents_ids,
    queries_embeddings=queries_embeddings,
    documents_embeddings=documents_embeddings,
)
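
The reranked results keep the same nesting as the input, one list per query, reordered by MaxSim score. As above, the exact entry format may differ slightly between PyLate versions:

for query, results in zip(queries, reranked_documents):
    print(query)
    for result in results:
        print("   ", result)  # typically a document id with its MaxSim relevance score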

Evaluation

Metrics

PyLate Information Retrieval

  • Dataset: ['NanoDBPedia', 'NanoFiQA2018', 'NanoHotpotQA', 'NanoMSMARCO', 'NanoNQ', 'NanoSCIDOCS']
  • Evaluated with pylate.evaluation.pylate_information_retrieval_evaluator.PyLateInformationRetrievalEvaluator
Metric NanoDBPedia NanoFiQA2018 NanoHotpotQA NanoMSMARCO NanoNQ NanoSCIDOCS
MaxSim_accuracy@1 0.82 0.36 0.82 0.34 0.56 0.34
MaxSim_accuracy@3 0.94 0.5 0.96 0.58 0.7 0.6
MaxSim_accuracy@5 0.94 0.58 0.96 0.64 0.76 0.68
MaxSim_accuracy@10 0.98 0.68 0.96 0.7 0.82 0.78
MaxSim_precision@1 0.82 0.36 0.82 0.34 0.56 0.34
MaxSim_precision@3 0.64 0.22 0.4933 0.1933 0.2333 0.2667
MaxSim_precision@5 0.572 0.172 0.32 0.128 0.156 0.22
MaxSim_precision@10 0.508 0.11 0.162 0.07 0.088 0.15
MaxSim_recall@1 0.1111 0.1934 0.41 0.34 0.52 0.0717
MaxSim_recall@3 0.1968 0.3188 0.74 0.58 0.65 0.1657
MaxSim_recall@5 0.2556 0.3966 0.8 0.64 0.72 0.2257
MaxSim_recall@10 0.3626 0.5055 0.81 0.7 0.78 0.3067
MaxSim_ndcg@10 0.649 0.3983 0.7778 0.5157 0.6601 0.298
MaxSim_mrr@10 0.8792 0.4519 0.88 0.4562 0.6449 0.491
MaxSim_map@100 0.5154 0.3274 0.7204 0.4692 0.6141 0.2258

PyLate Custom NanoBEIR

  • Dataset: NanoBEIR_mean
  • Evaluated with pylate_nano_beir_evaluator.PylateCustomNanoBEIREvaluator
Metric Value
MaxSim_accuracy@1 0.54
MaxSim_accuracy@3 0.7133
MaxSim_accuracy@5 0.76
MaxSim_accuracy@10 0.82
MaxSim_precision@1 0.54
MaxSim_precision@3 0.3411
MaxSim_precision@5 0.2613
MaxSim_precision@10 0.1813
MaxSim_recall@1 0.2744
MaxSim_recall@3 0.4419
MaxSim_recall@5 0.5063
MaxSim_recall@10 0.5775
MaxSim_ndcg@10 0.5498
MaxSim_mrr@10 0.6339
MaxSim_map@100 0.4787

Training Details

Training Dataset

train

  • Dataset: train at 1072b6b
  • Size: 443,147 training samples
  • Columns: query_id, document_ids, and scores
  • Approximate statistics based on the first 1000 samples:
    • query_id: string (min: 5 tokens, mean: 5.83 tokens, max: 6 tokens)
    • document_ids: list (size: 32 elements)
    • scores: list (size: 32 elements)
  • Samples:
    query_id document_ids scores
    817836 ['2716076', '6741935', '2681109', '5562684', '3507339', ...] [1.0, 0.7059561610221863, 0.21702419221401215, 0.38270196318626404, 0.20812414586544037, ...]
    1045170 ['5088671', '2953295', '8783471', '4268439', '6339935', ...] [1.0, 0.6493034362792969, 0.0692221149802208, 0.17963139712810516, 0.6697239875793457, ...]
    1069432 ['3724008', '314949', '8657336', '7420456', '879004', ...] [1.0, 0.3706032931804657, 0.3508036434650421, 0.2823200523853302, 0.17563475668430328, ...]
  • Loss: pylate.losses.distillation.Distillation
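
Each training sample pairs a query with 32 candidate documents and 32 teacher relevance scores, and the Distillation loss trains the ColBERT student to reproduce the teacher's score distribution over those candidates. The snippet below is a minimal, framework-agnostic sketch of that objective; the KL-divergence formulation and tensor shapes are illustrative assumptions, not PyLate's exact implementation:

import torch
import torch.nn.functional as F

def distillation_loss(student_scores: torch.Tensor, teacher_scores: torch.Tensor) -> torch.Tensor:
    """KL divergence between teacher and student score distributions.

    Both tensors have shape (batch_size, num_candidates): one row per query,
    one column per candidate document.
    """
    return F.kl_div(
        F.log_softmax(student_scores, dim=-1),  # student MaxSim scores as log-probabilities
        F.softmax(teacher_scores, dim=-1),      # teacher scores from the dataset as probabilities
        reduction="batchmean",
    )

# Toy example: random student MaxSim scores against random teacher scores
student = torch.randn(8, 32)
teacher = torch.randn(8, 32)
print(distillation_loss(student, teacher))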

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 32
  • learning_rate: 3e-05
  • num_train_epochs: 1
  • bf16: True

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 3e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss NanoDBPedia_MaxSim_ndcg@10 NanoFiQA2018_MaxSim_ndcg@10 NanoHotpotQA_MaxSim_ndcg@10 NanoMSMARCO_MaxSim_ndcg@10 NanoNQ_MaxSim_ndcg@10 NanoSCIDOCS_MaxSim_ndcg@10 NanoBEIR_mean_MaxSim_ndcg@10
0.0014 20 0.0332 - - - - - - -
0.0029 40 0.0297 - - - - - - -
0.0043 60 0.0285 - - - - - - -
0.0058 80 0.0289 - - - - - - -
0.0072 100 0.0278 - - - - - - -
0.0087 120 0.0271 - - - - - - -
0.0101 140 0.0267 - - - - - - -
0.0116 160 0.0269 - - - - - - -
0.0130 180 0.0264 - - - - - - -
0.0144 200 0.0262 - - - - - - -
0.0159 220 0.0259 - - - - - - -
0.0173 240 0.026 - - - - - - -
0.0188 260 0.0251 - - - - - - -
0.0202 280 0.025 - - - - - - -
0.0217 300 0.025 - - - - - - -
0.0231 320 0.0261 - - - - - - -
0.0246 340 0.0249 - - - - - - -
0.0260 360 0.0249 - - - - - - -
0.0274 380 0.0243 - - - - - - -
0.0289 400 0.0245 - - - - - - -
0.0303 420 0.0247 - - - - - - -
0.0318 440 0.0245 - - - - - - -
0.0332 460 0.0234 - - - - - - -
0.0347 480 0.0235 - - - - - - -
0.0361 500 0.0245 0.6352 0.3462 0.7674 0.5245 0.6220 0.2769 0.5287
0.0375 520 0.0244 - - - - - - -
0.0390 540 0.0233 - - - - - - -
0.0404 560 0.0239 - - - - - - -
0.0419 580 0.0232 - - - - - - -
0.0433 600 0.0225 - - - - - - -
0.0448 620 0.0234 - - - - - - -
0.0462 640 0.0245 - - - - - - -
0.0477 660 0.0229 - - - - - - -
0.0491 680 0.0232 - - - - - - -
0.0505 700 0.023 - - - - - - -
0.0520 720 0.0238 - - - - - - -
0.0534 740 0.0239 - - - - - - -
0.0549 760 0.0229 - - - - - - -
0.0563 780 0.0237 - - - - - - -
0.0578 800 0.0236 - - - - - - -
0.0592 820 0.0224 - - - - - - -
0.0607 840 0.0226 - - - - - - -
0.0621 860 0.0225 - - - - - - -
0.0635 880 0.023 - - - - - - -
0.0650 900 0.0232 - - - - - - -
0.0664 920 0.0224 - - - - - - -
0.0679 940 0.0227 - - - - - - -
0.0693 960 0.0231 - - - - - - -
0.0708 980 0.0238 - - - - - - -
0.0722 1000 0.0224 0.6463 0.3673 0.7878 0.5283 0.6466 0.2869 0.5439
0.0737 1020 0.0225 - - - - - - -
0.0751 1040 0.0223 - - - - - - -
0.0765 1060 0.023 - - - - - - -
0.0780 1080 0.0218 - - - - - - -
0.0794 1100 0.0228 - - - - - - -
0.0809 1120 0.0219 - - - - - - -
0.0823 1140 0.0225 - - - - - - -
0.0838 1160 0.0231 - - - - - - -
0.0852 1180 0.0233 - - - - - - -
0.0866 1200 0.0224 - - - - - - -
0.0881 1220 0.0223 - - - - - - -
0.0895 1240 0.0216 - - - - - - -
0.0910 1260 0.0227 - - - - - - -
0.0924 1280 0.0218 - - - - - - -
0.0939 1300 0.0222 - - - - - - -
0.0953 1320 0.0218 - - - - - - -
0.0968 1340 0.0216 - - - - - - -
0.0982 1360 0.0227 - - - - - - -
0.0996 1380 0.0211 - - - - - - -
0.1011 1400 0.022 - - - - - - -
0.1025 1420 0.0211 - - - - - - -
0.1040 1440 0.0219 - - - - - - -
0.1054 1460 0.023 - - - - - - -
0.1069 1480 0.0215 - - - - - - -
0.1083 1500 0.022 0.6323 0.3640 0.7771 0.4895 0.6553 0.2903 0.5348
0.1098 1520 0.0222 - - - - - - -
0.1112 1540 0.0222 - - - - - - -
0.1126 1560 0.0227 - - - - - - -
0.1141 1580 0.0225 - - - - - - -
0.1155 1600 0.0222 - - - - - - -
0.1170 1620 0.0217 - - - - - - -
0.1184 1640 0.0217 - - - - - - -
0.1199 1660 0.0224 - - - - - - -
0.1213 1680 0.0215 - - - - - - -
0.1228 1700 0.022 - - - - - - -
0.1242 1720 0.0222 - - - - - - -
0.1256 1740 0.0208 - - - - - - -
0.1271 1760 0.0224 - - - - - - -
0.1285 1780 0.0205 - - - - - - -
0.1300 1800 0.0214 - - - - - - -
0.1314 1820 0.0212 - - - - - - -
0.1329 1840 0.0207 - - - - - - -
0.1343 1860 0.0213 - - - - - - -
0.1357 1880 0.0211 - - - - - - -
0.1372 1900 0.0215 - - - - - - -
0.1386 1920 0.0218 - - - - - - -
0.1401 1940 0.0216 - - - - - - -
0.1415 1960 0.022 - - - - - - -
0.1430 1980 0.0222 - - - - - - -
0.1444 2000 0.0217 0.6472 0.3492 0.7873 0.5109 0.6687 0.3043 0.5446
0.1459 2020 0.022 - - - - - - -
0.1473 2040 0.0204 - - - - - - -
0.1487 2060 0.0215 - - - - - - -
0.1502 2080 0.0215 - - - - - - -
0.1516 2100 0.0217 - - - - - - -
0.1531 2120 0.0214 - - - - - - -
0.1545 2140 0.0217 - - - - - - -
0.1560 2160 0.022 - - - - - - -
0.1574 2180 0.0211 - - - - - - -
0.1589 2200 0.0212 - - - - - - -
0.1603 2220 0.0215 - - - - - - -
0.1617 2240 0.0212 - - - - - - -
0.1632 2260 0.0206 - - - - - - -
0.1646 2280 0.0213 - - - - - - -
0.1661 2300 0.0216 - - - - - - -
0.1675 2320 0.0219 - - - - - - -
0.1690 2340 0.0214 - - - - - - -
0.1704 2360 0.0206 - - - - - - -
0.1719 2380 0.0209 - - - - - - -
0.1733 2400 0.0216 - - - - - - -
0.1747 2420 0.0211 - - - - - - -
0.1762 2440 0.0198 - - - - - - -
0.1776 2460 0.0207 - - - - - - -
0.1791 2480 0.0218 - - - - - - -
0.1805 2500 0.0211 0.6445 0.3645 0.7612 0.5291 0.6565 0.2904 0.5411
0.1820 2520 0.0222 - - - - - - -
0.1834 2540 0.021 - - - - - - -
0.1849 2560 0.021 - - - - - - -
0.1863 2580 0.0213 - - - - - - -
0.1877 2600 0.0214 - - - - - - -
0.1892 2620 0.0216 - - - - - - -
0.1906 2640 0.0206 - - - - - - -
0.1921 2660 0.021 - - - - - - -
0.1935 2680 0.0213 - - - - - - -
0.1950 2700 0.0207 - - - - - - -
0.1964 2720 0.0214 - - - - - - -
0.1978 2740 0.0202 - - - - - - -
0.1993 2760 0.0201 - - - - - - -
0.2007 2780 0.0204 - - - - - - -
0.2022 2800 0.0207 - - - - - - -
0.2036 2820 0.0212 - - - - - - -
0.2051 2840 0.0205 - - - - - - -
0.2065 2860 0.0206 - - - - - - -
0.2080 2880 0.0205 - - - - - - -
0.2094 2900 0.0211 - - - - - - -
0.2108 2920 0.0209 - - - - - - -
0.2123 2940 0.0209 - - - - - - -
0.2137 2960 0.0213 - - - - - - -
0.2152 2980 0.0205 - - - - - - -
0.2166 3000 0.0201 0.6543 0.4016 0.7867 0.5219 0.6615 0.2656 0.5486
0.2181 3020 0.0221 - - - - - - -
0.2195 3040 0.0207 - - - - - - -
0.2210 3060 0.0208 - - - - - - -
0.2224 3080 0.0209 - - - - - - -
0.2238 3100 0.0209 - - - - - - -
0.2253 3120 0.0206 - - - - - - -
0.2267 3140 0.0203 - - - - - - -
0.2282 3160 0.0206 - - - - - - -
0.2296 3180 0.0207 - - - - - - -
0.2311 3200 0.0211 - - - - - - -
0.2325 3220 0.0213 - - - - - - -
0.2340 3240 0.0203 - - - - - - -
0.2354 3260 0.0205 - - - - - - -
0.2368 3280 0.0219 - - - - - - -
0.2383 3300 0.0197 - - - - - - -
0.2397 3320 0.0207 - - - - - - -
0.2412 3340 0.0205 - - - - - - -
0.2426 3360 0.0208 - - - - - - -
0.2441 3380 0.0201 - - - - - - -
0.2455 3400 0.0213 - - - - - - -
0.2469 3420 0.0207 - - - - - - -
0.2484 3440 0.02 - - - - - - -
0.2498 3460 0.0204 - - - - - - -
0.2513 3480 0.0201 - - - - - - -
0.2527 3500 0.0211 0.6481 0.3850 0.7743 0.5167 0.6401 0.2750 0.5399
0.2542 3520 0.021 - - - - - - -
0.2556 3540 0.0208 - - - - - - -
0.2571 3560 0.02 - - - - - - -
0.2585 3580 0.0211 - - - - - - -
0.2599 3600 0.0199 - - - - - - -
0.2614 3620 0.0192 - - - - - - -
0.2628 3640 0.0203 - - - - - - -
0.2643 3660 0.0197 - - - - - - -
0.2657 3680 0.0196 - - - - - - -
0.2672 3700 0.0198 - - - - - - -
0.2686 3720 0.0213 - - - - - - -
0.2701 3740 0.0199 - - - - - - -
0.2715 3760 0.0205 - - - - - - -
0.2729 3780 0.0205 - - - - - - -
0.2744 3800 0.0207 - - - - - - -
0.2758 3820 0.0204 - - - - - - -
0.2773 3840 0.0209 - - - - - - -
0.2787 3860 0.0211 - - - - - - -
0.2802 3880 0.0199 - - - - - - -
0.2816 3900 0.0212 - - - - - - -
0.2831 3920 0.0194 - - - - - - -
0.2845 3940 0.0196 - - - - - - -
0.2859 3960 0.0211 - - - - - - -
0.2874 3980 0.0198 - - - - - - -
0.2888 4000 0.0207 0.6402 0.3955 0.7813 0.4997 0.6360 0.2845 0.5395
0.2903 4020 0.0193 - - - - - - -
0.2917 4040 0.0198 - - - - - - -
0.2932 4060 0.0208 - - - - - - -
0.2946 4080 0.02 - - - - - - -
0.2961 4100 0.0202 - - - - - - -
0.2975 4120 0.0198 - - - - - - -
0.2989 4140 0.0193 - - - - - - -
0.3004 4160 0.0202 - - - - - - -
0.3018 4180 0.0198 - - - - - - -
0.3033 4200 0.0198 - - - - - - -
0.3047 4220 0.0197 - - - - - - -
0.3062 4240 0.0198 - - - - - - -
0.3076 4260 0.0191 - - - - - - -
0.3090 4280 0.019 - - - - - - -
0.3105 4300 0.0194 - - - - - - -
0.3119 4320 0.0207 - - - - - - -
0.3134 4340 0.019 - - - - - - -
0.3148 4360 0.0202 - - - - - - -
0.3163 4380 0.0202 - - - - - - -
0.3177 4400 0.0204 - - - - - - -
0.3192 4420 0.02 - - - - - - -
0.3206 4440 0.0198 - - - - - - -
0.3220 4460 0.0191 - - - - - - -
0.3235 4480 0.02 - - - - - - -
0.3249 4500 0.0199 0.6381 0.4037 0.7803 0.5196 0.6260 0.2848 0.5421
0.3264 4520 0.0209 - - - - - - -
0.3278 4540 0.0207 - - - - - - -
0.3293 4560 0.0204 - - - - - - -
0.3307 4580 0.0197 - - - - - - -
0.3322 4600 0.0198 - - - - - - -
0.3336 4620 0.0198 - - - - - - -
0.3350 4640 0.0194 - - - - - - -
0.3365 4660 0.0201 - - - - - - -
0.3379 4680 0.0197 - - - - - - -
0.3394 4700 0.0195 - - - - - - -
0.3408 4720 0.0187 - - - - - - -
0.3423 4740 0.0194 - - - - - - -
0.3437 4760 0.0192 - - - - - - -
0.3452 4780 0.0202 - - - - - - -
0.3466 4800 0.0191 - - - - - - -
0.3480 4820 0.0194 - - - - - - -
0.3495 4840 0.0205 - - - - - - -
0.3509 4860 0.019 - - - - - - -
0.3524 4880 0.0202 - - - - - - -
0.3538 4900 0.0191 - - - - - - -
0.3553 4920 0.0194 - - - - - - -
0.3567 4940 0.0192 - - - - - - -
0.3581 4960 0.0195 - - - - - - -
0.3596 4980 0.0197 - - - - - - -
0.3610 5000 0.0202 0.6362 0.3887 0.7957 0.5114 0.6366 0.2755 0.5407
0.3625 5020 0.0196 - - - - - - -
0.3639 5040 0.0203 - - - - - - -
0.3654 5060 0.0201 - - - - - - -
0.3668 5080 0.0193 - - - - - - -
0.3683 5100 0.019 - - - - - - -
0.3697 5120 0.0195 - - - - - - -
0.3711 5140 0.0197 - - - - - - -
0.3726 5160 0.0198 - - - - - - -
0.3740 5180 0.0198 - - - - - - -
0.3755 5200 0.0203 - - - - - - -
0.3769 5220 0.0192 - - - - - - -
0.3784 5240 0.0202 - - - - - - -
0.3798 5260 0.02 - - - - - - -
0.3813 5280 0.0198 - - - - - - -
0.3827 5300 0.0189 - - - - - - -
0.3841 5320 0.0206 - - - - - - -
0.3856 5340 0.0196 - - - - - - -
0.3870 5360 0.0194 - - - - - - -
0.3885 5380 0.0194 - - - - - - -
0.3899 5400 0.0197 - - - - - - -
0.3914 5420 0.0196 - - - - - - -
0.3928 5440 0.0203 - - - - - - -
0.3943 5460 0.0196 - - - - - - -
0.3957 5480 0.0206 - - - - - - -
0.3971 5500 0.0196 0.6268 0.4017 0.7928 0.5383 0.6415 0.2983 0.5499
0.3986 5520 0.0191 - - - - - - -
0.4000 5540 0.0194 - - - - - - -
0.4015 5560 0.0193 - - - - - - -
0.4029 5580 0.0197 - - - - - - -
0.4044 5600 0.0196 - - - - - - -
0.4058 5620 0.0194 - - - - - - -
0.4072 5640 0.0201 - - - - - - -
0.4087 5660 0.0199 - - - - - - -
0.4101 5680 0.0197 - - - - - - -
0.4116 5700 0.0189 - - - - - - -
0.4130 5720 0.0193 - - - - - - -
0.4145 5740 0.021 - - - - - - -
0.4159 5760 0.0199 - - - - - - -
0.4174 5780 0.0205 - - - - - - -
0.4188 5800 0.0195 - - - - - - -
0.4202 5820 0.0195 - - - - - - -
0.4217 5840 0.0185 - - - - - - -
0.4231 5860 0.0193 - - - - - - -
0.4246 5880 0.0196 - - - - - - -
0.4260 5900 0.0191 - - - - - - -
0.4275 5920 0.0195 - - - - - - -
0.4289 5940 0.0201 - - - - - - -
0.4304 5960 0.0196 - - - - - - -
0.4318 5980 0.0204 - - - - - - -
0.4332 6000 0.0186 0.6559 0.4073 0.8023 0.5411 0.6544 0.2962 0.5595
0.4347 6020 0.0184 - - - - - - -
0.4361 6040 0.0196 - - - - - - -
0.4376 6060 0.0185 - - - - - - -
0.4390 6080 0.0196 - - - - - - -
0.4405 6100 0.0197 - - - - - - -
0.4419 6120 0.0201 - - - - - - -
0.4434 6140 0.0195 - - - - - - -
0.4448 6160 0.0188 - - - - - - -
0.4462 6180 0.0192 - - - - - - -
0.4477 6200 0.0191 - - - - - - -
0.4491 6220 0.0191 - - - - - - -
0.4506 6240 0.0193 - - - - - - -
0.4520 6260 0.0195 - - - - - - -
0.4535 6280 0.0188 - - - - - - -
0.4549 6300 0.0198 - - - - - - -
0.4564 6320 0.0192 - - - - - - -
0.4578 6340 0.0193 - - - - - - -
0.4592 6360 0.0199 - - - - - - -
0.4607 6380 0.0194 - - - - - - -
0.4621 6400 0.0207 - - - - - - -
0.4636 6420 0.0193 - - - - - - -
0.4650 6440 0.0198 - - - - - - -
0.4665 6460 0.0185 - - - - - - -
0.4679 6480 0.0205 - - - - - - -
0.4693 6500 0.0194 0.6413 0.4048 0.7962 0.5413 0.6646 0.2982 0.5577
0.4708 6520 0.0185 - - - - - - -
0.4722 6540 0.0196 - - - - - - -
0.4737 6560 0.0191 - - - - - - -
0.4751 6580 0.019 - - - - - - -
0.4766 6600 0.0195 - - - - - - -
0.4780 6620 0.0195 - - - - - - -
0.4795 6640 0.0195 - - - - - - -
0.4809 6660 0.0193 - - - - - - -
0.4823 6680 0.0193 - - - - - - -
0.4838 6700 0.0195 - - - - - - -
0.4852 6720 0.0196 - - - - - - -
0.4867 6740 0.0177 - - - - - - -
0.4881 6760 0.0181 - - - - - - -
0.4896 6780 0.0195 - - - - - - -
0.4910 6800 0.0189 - - - - - - -
0.4925 6820 0.0195 - - - - - - -
0.4939 6840 0.0183 - - - - - - -
0.4953 6860 0.0201 - - - - - - -
0.4968 6880 0.0192 - - - - - - -
0.4982 6900 0.0191 - - - - - - -
0.4997 6920 0.0194 - - - - - - -
0.5011 6940 0.0189 - - - - - - -
0.5026 6960 0.0198 - - - - - - -
0.5040 6980 0.0185 - - - - - - -
0.5055 7000 0.0197 0.6441 0.3793 0.7954 0.5223 0.6622 0.3075 0.5518
0.5069 7020 0.0196 - - - - - - -
0.5083 7040 0.0195 - - - - - - -
0.5098 7060 0.0195 - - - - - - -
0.5112 7080 0.02 - - - - - - -
0.5127 7100 0.0195 - - - - - - -
0.5141 7120 0.0194 - - - - - - -
0.5156 7140 0.019 - - - - - - -
0.5170 7160 0.0201 - - - - - - -
0.5184 7180 0.0184 - - - - - - -
0.5199 7200 0.0188 - - - - - - -
0.5213 7220 0.0201 - - - - - - -
0.5228 7240 0.0182 - - - - - - -
0.5242 7260 0.0195 - - - - - - -
0.5257 7280 0.019 - - - - - - -
0.5271 7300 0.019 - - - - - - -
0.5286 7320 0.0185 - - - - - - -
0.5300 7340 0.0189 - - - - - - -
0.5314 7360 0.0188 - - - - - - -
0.5329 7380 0.0187 - - - - - - -
0.5343 7400 0.0179 - - - - - - -
0.5358 7420 0.0191 - - - - - - -
0.5372 7440 0.0187 - - - - - - -
0.5387 7460 0.0181 - - - - - - -
0.5401 7480 0.0191 - - - - - - -
0.5416 7500 0.0176 0.6409 0.3827 0.7882 0.5272 0.6503 0.2934 0.5471
0.5430 7520 0.0203 - - - - - - -
0.5444 7540 0.0184 - - - - - - -
0.5459 7560 0.019 - - - - - - -
0.5473 7580 0.019 - - - - - - -
0.5488 7600 0.0194 - - - - - - -
0.5502 7620 0.0187 - - - - - - -
0.5517 7640 0.0185 - - - - - - -
0.5531 7660 0.0194 - - - - - - -
0.5546 7680 0.0192 - - - - - - -
0.5560 7700 0.0191 - - - - - - -
0.5574 7720 0.0178 - - - - - - -
0.5589 7740 0.0181 - - - - - - -
0.5603 7760 0.0186 - - - - - - -
0.5618 7780 0.0184 - - - - - - -
0.5632 7800 0.0189 - - - - - - -
0.5647 7820 0.0189 - - - - - - -
0.5661 7840 0.0189 - - - - - - -
0.5676 7860 0.0186 - - - - - - -
0.5690 7880 0.018 - - - - - - -
0.5704 7900 0.0186 - - - - - - -
0.5719 7920 0.0187 - - - - - - -
0.5733 7940 0.0189 - - - - - - -
0.5748 7960 0.0198 - - - - - - -
0.5762 7980 0.0191 - - - - - - -
0.5777 8000 0.0177 0.6439 0.3972 0.7947 0.5342 0.6556 0.2936 0.5532
0.5791 8020 0.0197 - - - - - - -
0.5805 8040 0.0195 - - - - - - -
0.5820 8060 0.0185 - - - - - - -
0.5834 8080 0.0191 - - - - - - -
0.5849 8100 0.0187 - - - - - - -
0.5863 8120 0.0182 - - - - - - -
0.5878 8140 0.0181 - - - - - - -
0.5892 8160 0.019 - - - - - - -
0.5907 8180 0.0189 - - - - - - -
0.5921 8200 0.0197 - - - - - - -
0.5935 8220 0.0183 - - - - - - -
0.5950 8240 0.0191 - - - - - - -
0.5964 8260 0.0188 - - - - - - -
0.5979 8280 0.0195 - - - - - - -
0.5993 8300 0.0191 - - - - - - -
0.6008 8320 0.0185 - - - - - - -
0.6022 8340 0.0185 - - - - - - -
0.6037 8360 0.0186 - - - - - - -
0.6051 8380 0.0178 - - - - - - -
0.6065 8400 0.0182 - - - - - - -
0.6080 8420 0.0196 - - - - - - -
0.6094 8440 0.019 - - - - - - -
0.6109 8460 0.0198 - - - - - - -
0.6123 8480 0.0188 - - - - - - -
0.6138 8500 0.0192 0.6490 0.3983 0.7778 0.5157 0.6601 0.2980 0.5498

Framework Versions

  • Python: 3.11.12
  • Sentence Transformers: 4.0.2
  • PyLate: 1.2.0
  • Transformers: 4.48.2
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.6.0
  • Datasets: 3.6.0
  • Tokenizers: 0.21.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084"
}

PyLate

@misc{PyLate,
    title = {PyLate: Flexible Training and Retrieval for Late Interaction Models},
    author = {Chaffin, Antoine and Sourty, Raphaël},
    url = {https://github.com/lightonai/pylate},
    year = {2024}
}