PyLate model based on Speedsy/turkish-multilingual-e5-small-32768

This is a PyLate model finetuned from Speedsy/turkish-multilingual-e5-small-32768 on the train dataset. It maps sentences & paragraphs to sequences of 128-dimensional dense vectors and can be used for semantic textual similarity using the MaxSim operator.
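For context, the MaxSim operator scores a query against a document by matching every query token embedding with its most similar document token embedding and summing those maxima. A minimal PyTorch sketch of the idea (illustrative only; PyLate computes this internally):

import torch

def maxsim(query_emb: torch.Tensor, doc_emb: torch.Tensor) -> torch.Tensor:
    # query_emb: (num_query_tokens, 128); doc_emb: (num_doc_tokens, 128);
    # both assumed L2-normalized so dot products are cosine similarities.
    sim = query_emb @ doc_emb.T           # (num_query_tokens, num_doc_tokens)
    return sim.max(dim=1).values.sum()    # best doc token per query token, summed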

Model Details

Model Description

  • Model Type: PyLate ColBERT (late interaction) model
  • Base model: Speedsy/turkish-multilingual-e5-small-32768
  • Model size: 34.2M parameters (F32)
  • Output dimensionality: 128 dimensions per token
  • Similarity function: MaxSim

Model Sources

  • Repository: https://github.com/lightonai/pylate

Full Model Architecture

ColBERT(
  (0): Transformer({'max_seq_length': 179, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Dense({'in_features': 384, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
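The Dense head projects the transformer's 384-dimensional token states down to 128 dimensions, so every encoded text becomes one vector per token rather than a single pooled vector. A quick shape check (a sketch; it assumes model has been loaded as shown in the Usage section below):

embeddings = model.encode(["a short example sentence"], is_query=False)
print(embeddings[0].shape)  # expected: (num_tokens, 128), one 128-d vector per token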

Usage

First install the PyLate library:

pip install -U pylate

Retrieval

PyLate provides a streamlined interface to index and retrieve documents using ColBERT models. The index leverages the Voyager HNSW index to efficiently handle document embeddings and enable fast retrieval.

Indexing documents

First, load the ColBERT model and initialize the Voyager index, then encode and index your documents:

from pylate import indexes, models, retrieve

# Step 1: Load the ColBERT model
model = models.ColBERT(
    model_name_or_path="Speedsy/turkish-multilingual-e5-small-32768-colbert-cleaned-data-32bsize-13849",
)

# Step 2: Initialize the Voyager index
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
    override=True,  # This overwrites the existing index if any
)

# Step 3: Encode the documents
documents_ids = ["1", "2", "3"]
documents = ["document 1 text", "document 2 text", "document 3 text"]

documents_embeddings = model.encode(
    documents,
    batch_size=32,
    is_query=False,  # Ensure that it is set to False to indicate that these are documents, not queries
    show_progress_bar=True,
)

# Step 4: Add document embeddings to the index by providing embeddings and corresponding ids
index.add_documents(
    documents_ids=documents_ids,
    documents_embeddings=documents_embeddings,
)

Note that you do not have to recreate the index and encode the documents every time. Once you have created an index and added the documents, you can re-use the index later by loading it:

# To load an index, simply instantiate it with the correct folder/name and without overriding it
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
)
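A reloaded index can also be extended with further documents using the same add_documents call shown above; a hedged sketch (the id and text are illustrative):

index.add_documents(
    documents_ids=["4"],
    documents_embeddings=model.encode(["document 4 text"], is_query=False),
)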

Retrieving top-k documents for queries

Once the documents are indexed, you can retrieve the top-k most relevant documents for a given set of queries. To do so, initialize the ColBERT retriever with the index you want to search in, encode the queries, and then retrieve the top-k documents to get the ids and relevance scores of the top matches:

# Step 1: Initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)

# Step 2: Encode the queries
queries_embeddings = model.encode(
    ["query for document 3", "query for document 1"],
    batch_size=32,
    is_query=True,  # Ensure that it is set to True to indicate that these are queries
    show_progress_bar=True,
)

# Step 3: Retrieve top-k documents
scores = retriever.retrieve(
    queries_embeddings=queries_embeddings,
    k=10,  # Retrieve the top 10 matches for each query
)
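The retriever returns one ranked list per query. A hedged sketch of reading the output (the id and score fields follow PyLate's result format; treat them as assumptions if your version differs):

for query_idx, matches in enumerate(scores):
    print(f"Query {query_idx}")
    for match in matches:
        print(f"  doc {match['id']}: {match['score']:.4f}")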

Reranking

If you only want to use the ColBERT model to perform reranking on top of your first-stage retrieval pipeline without building an index, you can simply use the rank.rerank function and pass it the queries and documents to rerank:

from pylate import rank, models

queries = [
    "query A",
    "query B",
]

documents = [
    ["document A", "document B"],
    ["document 1", "document C", "document B"],
]

documents_ids = [
    [1, 2],
    [1, 3, 2],
]

model = models.ColBERT(
    model_name_or_path="Speedsy/turkish-multilingual-e5-small-32768-colbert-cleaned-data-32bsize-13849",
)

queries_embeddings = model.encode(
    queries,
    is_query=True,
)

documents_embeddings = model.encode(
    documents,
    is_query=False,
)

reranked_documents = rank.rerank(
    documents_ids=documents_ids,
    queries_embeddings=queries_embeddings,
    documents_embeddings=documents_embeddings,
)
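rank.rerank returns, for each query, its candidate documents reordered by MaxSim score. A hedged sketch of reading the output (again assuming id/score result fields):

for query, ranking in zip(queries, reranked_documents):
    print(query)
    for doc in ranking:
        print(f"  id={doc['id']}  score={doc['score']:.4f}")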

Evaluation

Metrics

PyLate Information Retrieval

  • Dataset: ['NanoDBPedia', 'NanoFiQA2018', 'NanoHotpotQA', 'NanoMSMARCO', 'NanoNQ', 'NanoSCIDOCS']
  • Evaluated with pylate.evaluation.pylate_information_retrieval_evaluator.PyLateInformationRetrievalEvaluator
Metric               | NanoDBPedia | NanoFiQA2018 | NanoHotpotQA | NanoMSMARCO | NanoNQ | NanoSCIDOCS
MaxSim_accuracy@1    | 0.82        | 0.42         | 0.84         | 0.34        | 0.52   | 0.36
MaxSim_accuracy@3    | 0.92        | 0.54         | 0.96         | 0.54        | 0.68   | 0.62
MaxSim_accuracy@5    | 0.94        | 0.58         | 0.96         | 0.6         | 0.74   | 0.7
MaxSim_accuracy@10   | 0.98        | 0.7          | 0.96         | 0.74        | 0.8    | 0.74
MaxSim_precision@1   | 0.82        | 0.42         | 0.84         | 0.34        | 0.52   | 0.36
MaxSim_precision@3   | 0.6333      | 0.24         | 0.5067       | 0.18        | 0.2267 | 0.28
MaxSim_precision@5   | 0.572       | 0.18         | 0.32         | 0.12        | 0.152  | 0.236
MaxSim_precision@10  | 0.512       | 0.11         | 0.17         | 0.074       | 0.086  | 0.146
MaxSim_recall@1      | 0.1025      | 0.2334       | 0.42         | 0.34        | 0.49   | 0.0757
MaxSim_recall@3      | 0.2049      | 0.361        | 0.76         | 0.54        | 0.63   | 0.1727
MaxSim_recall@5      | 0.2539      | 0.4133       | 0.8          | 0.6         | 0.7    | 0.2427
MaxSim_recall@10     | 0.3696      | 0.5205       | 0.85         | 0.74        | 0.77   | 0.2987
MaxSim_ndcg@10       | 0.6463      | 0.4256       | 0.8032       | 0.534       | 0.6405 | 0.2994
MaxSim_mrr@10        | 0.8767      | 0.4915       | 0.8933       | 0.4695      | 0.6194 | 0.4989
MaxSim_map@100       | 0.5106      | 0.3578       | 0.7391       | 0.4789      | 0.593  | 0.2308

PyLate Custom NanoBEIR

  • Dataset: NanoBEIR_mean
  • Evaluated with pylate_nano_beir_evaluator.PylateCustomNanoBEIREvaluator
Metric               | Value
MaxSim_accuracy@1    | 0.55
MaxSim_accuracy@3    | 0.71
MaxSim_accuracy@5    | 0.7533
MaxSim_accuracy@10   | 0.82
MaxSim_precision@1   | 0.55
MaxSim_precision@3   | 0.3444
MaxSim_precision@5   | 0.2633
MaxSim_precision@10  | 0.183
MaxSim_recall@1      | 0.2769
MaxSim_recall@3      | 0.4448
MaxSim_recall@5      | 0.5016
MaxSim_recall@10     | 0.5915
MaxSim_ndcg@10       | 0.5582
MaxSim_mrr@10        | 0.6416
MaxSim_map@100       | 0.485

Training Details

Training Dataset

train

  • Dataset: train at 1072b6b
  • Size: 443,147 training samples
  • Columns: query_id, document_ids, and scores
  • Approximate statistics based on the first 1000 samples:
             | query_id                                        | document_ids      | scores
    type     | string                                          | list              | list
    details  | min: 5 tokens, mean: 5.83 tokens, max: 6 tokens | size: 32 elements | size: 32 elements
  • Samples:
    query_id | document_ids                                                  | scores
    817836   | ['2716076', '6741935', '2681109', '5562684', '3507339', ...] | [1.0, 0.7059561610221863, 0.21702419221401215, 0.38270196318626404, 0.20812414586544037, ...]
    1045170  | ['5088671', '2953295', '8783471', '4268439', '6339935', ...] | [1.0, 0.6493034362792969, 0.0692221149802208, 0.17963139712810516, 0.6697239875793457, ...]
    1069432  | ['3724008', '314949', '8657336', '7420456', '879004', ...]   | [1.0, 0.3706032931804657, 0.3508036434650421, 0.2823200523853302, 0.17563475668430328, ...]
  • Loss: pylate.losses.distillation.Distillation
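
The Distillation loss trains the student's MaxSim scores over each query's 32 candidate documents to match the teacher relevance scores stored in the scores column. A conceptual sketch of such an objective, not PyLate's exact implementation: a KL divergence between the softmax-normalized teacher and student score distributions.

import torch.nn.functional as F

def distillation_loss(student_scores, teacher_scores):
    # student_scores, teacher_scores: (batch_size, num_candidates) tensors of
    # student MaxSim scores and teacher relevance scores, respectively.
    return F.kl_div(
        F.log_softmax(student_scores, dim=-1),  # student log-distribution over candidates
        F.softmax(teacher_scores, dim=-1),      # teacher distribution over candidates
        reduction="batchmean",
    )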

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 32
  • learning_rate: 3e-05
  • num_train_epochs: 1
  • bf16: True
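
These non-default values map directly onto the standard Sentence Transformers training arguments; a hedged sketch of reproducing them (the class name follows the sentence-transformers API, and output_dir is an assumed placeholder):

from sentence_transformers.training_args import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",              # assumed placeholder, not given in the card
    eval_strategy="steps",
    per_device_train_batch_size=32,
    learning_rate=3e-5,
    num_train_epochs=1,
    bf16=True,
)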

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 3e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss NanoDBPedia_MaxSim_ndcg@10 NanoFiQA2018_MaxSim_ndcg@10 NanoHotpotQA_MaxSim_ndcg@10 NanoMSMARCO_MaxSim_ndcg@10 NanoNQ_MaxSim_ndcg@10 NanoSCIDOCS_MaxSim_ndcg@10 NanoBEIR_mean_MaxSim_ndcg@10
0.0014 20 0.0332 - - - - - - -
0.0029 40 0.0297 - - - - - - -
0.0043 60 0.0285 - - - - - - -
0.0058 80 0.0289 - - - - - - -
0.0072 100 0.0278 - - - - - - -
0.0087 120 0.0271 - - - - - - -
0.0101 140 0.0267 - - - - - - -
0.0116 160 0.0269 - - - - - - -
0.0130 180 0.0264 - - - - - - -
0.0144 200 0.0262 - - - - - - -
0.0159 220 0.0259 - - - - - - -
0.0173 240 0.026 - - - - - - -
0.0188 260 0.0251 - - - - - - -
0.0202 280 0.025 - - - - - - -
0.0217 300 0.025 - - - - - - -
0.0231 320 0.0261 - - - - - - -
0.0246 340 0.0249 - - - - - - -
0.0260 360 0.0249 - - - - - - -
0.0274 380 0.0243 - - - - - - -
0.0289 400 0.0245 - - - - - - -
0.0303 420 0.0247 - - - - - - -
0.0318 440 0.0245 - - - - - - -
0.0332 460 0.0234 - - - - - - -
0.0347 480 0.0235 - - - - - - -
0.0361 500 0.0245 0.6352 0.3462 0.7674 0.5245 0.6220 0.2769 0.5287
0.0375 520 0.0244 - - - - - - -
0.0390 540 0.0233 - - - - - - -
0.0404 560 0.0239 - - - - - - -
0.0419 580 0.0232 - - - - - - -
0.0433 600 0.0225 - - - - - - -
0.0448 620 0.0234 - - - - - - -
0.0462 640 0.0245 - - - - - - -
0.0477 660 0.0229 - - - - - - -
0.0491 680 0.0232 - - - - - - -
0.0505 700 0.023 - - - - - - -
0.0520 720 0.0238 - - - - - - -
0.0534 740 0.0239 - - - - - - -
0.0549 760 0.0229 - - - - - - -
0.0563 780 0.0237 - - - - - - -
0.0578 800 0.0236 - - - - - - -
0.0592 820 0.0224 - - - - - - -
0.0607 840 0.0226 - - - - - - -
0.0621 860 0.0225 - - - - - - -
0.0635 880 0.023 - - - - - - -
0.0650 900 0.0232 - - - - - - -
0.0664 920 0.0224 - - - - - - -
0.0679 940 0.0227 - - - - - - -
0.0693 960 0.0231 - - - - - - -
0.0708 980 0.0238 - - - - - - -
0.0722 1000 0.0224 0.6463 0.3673 0.7878 0.5283 0.6466 0.2869 0.5439
0.0737 1020 0.0225 - - - - - - -
0.0751 1040 0.0223 - - - - - - -
0.0765 1060 0.023 - - - - - - -
0.0780 1080 0.0218 - - - - - - -
0.0794 1100 0.0228 - - - - - - -
0.0809 1120 0.0219 - - - - - - -
0.0823 1140 0.0225 - - - - - - -
0.0838 1160 0.0231 - - - - - - -
0.0852 1180 0.0233 - - - - - - -
0.0866 1200 0.0224 - - - - - - -
0.0881 1220 0.0223 - - - - - - -
0.0895 1240 0.0216 - - - - - - -
0.0910 1260 0.0227 - - - - - - -
0.0924 1280 0.0218 - - - - - - -
0.0939 1300 0.0222 - - - - - - -
0.0953 1320 0.0218 - - - - - - -
0.0968 1340 0.0216 - - - - - - -
0.0982 1360 0.0227 - - - - - - -
0.0996 1380 0.0211 - - - - - - -
0.1011 1400 0.022 - - - - - - -
0.1025 1420 0.0211 - - - - - - -
0.1040 1440 0.0219 - - - - - - -
0.1054 1460 0.023 - - - - - - -
0.1069 1480 0.0215 - - - - - - -
0.1083 1500 0.022 0.6323 0.3640 0.7771 0.4895 0.6553 0.2903 0.5348
0.1098 1520 0.0222 - - - - - - -
0.1112 1540 0.0222 - - - - - - -
0.1126 1560 0.0227 - - - - - - -
0.1141 1580 0.0225 - - - - - - -
0.1155 1600 0.0222 - - - - - - -
0.1170 1620 0.0217 - - - - - - -
0.1184 1640 0.0217 - - - - - - -
0.1199 1660 0.0224 - - - - - - -
0.1213 1680 0.0215 - - - - - - -
0.1228 1700 0.022 - - - - - - -
0.1242 1720 0.0222 - - - - - - -
0.1256 1740 0.0208 - - - - - - -
0.1271 1760 0.0224 - - - - - - -
0.1285 1780 0.0205 - - - - - - -
0.1300 1800 0.0214 - - - - - - -
0.1314 1820 0.0212 - - - - - - -
0.1329 1840 0.0207 - - - - - - -
0.1343 1860 0.0213 - - - - - - -
0.1357 1880 0.0211 - - - - - - -
0.1372 1900 0.0215 - - - - - - -
0.1386 1920 0.0218 - - - - - - -
0.1401 1940 0.0216 - - - - - - -
0.1415 1960 0.022 - - - - - - -
0.1430 1980 0.0222 - - - - - - -
0.1444 2000 0.0217 0.6472 0.3492 0.7873 0.5109 0.6687 0.3043 0.5446
0.1459 2020 0.022 - - - - - - -
0.1473 2040 0.0204 - - - - - - -
0.1487 2060 0.0215 - - - - - - -
0.1502 2080 0.0215 - - - - - - -
0.1516 2100 0.0217 - - - - - - -
0.1531 2120 0.0214 - - - - - - -
0.1545 2140 0.0217 - - - - - - -
0.1560 2160 0.022 - - - - - - -
0.1574 2180 0.0211 - - - - - - -
0.1589 2200 0.0212 - - - - - - -
0.1603 2220 0.0215 - - - - - - -
0.1617 2240 0.0212 - - - - - - -
0.1632 2260 0.0206 - - - - - - -
0.1646 2280 0.0213 - - - - - - -
0.1661 2300 0.0216 - - - - - - -
0.1675 2320 0.0219 - - - - - - -
0.1690 2340 0.0214 - - - - - - -
0.1704 2360 0.0206 - - - - - - -
0.1719 2380 0.0209 - - - - - - -
0.1733 2400 0.0216 - - - - - - -
0.1747 2420 0.0211 - - - - - - -
0.1762 2440 0.0198 - - - - - - -
0.1776 2460 0.0207 - - - - - - -
0.1791 2480 0.0218 - - - - - - -
0.1805 2500 0.0211 0.6445 0.3645 0.7612 0.5291 0.6565 0.2904 0.5411
0.1820 2520 0.0222 - - - - - - -
0.1834 2540 0.021 - - - - - - -
0.1849 2560 0.021 - - - - - - -
0.1863 2580 0.0213 - - - - - - -
0.1877 2600 0.0214 - - - - - - -
0.1892 2620 0.0216 - - - - - - -
0.1906 2640 0.0206 - - - - - - -
0.1921 2660 0.021 - - - - - - -
0.1935 2680 0.0213 - - - - - - -
0.1950 2700 0.0207 - - - - - - -
0.1964 2720 0.0214 - - - - - - -
0.1978 2740 0.0202 - - - - - - -
0.1993 2760 0.0201 - - - - - - -
0.2007 2780 0.0204 - - - - - - -
0.2022 2800 0.0207 - - - - - - -
0.2036 2820 0.0212 - - - - - - -
0.2051 2840 0.0205 - - - - - - -
0.2065 2860 0.0206 - - - - - - -
0.2080 2880 0.0205 - - - - - - -
0.2094 2900 0.0211 - - - - - - -
0.2108 2920 0.0209 - - - - - - -
0.2123 2940 0.0209 - - - - - - -
0.2137 2960 0.0213 - - - - - - -
0.2152 2980 0.0205 - - - - - - -
0.2166 3000 0.0201 0.6543 0.4016 0.7867 0.5219 0.6615 0.2656 0.5486
0.2181 3020 0.0221 - - - - - - -
0.2195 3040 0.0207 - - - - - - -
0.2210 3060 0.0208 - - - - - - -
0.2224 3080 0.0209 - - - - - - -
0.2238 3100 0.0209 - - - - - - -
0.2253 3120 0.0206 - - - - - - -
0.2267 3140 0.0203 - - - - - - -
0.2282 3160 0.0206 - - - - - - -
0.2296 3180 0.0207 - - - - - - -
0.2311 3200 0.0211 - - - - - - -
0.2325 3220 0.0213 - - - - - - -
0.2340 3240 0.0203 - - - - - - -
0.2354 3260 0.0205 - - - - - - -
0.2368 3280 0.0219 - - - - - - -
0.2383 3300 0.0197 - - - - - - -
0.2397 3320 0.0207 - - - - - - -
0.2412 3340 0.0205 - - - - - - -
0.2426 3360 0.0208 - - - - - - -
0.2441 3380 0.0201 - - - - - - -
0.2455 3400 0.0213 - - - - - - -
0.2469 3420 0.0207 - - - - - - -
0.2484 3440 0.02 - - - - - - -
0.2498 3460 0.0204 - - - - - - -
0.2513 3480 0.0201 - - - - - - -
0.2527 3500 0.0211 0.6481 0.3850 0.7743 0.5167 0.6401 0.2750 0.5399
0.2542 3520 0.021 - - - - - - -
0.2556 3540 0.0208 - - - - - - -
0.2571 3560 0.02 - - - - - - -
0.2585 3580 0.0211 - - - - - - -
0.2599 3600 0.0199 - - - - - - -
0.2614 3620 0.0192 - - - - - - -
0.2628 3640 0.0203 - - - - - - -
0.2643 3660 0.0197 - - - - - - -
0.2657 3680 0.0196 - - - - - - -
0.2672 3700 0.0198 - - - - - - -
0.2686 3720 0.0213 - - - - - - -
0.2701 3740 0.0199 - - - - - - -
0.2715 3760 0.0205 - - - - - - -
0.2729 3780 0.0205 - - - - - - -
0.2744 3800 0.0207 - - - - - - -
0.2758 3820 0.0204 - - - - - - -
0.2773 3840 0.0209 - - - - - - -
0.2787 3860 0.0211 - - - - - - -
0.2802 3880 0.0199 - - - - - - -
0.2816 3900 0.0212 - - - - - - -
0.2831 3920 0.0194 - - - - - - -
0.2845 3940 0.0196 - - - - - - -
0.2859 3960 0.0211 - - - - - - -
0.2874 3980 0.0198 - - - - - - -
0.2888 4000 0.0207 0.6402 0.3955 0.7813 0.4997 0.6360 0.2845 0.5395
0.2903 4020 0.0193 - - - - - - -
0.2917 4040 0.0198 - - - - - - -
0.2932 4060 0.0208 - - - - - - -
0.2946 4080 0.02 - - - - - - -
0.2961 4100 0.0202 - - - - - - -
0.2975 4120 0.0198 - - - - - - -
0.2989 4140 0.0193 - - - - - - -
0.3004 4160 0.0202 - - - - - - -
0.3018 4180 0.0198 - - - - - - -
0.3033 4200 0.0198 - - - - - - -
0.3047 4220 0.0197 - - - - - - -
0.3062 4240 0.0198 - - - - - - -
0.3076 4260 0.0191 - - - - - - -
0.3090 4280 0.019 - - - - - - -
0.3105 4300 0.0194 - - - - - - -
0.3119 4320 0.0207 - - - - - - -
0.3134 4340 0.019 - - - - - - -
0.3148 4360 0.0202 - - - - - - -
0.3163 4380 0.0202 - - - - - - -
0.3177 4400 0.0204 - - - - - - -
0.3192 4420 0.02 - - - - - - -
0.3206 4440 0.0198 - - - - - - -
0.3220 4460 0.0191 - - - - - - -
0.3235 4480 0.02 - - - - - - -
0.3249 4500 0.0199 0.6381 0.4037 0.7803 0.5196 0.6260 0.2848 0.5421
0.3264 4520 0.0209 - - - - - - -
0.3278 4540 0.0207 - - - - - - -
0.3293 4560 0.0204 - - - - - - -
0.3307 4580 0.0197 - - - - - - -
0.3322 4600 0.0198 - - - - - - -
0.3336 4620 0.0198 - - - - - - -
0.3350 4640 0.0194 - - - - - - -
0.3365 4660 0.0201 - - - - - - -
0.3379 4680 0.0197 - - - - - - -
0.3394 4700 0.0195 - - - - - - -
0.3408 4720 0.0187 - - - - - - -
0.3423 4740 0.0194 - - - - - - -
0.3437 4760 0.0192 - - - - - - -
0.3452 4780 0.0202 - - - - - - -
0.3466 4800 0.0191 - - - - - - -
0.3480 4820 0.0194 - - - - - - -
0.3495 4840 0.0205 - - - - - - -
0.3509 4860 0.019 - - - - - - -
0.3524 4880 0.0202 - - - - - - -
0.3538 4900 0.0191 - - - - - - -
0.3553 4920 0.0194 - - - - - - -
0.3567 4940 0.0192 - - - - - - -
0.3581 4960 0.0195 - - - - - - -
0.3596 4980 0.0197 - - - - - - -
0.3610 5000 0.0202 0.6362 0.3887 0.7957 0.5114 0.6366 0.2755 0.5407
0.3625 5020 0.0196 - - - - - - -
0.3639 5040 0.0203 - - - - - - -
0.3654 5060 0.0201 - - - - - - -
0.3668 5080 0.0193 - - - - - - -
0.3683 5100 0.019 - - - - - - -
0.3697 5120 0.0195 - - - - - - -
0.3711 5140 0.0197 - - - - - - -
0.3726 5160 0.0198 - - - - - - -
0.3740 5180 0.0198 - - - - - - -
0.3755 5200 0.0203 - - - - - - -
0.3769 5220 0.0192 - - - - - - -
0.3784 5240 0.0202 - - - - - - -
0.3798 5260 0.02 - - - - - - -
0.3813 5280 0.0198 - - - - - - -
0.3827 5300 0.0189 - - - - - - -
0.3841 5320 0.0206 - - - - - - -
0.3856 5340 0.0196 - - - - - - -
0.3870 5360 0.0194 - - - - - - -
0.3885 5380 0.0194 - - - - - - -
0.3899 5400 0.0197 - - - - - - -
0.3914 5420 0.0196 - - - - - - -
0.3928 5440 0.0203 - - - - - - -
0.3943 5460 0.0196 - - - - - - -
0.3957 5480 0.0206 - - - - - - -
0.3971 5500 0.0196 0.6268 0.4017 0.7928 0.5383 0.6415 0.2983 0.5499
0.3986 5520 0.0191 - - - - - - -
0.4000 5540 0.0194 - - - - - - -
0.4015 5560 0.0193 - - - - - - -
0.4029 5580 0.0197 - - - - - - -
0.4044 5600 0.0196 - - - - - - -
0.4058 5620 0.0194 - - - - - - -
0.4072 5640 0.0201 - - - - - - -
0.4087 5660 0.0199 - - - - - - -
0.4101 5680 0.0197 - - - - - - -
0.4116 5700 0.0189 - - - - - - -
0.4130 5720 0.0193 - - - - - - -
0.4145 5740 0.021 - - - - - - -
0.4159 5760 0.0199 - - - - - - -
0.4174 5780 0.0205 - - - - - - -
0.4188 5800 0.0195 - - - - - - -
0.4202 5820 0.0195 - - - - - - -
0.4217 5840 0.0185 - - - - - - -
0.4231 5860 0.0193 - - - - - - -
0.4246 5880 0.0196 - - - - - - -
0.4260 5900 0.0191 - - - - - - -
0.4275 5920 0.0195 - - - - - - -
0.4289 5940 0.0201 - - - - - - -
0.4304 5960 0.0196 - - - - - - -
0.4318 5980 0.0204 - - - - - - -
0.4332 6000 0.0186 0.6559 0.4073 0.8023 0.5411 0.6544 0.2962 0.5595
0.4347 6020 0.0184 - - - - - - -
0.4361 6040 0.0196 - - - - - - -
0.4376 6060 0.0185 - - - - - - -
0.4390 6080 0.0196 - - - - - - -
0.4405 6100 0.0197 - - - - - - -
0.4419 6120 0.0201 - - - - - - -
0.4434 6140 0.0195 - - - - - - -
0.4448 6160 0.0188 - - - - - - -
0.4462 6180 0.0192 - - - - - - -
0.4477 6200 0.0191 - - - - - - -
0.4491 6220 0.0191 - - - - - - -
0.4506 6240 0.0193 - - - - - - -
0.4520 6260 0.0195 - - - - - - -
0.4535 6280 0.0188 - - - - - - -
0.4549 6300 0.0198 - - - - - - -
0.4564 6320 0.0192 - - - - - - -
0.4578 6340 0.0193 - - - - - - -
0.4592 6360 0.0199 - - - - - - -
0.4607 6380 0.0194 - - - - - - -
0.4621 6400 0.0207 - - - - - - -
0.4636 6420 0.0193 - - - - - - -
0.4650 6440 0.0198 - - - - - - -
0.4665 6460 0.0185 - - - - - - -
0.4679 6480 0.0205 - - - - - - -
0.4693 6500 0.0194 0.6413 0.4048 0.7962 0.5413 0.6646 0.2982 0.5577
0.4708 6520 0.0185 - - - - - - -
0.4722 6540 0.0196 - - - - - - -
0.4737 6560 0.0191 - - - - - - -
0.4751 6580 0.019 - - - - - - -
0.4766 6600 0.0195 - - - - - - -
0.4780 6620 0.0195 - - - - - - -
0.4795 6640 0.0195 - - - - - - -
0.4809 6660 0.0193 - - - - - - -
0.4823 6680 0.0193 - - - - - - -
0.4838 6700 0.0195 - - - - - - -
0.4852 6720 0.0196 - - - - - - -
0.4867 6740 0.0177 - - - - - - -
0.4881 6760 0.0181 - - - - - - -
0.4896 6780 0.0195 - - - - - - -
0.4910 6800 0.0189 - - - - - - -
0.4925 6820 0.0195 - - - - - - -
0.4939 6840 0.0183 - - - - - - -
0.4953 6860 0.0201 - - - - - - -
0.4968 6880 0.0192 - - - - - - -
0.4982 6900 0.0191 - - - - - - -
0.4997 6920 0.0194 - - - - - - -
0.5011 6940 0.0189 - - - - - - -
0.5026 6960 0.0198 - - - - - - -
0.5040 6980 0.0185 - - - - - - -
0.5055 7000 0.0197 0.6441 0.3793 0.7954 0.5223 0.6622 0.3075 0.5518
0.5069 7020 0.0196 - - - - - - -
0.5083 7040 0.0195 - - - - - - -
0.5098 7060 0.0195 - - - - - - -
0.5112 7080 0.02 - - - - - - -
0.5127 7100 0.0195 - - - - - - -
0.5141 7120 0.0194 - - - - - - -
0.5156 7140 0.019 - - - - - - -
0.5170 7160 0.0201 - - - - - - -
0.5184 7180 0.0184 - - - - - - -
0.5199 7200 0.0188 - - - - - - -
0.5213 7220 0.0201 - - - - - - -
0.5228 7240 0.0182 - - - - - - -
0.5242 7260 0.0195 - - - - - - -
0.5257 7280 0.019 - - - - - - -
0.5271 7300 0.019 - - - - - - -
0.5286 7320 0.0185 - - - - - - -
0.5300 7340 0.0189 - - - - - - -
0.5314 7360 0.0188 - - - - - - -
0.5329 7380 0.0187 - - - - - - -
0.5343 7400 0.0179 - - - - - - -
0.5358 7420 0.0191 - - - - - - -
0.5372 7440 0.0187 - - - - - - -
0.5387 7460 0.0181 - - - - - - -
0.5401 7480 0.0191 - - - - - - -
0.5416 7500 0.0176 0.6409 0.3827 0.7882 0.5272 0.6503 0.2934 0.5471
0.5430 7520 0.0203 - - - - - - -
0.5444 7540 0.0184 - - - - - - -
0.5459 7560 0.019 - - - - - - -
0.5473 7580 0.019 - - - - - - -
0.5488 7600 0.0194 - - - - - - -
0.5502 7620 0.0187 - - - - - - -
0.5517 7640 0.0185 - - - - - - -
0.5531 7660 0.0194 - - - - - - -
0.5546 7680 0.0192 - - - - - - -
0.5560 7700 0.0191 - - - - - - -
0.5574 7720 0.0178 - - - - - - -
0.5589 7740 0.0181 - - - - - - -
0.5603 7760 0.0186 - - - - - - -
0.5618 7780 0.0184 - - - - - - -
0.5632 7800 0.0189 - - - - - - -
0.5647 7820 0.0189 - - - - - - -
0.5661 7840 0.0189 - - - - - - -
0.5676 7860 0.0186 - - - - - - -
0.5690 7880 0.018 - - - - - - -
0.5704 7900 0.0186 - - - - - - -
0.5719 7920 0.0187 - - - - - - -
0.5733 7940 0.0189 - - - - - - -
0.5748 7960 0.0198 - - - - - - -
0.5762 7980 0.0191 - - - - - - -
0.5777 8000 0.0177 0.6439 0.3972 0.7947 0.5342 0.6556 0.2936 0.5532
0.5791 8020 0.0197 - - - - - - -
0.5805 8040 0.0195 - - - - - - -
0.5820 8060 0.0185 - - - - - - -
0.5834 8080 0.0191 - - - - - - -
0.5849 8100 0.0187 - - - - - - -
0.5863 8120 0.0182 - - - - - - -
0.5878 8140 0.0181 - - - - - - -
0.5892 8160 0.019 - - - - - - -
0.5907 8180 0.0189 - - - - - - -
0.5921 8200 0.0197 - - - - - - -
0.5935 8220 0.0183 - - - - - - -
0.5950 8240 0.0191 - - - - - - -
0.5964 8260 0.0188 - - - - - - -
0.5979 8280 0.0195 - - - - - - -
0.5993 8300 0.0191 - - - - - - -
0.6008 8320 0.0185 - - - - - - -
0.6022 8340 0.0185 - - - - - - -
0.6037 8360 0.0186 - - - - - - -
0.6051 8380 0.0178 - - - - - - -
0.6065 8400 0.0182 - - - - - - -
0.6080 8420 0.0196 - - - - - - -
0.6094 8440 0.019 - - - - - - -
0.6109 8460 0.0198 - - - - - - -
0.6123 8480 0.0188 - - - - - - -
0.6138 8500 0.0192 0.6490 0.3983 0.7778 0.5157 0.6601 0.2980 0.5498
0.6152 8520 0.0186 - - - - - - -
0.6167 8540 0.0194 - - - - - - -
0.6181 8560 0.0188 - - - - - - -
0.6195 8580 0.0193 - - - - - - -
0.6210 8600 0.0185 - - - - - - -
0.6224 8620 0.0194 - - - - - - -
0.6239 8640 0.0187 - - - - - - -
0.6253 8660 0.0194 - - - - - - -
0.6268 8680 0.018 - - - - - - -
0.6282 8700 0.0182 - - - - - - -
0.6296 8720 0.0191 - - - - - - -
0.6311 8740 0.0179 - - - - - - -
0.6325 8760 0.0191 - - - - - - -
0.6340 8780 0.0197 - - - - - - -
0.6354 8800 0.0188 - - - - - - -
0.6369 8820 0.0188 - - - - - - -
0.6383 8840 0.018 - - - - - - -
0.6398 8860 0.0188 - - - - - - -
0.6412 8880 0.0193 - - - - - - -
0.6426 8900 0.0181 - - - - - - -
0.6441 8920 0.0187 - - - - - - -
0.6455 8940 0.0187 - - - - - - -
0.6470 8960 0.0183 - - - - - - -
0.6484 8980 0.0189 - - - - - - -
0.6499 9000 0.0186 0.6369 0.4054 0.7856 0.5233 0.6619 0.2899 0.5505
0.6513 9020 0.0187 - - - - - - -
0.6528 9040 0.0192 - - - - - - -
0.6542 9060 0.0188 - - - - - - -
0.6556 9080 0.0192 - - - - - - -
0.6571 9100 0.0182 - - - - - - -
0.6585 9120 0.019 - - - - - - -
0.6600 9140 0.0181 - - - - - - -
0.6614 9160 0.0182 - - - - - - -
0.6629 9180 0.0191 - - - - - - -
0.6643 9200 0.0183 - - - - - - -
0.6658 9220 0.019 - - - - - - -
0.6672 9240 0.019 - - - - - - -
0.6686 9260 0.0184 - - - - - - -
0.6701 9280 0.0187 - - - - - - -
0.6715 9300 0.0182 - - - - - - -
0.6730 9320 0.0191 - - - - - - -
0.6744 9340 0.0187 - - - - - - -
0.6759 9360 0.0194 - - - - - - -
0.6773 9380 0.0196 - - - - - - -
0.6787 9400 0.0181 - - - - - - -
0.6802 9420 0.0188 - - - - - - -
0.6816 9440 0.0189 - - - - - - -
0.6831 9460 0.0189 - - - - - - -
0.6845 9480 0.0183 - - - - - - -
0.6860 9500 0.0196 0.6380 0.3851 0.7799 0.5238 0.6547 0.2905 0.5453
0.6874 9520 0.0181 - - - - - - -
0.6889 9540 0.0177 - - - - - - -
0.6903 9560 0.0188 - - - - - - -
0.6917 9580 0.0188 - - - - - - -
0.6932 9600 0.018 - - - - - - -
0.6946 9620 0.0194 - - - - - - -
0.6961 9640 0.0183 - - - - - - -
0.6975 9660 0.0188 - - - - - - -
0.6990 9680 0.0172 - - - - - - -
0.7004 9700 0.02 - - - - - - -
0.7019 9720 0.0182 - - - - - - -
0.7033 9740 0.019 - - - - - - -
0.7047 9760 0.0184 - - - - - - -
0.7062 9780 0.0182 - - - - - - -
0.7076 9800 0.0197 - - - - - - -
0.7091 9820 0.0183 - - - - - - -
0.7105 9840 0.0187 - - - - - - -
0.7120 9860 0.0188 - - - - - - -
0.7134 9880 0.0191 - - - - - - -
0.7149 9900 0.0181 - - - - - - -
0.7163 9920 0.0187 - - - - - - -
0.7177 9940 0.0184 - - - - - - -
0.7192 9960 0.018 - - - - - - -
0.7206 9980 0.0195 - - - - - - -
0.7221 10000 0.0185 0.6482 0.3947 0.7846 0.5298 0.6606 0.2917 0.5516
0.7235 10020 0.0195 - - - - - - -
0.7250 10040 0.019 - - - - - - -
0.7264 10060 0.0191 - - - - - - -
0.7279 10080 0.0187 - - - - - - -
0.7293 10100 0.0181 - - - - - - -
0.7307 10120 0.0181 - - - - - - -
0.7322 10140 0.0186 - - - - - - -
0.7336 10160 0.0174 - - - - - - -
0.7351 10180 0.0194 - - - - - - -
0.7365 10200 0.0177 - - - - - - -
0.7380 10220 0.0193 - - - - - - -
0.7394 10240 0.0189 - - - - - - -
0.7408 10260 0.0184 - - - - - - -
0.7423 10280 0.0184 - - - - - - -
0.7437 10300 0.0185 - - - - - - -
0.7452 10320 0.018 - - - - - - -
0.7466 10340 0.0186 - - - - - - -
0.7481 10360 0.0177 - - - - - - -
0.7495 10380 0.0192 - - - - - - -
0.7510 10400 0.0183 - - - - - - -
0.7524 10420 0.0193 - - - - - - -
0.7538 10440 0.019 - - - - - - -
0.7553 10460 0.0179 - - - - - - -
0.7567 10480 0.0181 - - - - - - -
0.7582 10500 0.0189 0.6356 0.4051 0.7820 0.5329 0.6592 0.2945 0.5516
0.7596 10520 0.0192 - - - - - - -
0.7611 10540 0.0183 - - - - - - -
0.7625 10560 0.0187 - - - - - - -
0.7640 10580 0.0186 - - - - - - -
0.7654 10600 0.0187 - - - - - - -
0.7668 10620 0.0191 - - - - - - -
0.7683 10640 0.0181 - - - - - - -
0.7697 10660 0.0186 - - - - - - -
0.7712 10680 0.0193 - - - - - - -
0.7726 10700 0.0185 - - - - - - -
0.7741 10720 0.0181 - - - - - - -
0.7755 10740 0.0186 - - - - - - -
0.7770 10760 0.019 - - - - - - -
0.7784 10780 0.0172 - - - - - - -
0.7798 10800 0.0192 - - - - - - -
0.7813 10820 0.0183 - - - - - - -
0.7827 10840 0.0186 - - - - - - -
0.7842 10860 0.0191 - - - - - - -
0.7856 10880 0.0184 - - - - - - -
0.7871 10900 0.0188 - - - - - - -
0.7885 10920 0.0183 - - - - - - -
0.7899 10940 0.0178 - - - - - - -
0.7914 10960 0.0182 - - - - - - -
0.7928 10980 0.0177 - - - - - - -
0.7943 11000 0.0187 0.6390 0.4119 0.7876 0.5334 0.6384 0.2980 0.5514
0.7957 11020 0.0186 - - - - - - -
0.7972 11040 0.0186 - - - - - - -
0.7986 11060 0.0183 - - - - - - -
0.8001 11080 0.0179 - - - - - - -
0.8015 11100 0.0188 - - - - - - -
0.8029 11120 0.0186 - - - - - - -
0.8044 11140 0.0176 - - - - - - -
0.8058 11160 0.0185 - - - - - - -
0.8073 11180 0.0187 - - - - - - -
0.8087 11200 0.0179 - - - - - - -
0.8102 11220 0.0178 - - - - - - -
0.8116 11240 0.0186 - - - - - - -
0.8131 11260 0.0179 - - - - - - -
0.8145 11280 0.0181 - - - - - - -
0.8159 11300 0.0191 - - - - - - -
0.8174 11320 0.0187 - - - - - - -
0.8188 11340 0.0185 - - - - - - -
0.8203 11360 0.0178 - - - - - - -
0.8217 11380 0.018 - - - - - - -
0.8232 11400 0.0182 - - - - - - -
0.8246 11420 0.018 - - - - - - -
0.8261 11440 0.018 - - - - - - -
0.8275 11460 0.0184 - - - - - - -
0.8289 11480 0.0175 - - - - - - -
0.8304 11500 0.0181 0.6360 0.4185 0.7888 0.5268 0.6678 0.2930 0.5552
0.8318 11520 0.0176 - - - - - - -
0.8333 11540 0.0183 - - - - - - -
0.8347 11560 0.0182 - - - - - - -
0.8362 11580 0.0189 - - - - - - -
0.8376 11600 0.0188 - - - - - - -
0.8390 11620 0.0182 - - - - - - -
0.8405 11640 0.0189 - - - - - - -
0.8419 11660 0.0181 - - - - - - -
0.8434 11680 0.0178 - - - - - - -
0.8448 11700 0.0183 - - - - - - -
0.8463 11720 0.018 - - - - - - -
0.8477 11740 0.0181 - - - - - - -
0.8492 11760 0.0182 - - - - - - -
0.8506 11780 0.0192 - - - - - - -
0.8520 11800 0.0188 - - - - - - -
0.8535 11820 0.0188 - - - - - - -
0.8549 11840 0.018 - - - - - - -
0.8564 11860 0.0179 - - - - - - -
0.8578 11880 0.0174 - - - - - - -
0.8593 11900 0.018 - - - - - - -
0.8607 11920 0.0176 - - - - - - -
0.8622 11940 0.0175 - - - - - - -
0.8636 11960 0.0187 - - - - - - -
0.8650 11980 0.0182 - - - - - - -
0.8665 12000 0.0185 0.6476 0.4064 0.8021 0.5229 0.6482 0.2936 0.5535
0.8679 12020 0.0191 - - - - - - -
0.8694 12040 0.0188 - - - - - - -
0.8708 12060 0.0177 - - - - - - -
0.8723 12080 0.0188 - - - - - - -
0.8737 12100 0.018 - - - - - - -
0.8752 12120 0.0177 - - - - - - -
0.8766 12140 0.0184 - - - - - - -
0.8780 12160 0.0199 - - - - - - -
0.8795 12180 0.0182 - - - - - - -
0.8809 12200 0.0182 - - - - - - -
0.8824 12220 0.0189 - - - - - - -
0.8838 12240 0.0189 - - - - - - -
0.8853 12260 0.0184 - - - - - - -
0.8867 12280 0.0178 - - - - - - -
0.8882 12300 0.0179 - - - - - - -
0.8896 12320 0.0177 - - - - - - -
0.8910 12340 0.0185 - - - - - - -
0.8925 12360 0.0181 - - - - - - -
0.8939 12380 0.0183 - - - - - - -
0.8954 12400 0.018 - - - - - - -
0.8968 12420 0.0176 - - - - - - -
0.8983 12440 0.0186 - - - - - - -
0.8997 12460 0.0184 - - - - - - -
0.9011 12480 0.0193 - - - - - - -
0.9026 12500 0.018 0.6434 0.4223 0.8035 0.5354 0.6496 0.2928 0.5578
0.9040 12520 0.0183 - - - - - - -
0.9055 12540 0.0188 - - - - - - -
0.9069 12560 0.0178 - - - - - - -
0.9084 12580 0.0187 - - - - - - -
0.9098 12600 0.019 - - - - - - -
0.9113 12620 0.0177 - - - - - - -
0.9127 12640 0.0185 - - - - - - -
0.9141 12660 0.0176 - - - - - - -
0.9156 12680 0.0185 - - - - - - -
0.9170 12700 0.0188 - - - - - - -
0.9185 12720 0.0177 - - - - - - -
0.9199 12740 0.0174 - - - - - - -
0.9214 12760 0.0183 - - - - - - -
0.9228 12780 0.0196 - - - - - - -
0.9243 12800 0.0185 - - - - - - -
0.9257 12820 0.0178 - - - - - - -
0.9271 12840 0.0187 - - - - - - -
0.9286 12860 0.0184 - - - - - - -
0.9300 12880 0.0187 - - - - - - -
0.9315 12900 0.0178 - - - - - - -
0.9329 12920 0.0186 - - - - - - -
0.9344 12940 0.0193 - - - - - - -
0.9358 12960 0.0181 - - - - - - -
0.9373 12980 0.0182 - - - - - - -
0.9387 13000 0.0184 0.6505 0.4242 0.8013 0.5332 0.6408 0.2964 0.5577
0.9401 13020 0.0184 - - - - - - -
0.9416 13040 0.019 - - - - - - -
0.9430 13060 0.0177 - - - - - - -
0.9445 13080 0.0182 - - - - - - -
0.9459 13100 0.0183 - - - - - - -
0.9474 13120 0.0176 - - - - - - -
0.9488 13140 0.0178 - - - - - - -
0.9502 13160 0.0183 - - - - - - -
0.9517 13180 0.0187 - - - - - - -
0.9531 13200 0.0177 - - - - - - -
0.9546 13220 0.0185 - - - - - - -
0.9560 13240 0.0192 - - - - - - -
0.9575 13260 0.0183 - - - - - - -
0.9589 13280 0.0177 - - - - - - -
0.9604 13300 0.0185 - - - - - - -
0.9618 13320 0.0173 - - - - - - -
0.9632 13340 0.0175 - - - - - - -
0.9647 13360 0.0189 - - - - - - -
0.9661 13380 0.0181 - - - - - - -
0.9676 13400 0.0186 - - - - - - -
0.9690 13420 0.0177 - - - - - - -
0.9705 13440 0.0186 - - - - - - -
0.9719 13460 0.0185 - - - - - - -
0.9734 13480 0.0183 - - - - - - -
0.9748 13500 0.0193 0.6463 0.4256 0.8032 0.5340 0.6405 0.2994 0.5582
0.9762 13520 0.0177 - - - - - - -
0.9777 13540 0.0182 - - - - - - -
0.9791 13560 0.0177 - - - - - - -
0.9806 13580 0.0181 - - - - - - -
0.9820 13600 0.0182 - - - - - - -
0.9835 13620 0.0186 - - - - - - -
0.9849 13640 0.018 - - - - - - -
0.9864 13660 0.0181 - - - - - - -
0.9878 13680 0.0178 - - - - - - -
0.9892 13700 0.0179 - - - - - - -
0.9907 13720 0.0181 - - - - - - -
0.9921 13740 0.0181 - - - - - - -
0.9936 13760 0.0184 - - - - - - -
0.9950 13780 0.0183 - - - - - - -
0.9965 13800 0.0196 - - - - - - -
0.9979 13820 0.0177 - - - - - - -
0.9994 13840 0.0181 - - - - - - -

Framework Versions

  • Python: 3.11.12
  • Sentence Transformers: 4.0.2
  • PyLate: 1.2.0
  • Transformers: 4.48.2
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.6.0
  • Datasets: 3.6.0
  • Tokenizers: 0.21.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084"
}

PyLate

@misc{PyLate,
    title={PyLate: Flexible Training and Retrieval for Late Interaction Models},
    author={Chaffin, Antoine and Sourty, Raphaël},
    url={https://github.com/lightonai/pylate},
    year={2024}
}