SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2

This sentence-transformers model is a multilingual student model supporting English, Spanish, and Portuguese. The student adopts the efficient MiniLM architecture used by sentence-transformers/all-MiniLM-L6-v2, chosen for its balance of quality and compact size, and was trained by distilling knowledge from the high-performance English embedding model BAAI/bge-small-en-v1.5, which served as the teacher. The resulting model maps sentences and paragraphs to a 384-dimensional dense vector space and is suitable for tasks such as semantic textual similarity, semantic search, paraphrase mining, text classification, and clustering.
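
To make the distillation setup concrete, the sketch below illustrates the objective rather than the actual training loop: the frozen teacher embeds an English sentence, and the student is trained, via mean squared error, to reproduce that vector for both the English sentence and its translation. The sentence pair here is hypothetical.

from sentence_transformers import SentenceTransformer
import torch

# Frozen English teacher and the compact student described above
teacher = SentenceTransformer("BAAI/bge-small-en-v1.5")
student = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

english = "The cat sits on the mat."               # hypothetical pair, not from the training data
spanish = "El gato está sentado en la alfombra."

# The teacher embedding of the English sentence is the regression target;
# the student should land near it for BOTH the sentence and its translation.
target = teacher.encode(english, convert_to_tensor=True)
pred_en = student.encode(english, convert_to_tensor=True)
pred_es = student.encode(spanish, convert_to_tensor=True)

mse = torch.nn.functional.mse_loss(pred_en, target) + torch.nn.functional.mse_loss(pred_es, target)
print(float(mse))   # the training objective drives this toward zero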

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 128 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Number of Parameters: 22.7M (F32)

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
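
The three modules run in sequence: the BERT backbone produces per-token embeddings, Pooling mean-averages them into a single 384-dimensional sentence vector, and Normalize rescales it to unit length. A practical consequence, sketched below with illustrative sentences, is that cosine similarity reduces to a plain dot product:

import torch
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br-v2")
emb = model.encode(["Hello world", "Hola mundo"], convert_to_tensor=True)

# The Normalize() module guarantees unit L2 norm...
print(torch.linalg.vector_norm(emb, dim=1))   # tensor([1.0000, 1.0000])

# ...so the dot product already equals the cosine similarity.
print(emb[0] @ emb[1])
print(torch.nn.functional.cosine_similarity(emb[0], emb[1], dim=0))   # same value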

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br-v2")
# Run inference
sentences = [
    'The Social Security takes another 8.7 billion euros from the Backup Fund',
    'La Seguridad Social saca otros 8.700 millones del Fondo de Reserva',
    'Tiene unos 44.000 millones de barriles de reservas de petroleo, y 54 billones de pies cubicos de reservas de gas natural.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
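
Because all three languages share a single vector space, the model supports cross-lingual semantic search directly. The example below is a sketch with a hypothetical mixed-language corpus, using the library's util.semantic_search helper:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br-v2")

# Hypothetical corpus mixing Spanish, Portuguese, and English
corpus = [
    "La Seguridad Social saca otros 8.700 millones del Fondo de Reserva",
    "O gato está dormindo no sofá.",
    "The weather is nice today.",
]
query = "Social Security withdraws billions from the Reserve Fund"

corpus_emb = model.encode(corpus, convert_to_tensor=True)
query_emb = model.encode(query, convert_to_tensor=True)

# Top-2 corpus sentences closest to the English query, regardless of language
hits = util.semantic_search(query_emb, corpus_emb, top_k=2)[0]
for hit in hits:
    print(round(hit["score"], 4), corpus[hit["corpus_id"]])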

Evaluation

Metrics

Semantic Similarity

Metric           stsb_multi_mt-pt   stsb_multi_mt-en   stsb_multi_mt-es
pearson_cosine   0.8087             0.8360             0.8209
spearman_cosine  0.8245             0.8542             0.8434
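
These figures correspond to the metrics reported by the library's EmbeddingSimilarityEvaluator on the three stsb_multi_mt subsets. The sketch below shows how one column could be reproduced, assuming the PhilipMay/stsb_multi_mt dataset on the Hub, its 0-5 similarity_score column, and the test split:

from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br-v2")

# Spanish subset; gold scores are 0-5, so rescale them to [0, 1]
ds = load_dataset("PhilipMay/stsb_multi_mt", "es", split="test")
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=ds["sentence1"],
    sentences2=ds["sentence2"],
    scores=[s / 5.0 for s in ds["similarity_score"]],
    name="stsb_multi_mt-es",
)
print(evaluator(model))   # dict containing pearson_cosine and spearman_cosine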

Training Details

Training Dataset

Unnamed Dataset

  • Size: 37,000,000 training samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    Column       Type    Details
    english      string  min: 3 tokens, mean: 21.97 tokens, max: 103 tokens
    non_english  string  min: 3 tokens, mean: 23.91 tokens, max: 118 tokens
    label        list    size: 384 elements

  • Samples:

    english:     It also calls on the UN Secretary-General, to present by 15 April to the Security Council “options” for the deployment of an international police force.
    non_english: Asimismo, solicita al Secretario General de las Naciones Unidas que presente al Consejo de Seguridad, antes del 15 de abril, algunas "propuestas" para el despliegue de fuerzas de policía internacional.
    label:       [-0.04722730070352554, -0.025426536798477173, 0.04836353287100792, -0.04443460330367088, 0.06477425247430801, ...]

    english:     The Viacom Services governed by this privacy policy are generally not intended for use by children.
    non_english: Los Servicios de Viacom que están regidos por esta política de privacidad, por lo general, no están destinados a menores de edad.
    label:       [0.0823400542140007, -0.004498262889683247, 0.023361222818493843, -0.07224256545305252, -0.0026566446758806705, ...]

    english:     - You gotta promise me, doc.
    non_english: - Prometa-me, Doutor.
    label:       [-0.010264741256833076, 0.004426243249326944, 0.06644191592931747, -0.03601944074034691, 0.009492351673543453, ...]
  • Loss: MSELoss
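
The label column stores the teacher's 384-dimensional embedding of the English sentence, which MSELoss then regresses the student's outputs onto for both text columns. A hypothetical sketch of producing one such row (teacher as named in the introduction; the pair is taken from the samples above):

from datasets import Dataset
from sentence_transformers import SentenceTransformer

teacher = SentenceTransformer("BAAI/bge-small-en-v1.5")

pairs = [
    ("- You gotta promise me, doc.", "- Prometa-me, Doutor."),
]
rows = [
    {"english": en, "non_english": tr, "label": teacher.encode(en).tolist()}
    for en, tr in pairs
]
train_dataset = Dataset.from_list(rows)
print(train_dataset.column_names)   # ['english', 'non_english', 'label']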

Evaluation Dataset

Unnamed Dataset

  • Size: 701,304 evaluation samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    Column       Type    Details
    english      string  min: 4 tokens, mean: 22.86 tokens, max: 102 tokens
    non_english  string  min: 5 tokens, mean: 24.97 tokens, max: 100 tokens
    label        list    size: 384 elements

  • Samples:

    english:     Manufacturer: Rourke Educational Media
    non_english: Gravadora: Rourke Meios Educativos
    label:       [0.016672328114509583, 0.025462372228503227, 0.024576706811785698, -0.01961815170943737, 0.014413068071007729, ...]

    english:     What in hells success if it isnt right there in your Stevenson sonnet, which outranks Henleys Apparition, in that Love-cycle, in those sea- poems?
    non_english: ¿Qué demonios es el éxito sino lo que hay en su soneto sobre Stevenson, superior a la Aparición de Henley, o en su Ciclo del amor y en sus Poemas del mar?
    label:       [0.0172938983887434, -0.04857725650072098, -0.05557125806808472, -0.012614483945071697, -0.014296879060566425, ...]

    english:     Everyone knows, and you know already that you are existing; there is no need, it is futile.
    non_english: Todo mundo sabe, e você já sabe que está existindo; não há nenhuma necessidade disso, isso é fútil.
    label:       [-0.005980388727039099, -0.02314012683928013, 0.022277960553765297, -0.008318797685205936, -0.0034421393647789955, ...]
  • Loss: MSELoss

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 512
  • per_device_eval_batch_size: 512
  • gradient_accumulation_steps: 2
  • learning_rate: 0.0003
  • num_train_epochs: 8
  • warmup_ratio: 0.15
  • bf16: True
  • dataloader_num_workers: 8
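
These values map one-to-one onto SentenceTransformerTrainingArguments. The sketch below wires them up for a reproduction run; output_dir is illustrative, and train_dataset stands for a dataset with the english/non_english/label columns described above:

from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MSELoss

student = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

args = SentenceTransformerTrainingArguments(
    output_dir="output",                 # illustrative, not from the original run
    eval_strategy="steps",
    per_device_train_batch_size=512,
    per_device_eval_batch_size=512,
    gradient_accumulation_steps=2,
    learning_rate=3e-4,
    num_train_epochs=8,
    warmup_ratio=0.15,
    bf16=True,
    dataloader_num_workers=8,
)

# Wiring, assuming train_dataset / eval_dataset with english, non_english, label columns:
# trainer = SentenceTransformerTrainer(
#     model=student,
#     args=args,
#     train_dataset=train_dataset,
#     eval_dataset=eval_dataset,
#     loss=MSELoss(model=student),
# )
# trainer.train()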

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 512
  • per_device_eval_batch_size: 512
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 2
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 0.0003
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 8
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.15
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 8
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss stsb_multi_mt-pt_spearman_cosine stsb_multi_mt-en_spearman_cosine stsb_multi_mt-es_spearman_cosine
0.0138 500 0.007 - - - -
0.0277 1000 0.0048 - - - -
0.0415 1500 0.0042 - - - -
0.0554 2000 0.0038 - - - -
0.0692 2500 0.0035 - - - -
0.0830 3000 0.0034 - - - -
0.0969 3500 0.0032 - - - -
0.1107 4000 0.0031 - - - -
0.1245 4500 0.003 - - - -
0.1384 5000 0.0029 0.0014 0.3870 0.4251 0.4009
0.1522 5500 0.0027 - - - -
0.1661 6000 0.0026 - - - -
0.1799 6500 0.0025 - - - -
0.1937 7000 0.0024 - - - -
0.2076 7500 0.0024 - - - -
0.2214 8000 0.0023 - - - -
0.2352 8500 0.0022 - - - -
0.2491 9000 0.0021 - - - -
0.2629 9500 0.0021 - - - -
0.2768 10000 0.002 0.0009 0.5930 0.6429 0.6127
0.2906 10500 0.0019 - - - -
0.3044 11000 0.0019 - - - -
0.3183 11500 0.0018 - - - -
0.3321 12000 0.0018 - - - -
0.3459 12500 0.0017 - - - -
0.3598 13000 0.0017 - - - -
0.3736 13500 0.0016 - - - -
0.3875 14000 0.0016 - - - -
0.4013 14500 0.0015 - - - -
0.4151 15000 0.0015 0.0007 0.6874 0.7547 0.7104
0.4290 15500 0.0015 - - - -
0.4428 16000 0.0014 - - - -
0.4566 16500 0.0014 - - - -
0.4705 17000 0.0014 - - - -
0.4843 17500 0.0013 - - - -
0.4982 18000 0.0013 - - - -
0.5120 18500 0.0013 - - - -
0.5258 19000 0.0013 - - - -
0.5397 19500 0.0012 - - - -
0.5535 20000 0.0012 0.0006 0.7185 0.7866 0.7445
0.5673 20500 0.0012 - - - -
0.5812 21000 0.0012 - - - -
0.5950 21500 0.0012 - - - -
0.6089 22000 0.0012 - - - -
0.6227 22500 0.0012 - - - -
0.6365 23000 0.0011 - - - -
0.6504 23500 0.0011 - - - -
0.6642 24000 0.0011 - - - -
0.6781 24500 0.0011 - - - -
0.6919 25000 0.0011 0.0005 0.7381 0.8008 0.7642
0.7057 25500 0.0011 - - - -
0.7196 26000 0.0011 - - - -
0.7334 26500 0.0011 - - - -
0.7472 27000 0.0011 - - - -
0.7611 27500 0.001 - - - -
0.7749 28000 0.001 - - - -
0.7888 28500 0.001 - - - -
0.8026 29000 0.001 - - - -
0.8164 29500 0.001 - - - -
0.8303 30000 0.001 0.0005 0.7555 0.8184 0.7799
0.8441 30500 0.001 - - - -
0.8579 31000 0.001 - - - -
0.8718 31500 0.001 - - - -
0.8856 32000 0.001 - - - -
0.8995 32500 0.001 - - - -
0.9133 33000 0.001 - - - -
0.9271 33500 0.001 - - - -
0.9410 34000 0.001 - - - -
0.9548 34500 0.001 - - - -
0.9686 35000 0.001 0.0004 0.7628 0.8230 0.7883
0.9825 35500 0.001 - - - -
0.9963 36000 0.0009 - - - -
1.0102 36500 0.0009 - - - -
1.0240 37000 0.0009 - - - -
1.0378 37500 0.0009 - - - -
1.0517 38000 0.0009 - - - -
1.0655 38500 0.0009 - - - -
1.0793 39000 0.0009 - - - -
1.0932 39500 0.0009 - - - -
1.1070 40000 0.0009 0.0004 0.7701 0.8313 0.7975
1.1209 40500 0.0009 - - - -
1.1347 41000 0.0009 - - - -
1.1485 41500 0.0009 - - - -
1.1624 42000 0.0009 - - - -
1.1762 42500 0.0009 - - - -
1.1900 43000 0.0009 - - - -
1.2039 43500 0.0009 - - - -
1.2177 44000 0.0009 - - - -
1.2316 44500 0.0009 - - - -
1.2454 45000 0.0009 0.0004 0.7747 0.8344 0.7990
1.2592 45500 0.0009 - - - -
1.2731 46000 0.0009 - - - -
1.2869 46500 0.0009 - - - -
1.3008 47000 0.0009 - - - -
1.3146 47500 0.0009 - - - -
1.3284 48000 0.0009 - - - -
1.3423 48500 0.0009 - - - -
1.3561 49000 0.0009 - - - -
1.3699 49500 0.0009 - - - -
1.3838 50000 0.0009 0.0004 0.7824 0.8372 0.8111
1.3976 50500 0.0009 - - - -
1.4115 51000 0.0009 - - - -
1.4253 51500 0.0009 - - - -
1.4391 52000 0.0009 - - - -
1.4530 52500 0.0009 - - - -
1.4668 53000 0.0009 - - - -
1.4806 53500 0.0009 - - - -
1.4945 54000 0.0008 - - - -
1.5083 54500 0.0008 - - - -
1.5222 55000 0.0008 0.0004 0.7882 0.8426 0.8149
1.5360 55500 0.0008 - - - -
1.5498 56000 0.0008 - - - -
1.5637 56500 0.0008 - - - -
1.5775 57000 0.0008 - - - -
1.5913 57500 0.0008 - - - -
1.6052 58000 0.0008 - - - -
1.6190 58500 0.0008 - - - -
1.6329 59000 0.0008 - - - -
1.6467 59500 0.0008 - - - -
1.6605 60000 0.0008 0.0004 0.7969 0.8453 0.8242
1.6744 60500 0.0008 - - - -
1.6882 61000 0.0008 - - - -
1.7020 61500 0.0008 - - - -
1.7159 62000 0.0008 - - - -
1.7297 62500 0.0008 - - - -
1.7436 63000 0.0008 - - - -
1.7574 63500 0.0008 - - - -
1.7712 64000 0.0008 - - - -
1.7851 64500 0.0008 - - - -
1.7989 65000 0.0008 0.0004 0.7970 0.8473 0.8254
1.8127 65500 0.0008 - - - -
1.8266 66000 0.0008 - - - -
1.8404 66500 0.0008 - - - -
1.8543 67000 0.0008 - - - -
1.8681 67500 0.0008 - - - -
1.8819 68000 0.0008 - - - -
1.8958 68500 0.0008 - - - -
1.9096 69000 0.0008 - - - -
1.9234 69500 0.0008 - - - -
1.9373 70000 0.0008 0.0004 0.7959 0.8459 0.8234
1.9511 70500 0.0008 - - - -
1.9650 71000 0.0008 - - - -
1.9788 71500 0.0008 - - - -
1.9926 72000 0.0008 - - - -
2.0065 72500 0.0008 - - - -
2.0203 73000 0.0008 - - - -
2.0342 73500 0.0008 - - - -
2.0480 74000 0.0008 - - - -
2.0618 74500 0.0008 - - - -
2.0757 75000 0.0008 0.0004 0.7995 0.8489 0.8277
2.0895 75500 0.0008 - - - -
2.1033 76000 0.0008 - - - -
2.1172 76500 0.0008 - - - -
2.1310 77000 0.0008 - - - -
2.1449 77500 0.0008 - - - -
2.1587 78000 0.0008 - - - -
2.1725 78500 0.0008 - - - -
2.1864 79000 0.0008 - - - -
2.2002 79500 0.0008 - - - -
2.2140 80000 0.0008 0.0004 0.7994 0.8462 0.8237
2.2279 80500 0.0008 - - - -
2.2417 81000 0.0008 - - - -
2.2556 81500 0.0008 - - - -
2.2694 82000 0.0008 - - - -
2.2832 82500 0.0008 - - - -
2.2971 83000 0.0008 - - - -
2.3109 83500 0.0008 - - - -
2.3247 84000 0.0008 - - - -
2.3386 84500 0.0008 - - - -
2.3524 85000 0.0008 0.0004 0.8074 0.8499 0.8300
2.3663 85500 0.0008 - - - -
2.3801 86000 0.0008 - - - -
2.3939 86500 0.0008 - - - -
2.4078 87000 0.0008 - - - -
2.4216 87500 0.0008 - - - -
2.4354 88000 0.0008 - - - -
2.4493 88500 0.0008 - - - -
2.4631 89000 0.0008 - - - -
2.4770 89500 0.0008 - - - -
2.4908 90000 0.0008 0.0003 0.8088 0.8506 0.8315
2.5046 90500 0.0008 - - - -
2.5185 91000 0.0008 - - - -
2.5323 91500 0.0008 - - - -
2.5461 92000 0.0008 - - - -
2.5600 92500 0.0008 - - - -
2.5738 93000 0.0008 - - - -
2.5877 93500 0.0008 - - - -
2.6015 94000 0.0008 - - - -
2.6153 94500 0.0008 - - - -
2.6292 95000 0.0008 0.0003 0.8094 0.8518 0.8337
2.6430 95500 0.0008 - - - -
2.6569 96000 0.0008 - - - -
2.6707 96500 0.0008 - - - -
2.6845 97000 0.0008 - - - -
2.6984 97500 0.0008 - - - -
2.7122 98000 0.0008 - - - -
2.7260 98500 0.0008 - - - -
2.7399 99000 0.0008 - - - -
2.7537 99500 0.0008 - - - -
2.7676 100000 0.0008 0.0003 0.8083 0.8514 0.8303
2.7814 100500 0.0008 - - - -
2.7952 101000 0.0008 - - - -
2.8091 101500 0.0008 - - - -
2.8229 102000 0.0008 - - - -
2.8367 102500 0.0008 - - - -
2.8506 103000 0.0008 - - - -
2.8644 103500 0.0008 - - - -
2.8783 104000 0.0008 - - - -
2.8921 104500 0.0008 - - - -
2.9059 105000 0.0008 0.0003 0.8126 0.8521 0.8352
2.9198 105500 0.0008 - - - -
2.9336 106000 0.0008 - - - -
2.9474 106500 0.0008 - - - -
2.9613 107000 0.0008 - - - -
2.9751 107500 0.0008 - - - -
2.9890 108000 0.0008 - - - -
3.0028 108500 0.0008 - - - -
3.0166 109000 0.0008 - - - -
3.0305 109500 0.0008 - - - -
3.0443 110000 0.0008 0.0003 0.8149 0.8515 0.8340
3.0581 110500 0.0008 - - - -
3.0720 111000 0.0008 - - - -
3.0858 111500 0.0008 - - - -
3.0997 112000 0.0008 - - - -
3.1135 112500 0.0008 - - - -
3.1273 113000 0.0008 - - - -
3.1412 113500 0.0008 - - - -
3.1550 114000 0.0008 - - - -
3.1688 114500 0.0008 - - - -
3.1827 115000 0.0008 0.0003 0.8160 0.8527 0.8348
3.1965 115500 0.0008 - - - -
3.2104 116000 0.0008 - - - -
3.2242 116500 0.0008 - - - -
3.2380 117000 0.0008 - - - -
3.2519 117500 0.0008 - - - -
3.2657 118000 0.0008 - - - -
3.2796 118500 0.0008 - - - -
3.2934 119000 0.0008 - - - -
3.3072 119500 0.0008 - - - -
3.3211 120000 0.0008 0.0003 0.8176 0.8524 0.8359
3.3349 120500 0.0008 - - - -
3.3487 121000 0.0008 - - - -
3.3626 121500 0.0008 - - - -
3.3764 122000 0.0008 - - - -
3.3903 122500 0.0008 - - - -
3.4041 123000 0.0008 - - - -
3.4179 123500 0.0008 - - - -
3.4318 124000 0.0008 - - - -
3.4456 124500 0.0008 - - - -
3.4594 125000 0.0008 0.0003 0.8177 0.8541 0.8379
3.4733 125500 0.0008 - - - -
3.4871 126000 0.0008 - - - -
3.5010 126500 0.0008 - - - -
3.5148 127000 0.0008 - - - -
3.5286 127500 0.0008 - - - -
3.5425 128000 0.0008 - - - -
3.5563 128500 0.0008 - - - -
3.5701 129000 0.0008 - - - -
3.5840 129500 0.0008 - - - -
3.5978 130000 0.0008 0.0003 0.8162 0.8520 0.8371
3.6117 130500 0.0008 - - - -
3.6255 131000 0.0008 - - - -
3.6393 131500 0.0008 - - - -
3.6532 132000 0.0008 - - - -
3.6670 132500 0.0008 - - - -
3.6808 133000 0.0008 - - - -
3.6947 133500 0.0008 - - - -
3.7085 134000 0.0008 - - - -
3.7224 134500 0.0008 - - - -
3.7362 135000 0.0008 0.0003 0.8178 0.8542 0.8378
3.7500 135500 0.0008 - - - -
3.7639 136000 0.0008 - - - -
3.7777 136500 0.0008 - - - -
3.7915 137000 0.0008 - - - -
3.8054 137500 0.0008 - - - -
3.8192 138000 0.0008 - - - -
3.8331 138500 0.0008 - - - -
3.8469 139000 0.0008 - - - -
3.8607 139500 0.0008 - - - -
3.8746 140000 0.0008 0.0003 0.8214 0.8542 0.8408
3.8884 140500 0.0008 - - - -
3.9023 141000 0.0008 - - - -
3.9161 141500 0.0007 - - - -
3.9299 142000 0.0007 - - - -
3.9438 142500 0.0008 - - - -
3.9576 143000 0.0008 - - - -
3.9714 143500 0.0007 - - - -
3.9853 144000 0.0007 - - - -
3.9991 144500 0.0007 - - - -
4.0130 145000 0.0007 0.0003 0.8163 0.8521 0.8365
4.0268 145500 0.0007 - - - -
4.0406 146000 0.0007 - - - -
4.0545 146500 0.0007 - - - -
4.0683 147000 0.0007 - - - -
4.0821 147500 0.0007 - - - -
4.0960 148000 0.0007 - - - -
4.1098 148500 0.0007 - - - -
4.1237 149000 0.0007 - - - -
4.1375 149500 0.0007 - - - -
4.1513 150000 0.0007 0.0003 0.8183 0.8537 0.8374
4.1652 150500 0.0007 - - - -
4.1790 151000 0.0007 - - - -
4.1928 151500 0.0007 - - - -
4.2067 152000 0.0007 - - - -
4.2205 152500 0.0007 - - - -
4.2344 153000 0.0007 - - - -
4.2482 153500 0.0007 - - - -
4.2620 154000 0.0007 - - - -
4.2759 154500 0.0007 - - - -
4.2897 155000 0.0007 0.0003 0.8187 0.8525 0.8387
4.3035 155500 0.0007 - - - -
4.3174 156000 0.0007 - - - -
4.3312 156500 0.0007 - - - -
4.3451 157000 0.0007 - - - -
4.3589 157500 0.0007 - - - -
4.3727 158000 0.0007 - - - -
4.3866 158500 0.0007 - - - -
4.4004 159000 0.0007 - - - -
4.4142 159500 0.0007 - - - -
4.4281 160000 0.0007 0.0003 0.8152 0.8516 0.8359
4.4419 160500 0.0007 - - - -
4.4558 161000 0.0007 - - - -
4.4696 161500 0.0007 - - - -
4.4834 162000 0.0007 - - - -
4.4973 162500 0.0007 - - - -
4.5111 163000 0.0007 - - - -
4.5249 163500 0.0007 - - - -
4.5388 164000 0.0007 - - - -
4.5526 164500 0.0007 - - - -
4.5665 165000 0.0007 0.0003 0.8192 0.8532 0.8407
4.5803 165500 0.0007 - - - -
4.5941 166000 0.0007 - - - -
4.6080 166500 0.0007 - - - -
4.6218 167000 0.0007 - - - -
4.6357 167500 0.0007 - - - -
4.6495 168000 0.0007 - - - -
4.6633 168500 0.0007 - - - -
4.6772 169000 0.0007 - - - -
4.6910 169500 0.0007 - - - -
4.7048 170000 0.0007 0.0003 0.8205 0.8526 0.8393
4.7187 170500 0.0007 - - - -
4.7325 171000 0.0007 - - - -
4.7464 171500 0.0007 - - - -
4.7602 172000 0.0007 - - - -
4.7740 172500 0.0007 - - - -
4.7879 173000 0.0007 - - - -
4.8017 173500 0.0007 - - - -
4.8155 174000 0.0007 - - - -
4.8294 174500 0.0007 - - - -
4.8432 175000 0.0007 0.0003 0.8191 0.8524 0.8396
4.8571 175500 0.0007 - - - -
4.8709 176000 0.0007 - - - -
4.8847 176500 0.0007 - - - -
4.8986 177000 0.0007 - - - -
4.9124 177500 0.0007 - - - -
4.9262 178000 0.0007 - - - -
4.9401 178500 0.0007 - - - -
4.9539 179000 0.0007 - - - -
4.9678 179500 0.0007 - - - -
4.9816 180000 0.0007 0.0003 0.8202 0.8538 0.8426
4.9954 180500 0.0007 - - - -
5.0093 181000 0.0007 - - - -
5.0231 181500 0.0007 - - - -
5.0369 182000 0.0007 - - - -
5.0508 182500 0.0007 - - - -
5.0646 183000 0.0007 - - - -
5.0785 183500 0.0007 - - - -
5.0923 184000 0.0007 - - - -
5.1061 184500 0.0007 - - - -
5.1200 185000 0.0007 0.0003 0.8221 0.8548 0.8425
5.1338 185500 0.0007 - - - -
5.1476 186000 0.0007 - - - -
5.1615 186500 0.0007 - - - -
5.1753 187000 0.0007 - - - -
5.1892 187500 0.0007 - - - -
5.2030 188000 0.0007 - - - -
5.2168 188500 0.0007 - - - -
5.2307 189000 0.0007 - - - -
5.2445 189500 0.0007 - - - -
5.2584 190000 0.0007 0.0003 0.8205 0.8530 0.8401
5.2722 190500 0.0007 - - - -
5.2860 191000 0.0007 - - - -
5.2999 191500 0.0007 - - - -
5.3137 192000 0.0007 - - - -
5.3275 192500 0.0007 - - - -
5.3414 193000 0.0007 - - - -
5.3552 193500 0.0007 - - - -
5.3691 194000 0.0007 - - - -
5.3829 194500 0.0007 - - - -
5.3967 195000 0.0007 0.0003 0.8220 0.8526 0.8415
5.4106 195500 0.0007 - - - -
5.4244 196000 0.0007 - - - -
5.4382 196500 0.0007 - - - -
5.4521 197000 0.0007 - - - -
5.4659 197500 0.0007 - - - -
5.4798 198000 0.0007 - - - -
5.4936 198500 0.0007 - - - -
5.5074 199000 0.0007 - - - -
5.5213 199500 0.0007 - - - -
5.5351 200000 0.0007 0.0003 0.8187 0.8525 0.8395
5.5489 200500 0.0007 - - - -
5.5628 201000 0.0007 - - - -
5.5766 201500 0.0007 - - - -
5.5905 202000 0.0007 - - - -
5.6043 202500 0.0007 - - - -
5.6181 203000 0.0007 - - - -
5.6320 203500 0.0007 - - - -
5.6458 204000 0.0007 - - - -
5.6596 204500 0.0007 - - - -
5.6735 205000 0.0007 0.0003 0.8219 0.8531 0.8426
5.6873 205500 0.0007 - - - -
5.7012 206000 0.0007 - - - -
5.7150 206500 0.0007 - - - -
5.7288 207000 0.0007 - - - -
5.7427 207500 0.0007 - - - -
5.7565 208000 0.0007 - - - -
5.7703 208500 0.0007 - - - -
5.7842 209000 0.0007 - - - -
5.7980 209500 0.0007 - - - -
5.8119 210000 0.0007 0.0003 0.8226 0.8535 0.8413
5.8257 210500 0.0007 - - - -
5.8395 211000 0.0007 - - - -
5.8534 211500 0.0007 - - - -
5.8672 212000 0.0007 - - - -
5.8811 212500 0.0007 - - - -
5.8949 213000 0.0007 - - - -
5.9087 213500 0.0007 - - - -
5.9226 214000 0.0007 - - - -
5.9364 214500 0.0007 - - - -
5.9502 215000 0.0007 0.0003 0.8223 0.8542 0.8416
5.9641 215500 0.0007 - - - -
5.9779 216000 0.0007 - - - -
5.9918 216500 0.0007 - - - -
6.0056 217000 0.0007 - - - -
6.0194 217500 0.0007 - - - -
6.0333 218000 0.0007 - - - -
6.0471 218500 0.0007 - - - -
6.0609 219000 0.0007 - - - -
6.0748 219500 0.0007 - - - -
6.0886 220000 0.0007 0.0003 0.8215 0.8538 0.8416
6.1025 220500 0.0007 - - - -
6.1163 221000 0.0007 - - - -
6.1301 221500 0.0007 - - - -
6.1440 222000 0.0007 - - - -
6.1578 222500 0.0007 - - - -
6.1716 223000 0.0007 - - - -
6.1855 223500 0.0007 - - - -
6.1993 224000 0.0007 - - - -
6.2132 224500 0.0007 - - - -
6.2270 225000 0.0007 0.0003 0.8243 0.8545 0.8415
6.2408 225500 0.0007 - - - -
6.2547 226000 0.0007 - - - -
6.2685 226500 0.0007 - - - -
6.2823 227000 0.0007 - - - -
6.2962 227500 0.0007 - - - -
6.3100 228000 0.0007 - - - -
6.3239 228500 0.0007 - - - -
6.3377 229000 0.0007 - - - -
6.3515 229500 0.0007 - - - -
6.3654 230000 0.0007 0.0003 0.8234 0.8539 0.8418
6.3792 230500 0.0007 - - - -
6.3930 231000 0.0007 - - - -
6.4069 231500 0.0007 - - - -
6.4207 232000 0.0007 - - - -
6.4346 232500 0.0007 - - - -
6.4484 233000 0.0007 - - - -
6.4622 233500 0.0007 - - - -
6.4761 234000 0.0007 - - - -
6.4899 234500 0.0007 - - - -
6.5038 235000 0.0007 0.0003 0.8217 0.8537 0.8410
6.5176 235500 0.0007 - - - -
6.5314 236000 0.0007 - - - -
6.5453 236500 0.0007 - - - -
6.5591 237000 0.0007 - - - -
6.5729 237500 0.0007 - - - -
6.5868 238000 0.0007 - - - -
6.6006 238500 0.0007 - - - -
6.6145 239000 0.0007 - - - -
6.6283 239500 0.0007 - - - -
6.6421 240000 0.0007 0.0003 0.8239 0.8537 0.8434
6.6560 240500 0.0007 - - - -
6.6698 241000 0.0007 - - - -
6.6836 241500 0.0007 - - - -
6.6975 242000 0.0007 - - - -
6.7113 242500 0.0007 - - - -
6.7252 243000 0.0007 - - - -
6.7390 243500 0.0007 - - - -
6.7528 244000 0.0007 - - - -
6.7667 244500 0.0007 - - - -
6.7805 245000 0.0007 0.0003 0.8233 0.8534 0.8431
6.7943 245500 0.0007 - - - -
6.8082 246000 0.0007 - - - -
6.8220 246500 0.0007 - - - -
6.8359 247000 0.0007 - - - -
6.8497 247500 0.0007 - - - -
6.8635 248000 0.0007 - - - -
6.8774 248500 0.0007 - - - -
6.8912 249000 0.0007 - - - -
6.9050 249500 0.0007 - - - -
6.9189 250000 0.0007 0.0003 0.8239 0.8543 0.8432
6.9327 250500 0.0007 - - - -
6.9466 251000 0.0007 - - - -
6.9604 251500 0.0007 - - - -
6.9742 252000 0.0007 - - - -
6.9881 252500 0.0007 - - - -
7.0019 253000 0.0007 - - - -
7.0157 253500 0.0007 - - - -
7.0296 254000 0.0007 - - - -
7.0434 254500 0.0007 - - - -
7.0573 255000 0.0007 0.0003 0.8242 0.8541 0.8429
7.0711 255500 0.0007 - - - -
7.0849 256000 0.0007 - - - -
7.0988 256500 0.0007 - - - -
7.1126 257000 0.0007 - - - -
7.1264 257500 0.0007 - - - -
7.1403 258000 0.0007 - - - -
7.1541 258500 0.0007 - - - -
7.1680 259000 0.0007 - - - -
7.1818 259500 0.0007 - - - -
7.1956 260000 0.0007 0.0003 0.8236 0.8537 0.8418
7.2095 260500 0.0007 - - - -
7.2233 261000 0.0007 - - - -
7.2372 261500 0.0007 - - - -
7.2510 262000 0.0007 - - - -
7.2648 262500 0.0007 - - - -
7.2787 263000 0.0007 - - - -
7.2925 263500 0.0007 - - - -
7.3063 264000 0.0007 - - - -
7.3202 264500 0.0007 - - - -
7.3340 265000 0.0007 0.0003 0.8245 0.8536 0.8420
7.3479 265500 0.0007 - - - -
7.3617 266000 0.0007 - - - -
7.3755 266500 0.0007 - - - -
7.3894 267000 0.0007 - - - -
7.4032 267500 0.0007 - - - -
7.4170 268000 0.0007 - - - -
7.4309 268500 0.0007 - - - -
7.4447 269000 0.0007 - - - -
7.4586 269500 0.0007 - - - -
7.4724 270000 0.0007 0.0003 0.8253 0.8545 0.8424
7.4862 270500 0.0007 - - - -
7.5001 271000 0.0007 - - - -
7.5139 271500 0.0007 - - - -
7.5277 272000 0.0007 - - - -
7.5416 272500 0.0007 - - - -
7.5554 273000 0.0007 - - - -
7.5693 273500 0.0007 - - - -
7.5831 274000 0.0007 - - - -
7.5969 274500 0.0007 - - - -
7.6108 275000 0.0007 0.0003 0.8233 0.8534 0.8427
7.6246 275500 0.0007 - - - -
7.6384 276000 0.0007 - - - -
7.6523 276500 0.0007 - - - -
7.6661 277000 0.0007 - - - -
7.6800 277500 0.0007 - - - -
7.6938 278000 0.0007 - - - -
7.7076 278500 0.0007 - - - -
7.7215 279000 0.0007 - - - -
7.7353 279500 0.0007 - - - -
7.7491 280000 0.0007 0.0003 0.8242 0.8541 0.8428
7.7630 280500 0.0007 - - - -
7.7768 281000 0.0007 - - - -
7.7907 281500 0.0007 - - - -
7.8045 282000 0.0007 - - - -
7.8183 282500 0.0007 - - - -
7.8322 283000 0.0007 - - - -
7.8460 283500 0.0007 - - - -
7.8599 284000 0.0007 - - - -
7.8737 284500 0.0007 - - - -
7.8875 285000 0.0007 0.0003 0.8245 0.8542 0.8434
7.9014 285500 0.0007 - - - -
7.9152 286000 0.0007 - - - -
7.9290 286500 0.0007 - - - -
7.9429 287000 0.0007 - - - -
7.9567 287500 0.0007 - - - -
7.9706 288000 0.0007 - - - -
7.9844 288500 0.0007 - - - -
7.9982 289000 0.0007 - - - -

Framework Versions

  • Python: 3.10.16
  • Sentence Transformers: 3.3.1
  • Transformers: 4.51.3
  • PyTorch: 2.5.1+cu124
  • Accelerate: 1.2.1
  • Datasets: 3.2.0
  • Tokenizers: 0.21.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MSELoss

@inproceedings{reimers-2020-multilingual-sentence-bert,
    title = "Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2020",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2004.09813",
}