all-MiniLM-L6-v17-pair_score

This is a sentence-transformers model fine-tuned from sentence-transformers/all-MiniLM-L6-v2. It maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • License: apache-2.0

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
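
The three modules apply, in order: the BERT encoder, mean pooling over token embeddings (with padding masked out), and L2 normalization. For environments without the sentence-transformers library, here is a minimal sketch that reproduces the same embeddings with plain transformers; it assumes the weights are published under the Remonatef/all-MiniLM-L6-v17-pair_score id shown in the model tree.

import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "Remonatef/all-MiniLM-L6-v17-pair_score"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

def mean_pooling(token_embeddings, attention_mask):
    # Module (1): average the token embeddings, ignoring padding positions
    mask = attention_mask.unsqueeze(-1).float()
    return (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

sentences = ["deluxe mug for morning coffee", "polyfibre scarf"]
encoded = tokenizer(sentences, padding=True, truncation=True,
                    max_length=256, return_tensors="pt")

with torch.no_grad():
    token_embeddings = model(**encoded).last_hidden_state  # module (0): BertModel

embeddings = mean_pooling(token_embeddings, encoded["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)  # module (2): Normalize()
print(embeddings.shape)  # torch.Size([2, 384])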

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Remonatef/all-MiniLM-L6-v17-pair_score")
# Run inference
sentences = [
    'deluxe mug for morning coffee',
    'polyfibre scarf',
    'cheddar cheese burrito',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
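
Because the final Normalize() module produces unit-length vectors, cosine similarity reduces to a dot product, which makes the model well suited to semantic search. Here is a small sketch with a hypothetical query and corpus (the model id again follows the model tree):

import torch
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Remonatef/all-MiniLM-L6-v17-pair_score")

# Hypothetical corpus and query, for illustration only
corpus = [
    "deluxe mug for morning coffee",
    "polyfibre scarf",
    "cheddar cheese burrito",
]
query = "ceramic cup for hot drinks"

corpus_embeddings = model.encode(corpus)
query_embedding = model.encode([query])

# Cosine similarities between the query and every corpus entry
scores = model.similarity(query_embedding, corpus_embeddings)[0]  # shape [3]
top = torch.topk(scores, k=2)
for score, idx in zip(top.values, top.indices):
    print(f"{score:.4f}  {corpus[int(idx)]}")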

Training Details

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

The training loss falls from 11.46 at step 100 to roughly 7.55 by the end of the single epoch (71,600 steps).

Epoch Step Training Loss
0.0014 100 11.4553
0.0028 200 11.5147
0.0042 300 11.1304
0.0056 400 10.7925
0.0070 500 10.4493
0.0084 600 10.2631
0.0098 700 9.987
0.0112 800 9.8477
0.0126 900 9.6295
0.0140 1000 9.3638
0.0153 1100 9.1913
0.0167 1200 8.9688
0.0181 1300 8.808
0.0195 1400 8.6993
0.0209 1500 8.6078
0.0223 1600 8.5739
0.0237 1700 8.5575
0.0251 1800 8.5173
0.0265 1900 8.4983
0.0279 2000 8.4662
0.0293 2100 8.4408
0.0307 2200 8.4136
0.0321 2300 8.4002
0.0335 2400 8.3883
0.0349 2500 8.3785
0.0363 2600 8.3458
0.0377 2700 8.3617
0.0391 2800 8.3338
0.0405 2900 8.3281
0.0419 3000 8.3043
0.0433 3100 8.3087
0.0447 3200 8.2913
0.0460 3300 8.2854
0.0474 3400 8.2408
0.0488 3500 8.2628
0.0502 3600 8.2401
0.0516 3700 8.2538
0.0530 3800 8.2103
0.0544 3900 8.2221
0.0558 4000 8.2248
0.0572 4100 8.2045
0.0586 4200 8.2008
0.0600 4300 8.196
0.0614 4400 8.1757
0.0628 4500 8.1845
0.0642 4600 8.1714
0.0656 4700 8.1745
0.0670 4800 8.1702
0.0684 4900 8.1767
0.0698 5000 8.1379
0.0712 5100 8.1473
0.0726 5200 8.1443
0.0740 5300 8.1173
0.0754 5400 8.121
0.0767 5500 8.136
0.0781 5600 8.1246
0.0795 5700 8.0983
0.0809 5800 8.1023
0.0823 5900 8.1013
0.0837 6000 8.0657
0.0851 6100 8.0998
0.0865 6200 8.0585
0.0879 6300 8.1082
0.0893 6400 8.0652
0.0907 6500 8.0808
0.0921 6600 8.0756
0.0935 6700 8.0279
0.0949 6800 8.0659
0.0963 6900 8.0428
0.0977 7000 8.0363
0.0991 7100 8.0343
0.1005 7200 8.0488
0.1019 7300 8.0225
0.1033 7400 8.0203
0.1047 7500 8.0248
0.1061 7600 7.9882
0.1074 7700 7.9956
0.1088 7800 8.0338
0.1102 7900 7.9827
0.1116 8000 7.9849
0.1130 8100 8.0072
0.1144 8200 7.9708
0.1158 8300 7.9786
0.1172 8400 7.9983
0.1186 8500 7.9762
0.1200 8600 7.9955
0.1214 8700 7.9969
0.1228 8800 7.9913
0.1242 8900 7.9512
0.1256 9000 7.9672
0.1270 9100 7.9853
0.1284 9200 7.9626
0.1298 9300 7.9767
0.1312 9400 7.9404
0.1326 9500 7.9076
0.1340 9600 7.968
0.1354 9700 7.9432
0.1367 9800 7.9255
0.1381 9900 7.9095
0.1395 10000 7.9337
0.1409 10100 7.9464
0.1423 10200 7.9218
0.1437 10300 7.9102
0.1451 10400 7.9379
0.1465 10500 7.8907
0.1479 10600 7.8968
0.1493 10700 7.9193
0.1507 10800 7.9327
0.1521 10900 7.896
0.1535 11000 7.9228
0.1549 11100 7.9253
0.1563 11200 7.8825
0.1577 11300 7.8812
0.1591 11400 7.8883
0.1605 11500 7.8721
0.1619 11600 7.9218
0.1633 11700 7.8893
0.1647 11800 7.8961
0.1661 11900 7.8647
0.1674 12000 7.89
0.1688 12100 7.8422
0.1702 12200 7.9348
0.1716 12300 7.8808
0.1730 12400 7.8788
0.1744 12500 7.8794
0.1758 12600 7.848
0.1772 12700 7.8279
0.1786 12800 7.8655
0.1800 12900 7.8612
0.1814 13000 7.828
0.1828 13100 7.8419
0.1842 13200 7.8574
0.1856 13300 7.8688
0.1870 13400 7.8408
0.1884 13500 7.8172
0.1898 13600 7.8579
0.1912 13700 7.8392
0.1926 13800 7.849
0.1940 13900 7.8485
0.1954 14000 7.861
0.1968 14100 7.8257
0.1981 14200 7.8647
0.1995 14300 7.857
0.2009 14400 7.8031
0.2023 14500 7.8498
0.2037 14600 7.8175
0.2051 14700 7.8474
0.2065 14800 7.8158
0.2079 14900 7.7777
0.2093 15000 7.8362
0.2107 15100 7.8387
0.2121 15200 7.8225
0.2135 15300 7.8627
0.2149 15400 7.8543
0.2163 15500 7.8096
0.2177 15600 7.8201
0.2191 15700 7.8178
0.2205 15800 7.8138
0.2219 15900 7.8384
0.2233 16000 7.7811
0.2247 16100 7.82
0.2261 16200 7.7731
0.2275 16300 7.8508
0.2288 16400 7.8087
0.2302 16500 7.7959
0.2316 16600 7.7857
0.2330 16700 7.7946
0.2344 16800 7.7884
0.2358 16900 7.8226
0.2372 17000 7.7811
0.2386 17100 7.778
0.2400 17200 7.7825
0.2414 17300 7.782
0.2428 17400 7.8164
0.2442 17500 7.7514
0.2456 17600 7.7744
0.2470 17700 7.7974
0.2484 17800 7.7913
0.2498 17900 7.757
0.2512 18000 7.7724
0.2526 18100 7.7772
0.2540 18200 7.7723
0.2554 18300 7.753
0.2568 18400 7.8055
0.2581 18500 7.7878
0.2595 18600 7.7822
0.2609 18700 7.7923
0.2623 18800 7.8378
0.2637 18900 7.8226
0.2651 19000 7.8015
0.2665 19100 7.7355
0.2679 19200 7.789
0.2693 19300 7.7473
0.2707 19400 7.7521
0.2721 19500 7.7867
0.2735 19600 7.7597
0.2749 19700 7.7506
0.2763 19800 7.732
0.2777 19900 7.7288
0.2791 20000 7.7317
0.2805 20100 7.7495
0.2819 20200 7.7236
0.2833 20300 7.7489
0.2847 20400 7.7592
0.2861 20500 7.7455
0.2875 20600 7.7623
0.2888 20700 7.7774
0.2902 20800 7.7485
0.2916 20900 7.7043
0.2930 21000 7.8039
0.2944 21100 7.7383
0.2958 21200 7.759
0.2972 21300 7.7362
0.2986 21400 7.7788
0.3000 21500 7.7244
0.3014 21600 7.72
0.3028 21700 7.7453
0.3042 21800 7.729
0.3056 21900 7.7735
0.3070 22000 7.7185
0.3084 22100 7.7641
0.3098 22200 7.7293
0.3112 22300 7.7401
0.3126 22400 7.725
0.3140 22500 7.7315
0.3154 22600 7.716
0.3168 22700 7.7576
0.3182 22800 7.7088
0.3195 22900 7.7428
0.3209 23000 7.7266
0.3223 23100 7.7246
0.3237 23200 7.7084
0.3251 23300 7.7094
0.3265 23400 7.7081
0.3279 23500 7.7472
0.3293 23600 7.7581
0.3307 23700 7.7264
0.3321 23800 7.7262
0.3335 23900 7.7252
0.3349 24000 7.7219
0.3363 24100 7.706
0.3377 24200 7.7372
0.3391 24300 7.6965
0.3405 24400 7.6865
0.3419 24500 7.6798
0.3433 24600 7.6962
0.3447 24700 7.701
0.3461 24800 7.6722
0.3475 24900 7.7453
0.3489 25000 7.6463
0.3502 25100 7.7256
0.3516 25200 7.693
0.3530 25300 7.7306
0.3544 25400 7.7037
0.3558 25500 7.6733
0.3572 25600 7.7202
0.3586 25700 7.6866
0.3600 25800 7.715
0.3614 25900 7.6925
0.3628 26000 7.6961
0.3642 26100 7.6752
0.3656 26200 7.7377
0.3670 26300 7.6744
0.3684 26400 7.6698
0.3698 26500 7.6931
0.3712 26600 7.6789
0.3726 26700 7.6736
0.3740 26800 7.6918
0.3754 26900 7.7129
0.3768 27000 7.7179
0.3782 27100 7.6747
0.3795 27200 7.6809
0.3809 27300 7.6803
0.3823 27400 7.6777
0.3837 27500 7.6702
0.3851 27600 7.7005
0.3865 27700 7.6671
0.3879 27800 7.6873
0.3893 27900 7.6919
0.3907 28000 7.6987
0.3921 28100 7.6641
0.3935 28200 7.6449
0.3949 28300 7.6715
0.3963 28400 7.6672
0.3977 28500 7.6796
0.3991 28600 7.7085
0.4005 28700 7.6557
0.4019 28800 7.6592
0.4033 28900 7.6695
0.4047 29000 7.6734
0.4061 29100 7.6499
0.4075 29200 7.6472
0.4089 29300 7.6705
0.4102 29400 7.6856
0.4116 29500 7.6474
0.4130 29600 7.6581
0.4144 29700 7.6699
0.4158 29800 7.6693
0.4172 29900 7.6716
0.4186 30000 7.6594
0.4200 30100 7.6391
0.4214 30200 7.6758
0.4228 30300 7.652
0.4242 30400 7.6312
0.4256 30500 7.6538
0.4270 30600 7.6959
0.4284 30700 7.7324
0.4298 30800 7.6529
0.4312 30900 7.6528
0.4326 31000 7.7036
0.4340 31100 7.6794
0.4354 31200 7.6603
0.4368 31300 7.6372
0.4382 31400 7.6427
0.4396 31500 7.6852
0.4409 31600 7.6987
0.4423 31700 7.6385
0.4437 31800 7.701
0.4451 31900 7.6702
0.4465 32000 7.6551
0.4479 32100 7.6464
0.4493 32200 7.667
0.4507 32300 7.628
0.4521 32400 7.7012
0.4535 32500 7.6333
0.4549 32600 7.6707
0.4563 32700 7.6304
0.4577 32800 7.6719
0.4591 32900 7.6744
0.4605 33000 7.7102
0.4619 33100 7.6918
0.4633 33200 7.7018
0.4647 33300 7.6131
0.4661 33400 7.6476
0.4675 33500 7.6594
0.4689 33600 7.6301
0.4703 33700 7.6134
0.4716 33800 7.7383
0.4730 33900 7.6253
0.4744 34000 7.662
0.4758 34100 7.6341
0.4772 34200 7.6622
0.4786 34300 7.6429
0.4800 34400 7.6777
0.4814 34500 7.6089
0.4828 34600 7.6382
0.4842 34700 7.6324
0.4856 34800 7.6176
0.4870 34900 7.624
0.4884 35000 7.6163
0.4898 35100 7.6503
0.4912 35200 7.6609
0.4926 35300 7.6587
0.4940 35400 7.5999
0.4954 35500 7.586
0.4968 35600 7.6585
0.4982 35700 7.7349
0.4996 35800 7.642
0.5009 35900 7.646
0.5023 36000 7.5942
0.5037 36100 7.6477
0.5051 36200 7.6259
0.5065 36300 7.5926
0.5079 36400 7.6166
0.5093 36500 7.6323
0.5107 36600 7.6324
0.5121 36700 7.6411
0.5135 36800 7.6343
0.5149 36900 7.6313
0.5163 37000 7.6187
0.5177 37100 7.6545
0.5191 37200 7.6555
0.5205 37300 7.6984
0.5219 37400 7.6638
0.5233 37500 7.6093
0.5247 37600 7.5925
0.5261 37700 7.6281
0.5275 37800 7.6349
0.5289 37900 7.6152
0.5303 38000 7.6531
0.5316 38100 7.6078
0.5330 38200 7.6775
0.5344 38300 7.6268
0.5358 38400 7.641
0.5372 38500 7.6721
0.5386 38600 7.6069
0.5400 38700 7.6174
0.5414 38800 7.6407
0.5428 38900 7.6226
0.5442 39000 7.5843
0.5456 39100 7.6588
0.5470 39200 7.6405
0.5484 39300 7.5908
0.5498 39400 7.6203
0.5512 39500 7.608
0.5526 39600 7.6177
0.5540 39700 7.606
0.5554 39800 7.7102
0.5568 39900 7.6252
0.5582 40000 7.6235
0.5596 40100 7.6325
0.5610 40200 7.6146
0.5623 40300 7.6386
0.5637 40400 7.6189
0.5651 40500 7.638
0.5665 40600 7.5859
0.5679 40700 7.5737
0.5693 40800 7.6331
0.5707 40900 7.6265
0.5721 41000 7.6475
0.5735 41100 7.5966
0.5749 41200 7.6331
0.5763 41300 7.5655
0.5777 41400 7.6727
0.5791 41500 7.5972
0.5805 41600 7.5911
0.5819 41700 7.6734
0.5833 41800 7.6528
0.5847 41900 7.6063
0.5861 42000 7.6496
0.5875 42100 7.6225
0.5889 42200 7.6863
0.5903 42300 7.6145
0.5916 42400 7.6072
0.5930 42500 7.625
0.5944 42600 7.6087
0.5958 42700 7.6622
0.5972 42800 7.5619
0.5986 42900 7.6563
0.6000 43000 7.5958
0.6014 43100 7.6107
0.6028 43200 7.6208
0.6042 43300 7.5973
0.6056 43400 7.5928
0.6070 43500 7.637
0.6084 43600 7.5659
0.6098 43700 7.5921
0.6112 43800 7.5961
0.6126 43900 7.5614
0.6140 44000 7.6366
0.6154 44100 7.5947
0.6168 44200 7.5976
0.6182 44300 7.6406
0.6196 44400 7.585
0.6210 44500 7.5722
0.6223 44600 7.6193
0.6237 44700 7.6249
0.6251 44800 7.6208
0.6265 44900 7.6293
0.6279 45000 7.6023
0.6293 45100 7.5996
0.6307 45200 7.5553
0.6321 45300 7.5996
0.6335 45400 7.5994
0.6349 45500 7.6691
0.6363 45600 7.6051
0.6377 45700 7.6589
0.6391 45800 7.6217
0.6405 45900 7.6053
0.6419 46000 7.6082
0.6433 46100 7.5913
0.6447 46200 7.5742
0.6461 46300 7.597
0.6475 46400 7.5759
0.6489 46500 7.5964
0.6503 46600 7.6719
0.6517 46700 7.605
0.6530 46800 7.5705
0.6544 46900 7.6292
0.6558 47000 7.5978
0.6572 47100 7.5525
0.6586 47200 7.5838
0.6600 47300 7.5672
0.6614 47400 7.6041
0.6628 47500 7.6255
0.6642 47600 7.5415
0.6656 47700 7.61
0.6670 47800 7.573
0.6684 47900 7.6413
0.6698 48000 7.6277
0.6712 48100 7.5903
0.6726 48200 7.6542
0.6740 48300 7.5772
0.6754 48400 7.5991
0.6768 48500 7.5853
0.6782 48600 7.5909
0.6796 48700 7.5912
0.6810 48800 7.6052
0.6824 48900 7.632
0.6837 49000 7.5851
0.6851 49100 7.6688
0.6865 49200 7.6091
0.6879 49300 7.5745
0.6893 49400 7.5833
0.6907 49500 7.5777
0.6921 49600 7.5637
0.6935 49700 7.5622
0.6949 49800 7.5633
0.6963 49900 7.6023
0.6977 50000 7.6103
0.6991 50100 7.547
0.7005 50200 7.5907
0.7019 50300 7.5882
0.7033 50400 7.5875
0.7047 50500 7.5909
0.7061 50600 7.6021
0.7075 50700 7.549
0.7089 50800 7.6511
0.7103 50900 7.6606
0.7117 51000 7.5967
0.7130 51100 7.5722
0.7144 51200 7.6129
0.7158 51300 7.5736
0.7172 51400 7.5799
0.7186 51500 7.6209
0.7200 51600 7.595
0.7214 51700 7.5484
0.7228 51800 7.5999
0.7242 51900 7.5638
0.7256 52000 7.5654
0.7270 52100 7.6303
0.7284 52200 7.5485
0.7298 52300 7.676
0.7312 52400 7.6376
0.7326 52500 7.557
0.7340 52600 7.5631
0.7354 52700 7.6637
0.7368 52800 7.588
0.7382 52900 7.5771
0.7396 53000 7.5766
0.7410 53100 7.5731
0.7424 53200 7.508
0.7437 53300 7.6023
0.7451 53400 7.5796
0.7465 53500 7.5593
0.7479 53600 7.5516
0.7493 53700 7.5973
0.7507 53800 7.5868
0.7521 53900 7.623
0.7535 54000 7.5972
0.7549 54100 7.6304
0.7563 54200 7.5927
0.7577 54300 7.5351
0.7591 54400 7.5732
0.7605 54500 7.6676
0.7619 54600 7.6103
0.7633 54700 7.5572
0.7647 54800 7.574
0.7661 54900 7.555
0.7675 55000 7.6347
0.7689 55100 7.5827
0.7703 55200 7.678
0.7717 55300 7.5577
0.7731 55400 7.5606
0.7744 55500 7.5284
0.7758 55600 7.5561
0.7772 55700 7.6569
0.7786 55800 7.5604
0.7800 55900 7.6444
0.7814 56000 7.602
0.7828 56100 7.5532
0.7842 56200 7.5524
0.7856 56300 7.654
0.7870 56400 7.5799
0.7884 56500 7.5609
0.7898 56600 7.5625
0.7912 56700 7.571
0.7926 56800 7.5126
0.7940 56900 7.5644
0.7954 57000 7.5508
0.7968 57100 7.5183
0.7982 57200 7.5749
0.7996 57300 7.5339
0.8010 57400 7.5739
0.8024 57500 7.5492
0.8038 57600 7.5781
0.8051 57700 7.5753
0.8065 57800 7.5485
0.8079 57900 7.5608
0.8093 58000 7.5515
0.8107 58100 7.6011
0.8121 58200 7.6072
0.8135 58300 7.5615
0.8149 58400 7.5583
0.8163 58500 7.5423
0.8177 58600 7.5852
0.8191 58700 7.5612
0.8205 58800 7.5808
0.8219 58900 7.5888
0.8233 59000 7.6449
0.8247 59100 7.6599
0.8261 59200 7.573
0.8275 59300 7.5533
0.8289 59400 7.5423
0.8303 59500 7.5879
0.8317 59600 7.5699
0.8331 59700 7.5792
0.8344 59800 7.5552
0.8358 59900 7.5982
0.8372 60000 7.5984
0.8386 60100 7.5383
0.8400 60200 7.5518
0.8414 60300 7.5587
0.8428 60400 7.5152
0.8442 60500 7.5945
0.8456 60600 7.5674
0.8470 60700 7.5527
0.8484 60800 7.5941
0.8498 60900 7.5964
0.8512 61000 7.5625
0.8526 61100 7.5526
0.8540 61200 7.5592
0.8554 61300 7.5593
0.8568 61400 7.5392
0.8582 61500 7.641
0.8596 61600 7.6258
0.8610 61700 7.6588
0.8624 61800 7.5707
0.8638 61900 7.5171
0.8651 62000 7.6107
0.8665 62100 7.6272
0.8679 62200 7.5549
0.8693 62300 7.5535
0.8707 62400 7.6454
0.8721 62500 7.5498
0.8735 62600 7.5898
0.8749 62700 7.5461
0.8763 62800 7.5611
0.8777 62900 7.6068
0.8791 63000 7.6001
0.8805 63100 7.5407
0.8819 63200 7.5961
0.8833 63300 7.5839
0.8847 63400 7.5426
0.8861 63500 7.6011
0.8875 63600 7.5708
0.8889 63700 7.5964
0.8903 63800 7.5704
0.8917 63900 7.5372
0.8931 64000 7.5835
0.8945 64100 7.5483
0.8958 64200 7.544
0.8972 64300 7.5677
0.8986 64400 7.5636
0.9000 64500 7.5914
0.9014 64600 7.5789
0.9028 64700 7.5666
0.9042 64800 7.5866
0.9056 64900 7.6195
0.9070 65000 7.5388
0.9084 65100 7.5821
0.9098 65200 7.6767
0.9112 65300 7.6625
0.9126 65400 7.5812
0.9140 65500 7.5026
0.9154 65600 7.5524
0.9168 65700 7.5851
0.9182 65800 7.5762
0.9196 65900 7.5466
0.9210 66000 7.6039
0.9224 66100 7.6041
0.9238 66200 7.5805
0.9252 66300 7.6334
0.9265 66400 7.5348
0.9279 66500 7.6065
0.9293 66600 7.5003
0.9307 66700 7.5512
0.9321 66800 7.5404
0.9335 66900 7.6176
0.9349 67000 7.5634
0.9363 67100 7.5786
0.9377 67200 7.6327
0.9391 67300 7.5532
0.9405 67400 7.5362
0.9419 67500 7.5844
0.9433 67600 7.5632
0.9447 67700 7.553
0.9461 67800 7.5422
0.9475 67900 7.5483
0.9489 68000 7.5477
0.9503 68100 7.5423
0.9517 68200 7.5656
0.9531 68300 7.5573
0.9545 68400 7.525
0.9558 68500 7.55
0.9572 68600 7.5341
0.9586 68700 7.5318
0.9600 68800 7.5691
0.9614 68900 7.5793
0.9628 69000 7.5615
0.9642 69100 7.5348
0.9656 69200 7.5384
0.9670 69300 7.5392
0.9684 69400 7.5909
0.9698 69500 7.5587
0.9712 69600 7.5447
0.9726 69700 7.5731
0.9740 69800 7.5767
0.9754 69900 7.6208
0.9768 70000 7.5414
0.9782 70100 7.6061
0.9796 70200 7.6285
0.9810 70300 7.5533
0.9824 70400 7.5552
0.9838 70500 7.5479
0.9852 70600 7.571
0.9865 70700 7.6259
0.9879 70800 7.6366
0.9893 70900 7.5615
0.9907 71000 7.612
0.9921 71100 7.5309
0.9935 71200 7.5122
0.9949 71300 7.5692
0.9963 71400 7.6198
0.9977 71500 7.527
0.9991 71600 7.5496

Framework Versions

  • Python: 3.8.10
  • Sentence Transformers: 3.1.1
  • Transformers: 4.45.2
  • PyTorch: 2.4.1+cu118
  • Accelerate: 1.0.1
  • Datasets: 3.0.1
  • Tokenizers: 0.20.3

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}