SentenceTransformer based on sentence-transformers/LaBSE

This is a sentence-transformers model finetuned from sentence-transformers/LaBSE. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/LaBSE
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
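
The snippet below is a minimal sketch (the model id lingtrain/labse-chuvash-2 is taken from this card) that checks these properties on the loaded model:

import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("lingtrain/labse-chuvash-2")

print(model.max_seq_length)                      # 256; longer inputs are truncated
print(model.get_sentence_embedding_dimension())  # 768

# The pipeline ends in a Normalize() module (see the architecture below), so the
# embeddings are unit-length and cosine similarity reduces to a plain dot product.
emb = model.encode(["Темех мар."])
print(np.linalg.norm(emb, axis=1))               # ~[1.]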

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Dense({'in_features': 768, 'out_features': 768, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
  (3): Normalize()
)
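
The four modules apply in sequence: a BERT transformer, CLS-token pooling, a Dense layer with Tanh activation, and L2 normalization. As a quick sanity check (a sketch assuming the model id from this card), the loaded SentenceTransformer can be iterated like a sequential container:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("lingtrain/labse-chuvash-2")
for idx, module in enumerate(model):
    print(idx, type(module).__name__)
# 0 Transformer   (BertModel backbone)
# 1 Pooling       (pooling_mode_cls_token=True, i.e. CLS pooling rather than mean pooling)
# 2 Dense         (768 -> 768, Tanh)
# 3 Normalize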

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'Генри Джастис Форд',
    'Форд, Генри Джастис',
    'Я вышел из ванны свеж и бодр, как будто собирался на бал.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
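
The same embeddings support the other tasks mentioned above. The sketch below shows cross-lingual semantic search with util.semantic_search; the query and corpus sentences are illustrative and reuse examples from this card:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("lingtrain/labse-chuvash-2")

corpus = [
    "Религиозные деятели Уругвая",
    "Дело десятое.",
    "А пять лет тому назад я знал, что сад был чищен.",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

# Chuvash query; Chuvash and Russian sentences share the same embedding space.
query_embeddings = model.encode(["Уругвайӑн тĕн ĕҫченĕсем"], convert_to_tensor=True)

hits = util.semantic_search(query_embeddings, corpus_embeddings, top_k=2)
print(hits[0])  # e.g. [{'corpus_id': 0, 'score': ...}, {'corpus_id': ..., 'score': ...}]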

Training Details

Training Dataset

Unnamed Dataset

  • Size: 1,000,000 training samples
  • Columns: sentence_0, sentence_1, and label
  • Approximate statistics based on the first 1000 samples:
                | sentence_0         | sentence_1         | label
    type        | string             | string             | float
    details     | min: 3 tokens      | min: 4 tokens      | min: 1.0
                | mean: 21.82 tokens | mean: 21.16 tokens | mean: 1.0
                | max: 127 tokens    | max: 136 tokens    | max: 1.0
  • Samples:
    sentence_0 | sentence_1 | label
    Темех мар. | Дело десятое. | 1.0
    Уругвайӑн тĕн ĕҫченĕсем | Религиозные деятели Уругвая | 1.0
    Эп аванах ас тӑватӑп, пилӗк ҫул каялла пахчана эпир лайӑх тасатнӑччӗ. | А пять лет тому назад я знал, что сад был чищен. | 1.0
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
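
The sketch below shows how a loss with these parameters would be constructed; it is not the authors' exact training script. With MultipleNegativesRankingLoss, each (sentence_0, sentence_1) pair is a positive and the remaining in-batch sentence_1 entries serve as negatives, which is why every label in the statistics above equals 1.0.

from sentence_transformers import SentenceTransformer, losses, util

model = SentenceTransformer("sentence-transformers/LaBSE")  # the base model listed above
loss = losses.MultipleNegativesRankingLoss(
    model,
    scale=20.0,                   # "scale" above
    similarity_fct=util.cos_sim,  # "cos_sim" above
)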
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 12
  • per_device_eval_batch_size: 12
  • num_train_epochs: 1
  • fp16: True
  • multi_dataset_batch_sampler: round_robin
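
A hedged end-to-end sketch of a run that sets only these non-default hyperparameters; output_dir and the two-row dataset are placeholders, while the real training set is the 1,000,000-pair dataset described above:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
    util,
)

model = SentenceTransformer("sentence-transformers/LaBSE")
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)

# Placeholder pair dataset (two rows taken from the samples above).
train_dataset = Dataset.from_dict({
    "sentence_0": ["Темех мар.", "Уругвайӑн тĕн ĕҫченĕсем"],
    "sentence_1": ["Дело десятое.", "Религиозные деятели Уругвая"],
})

args = SentenceTransformerTrainingArguments(
    output_dir="output",                        # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=12,
    per_device_eval_batch_size=12,
    num_train_epochs=1,
    fp16=True,
    multi_dataset_batch_sampler="round_robin",
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=train_dataset,  # placeholder; the card does not describe the evaluation split
    loss=loss,
)
trainer.train()
model.save("output/final")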

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 12
  • per_device_eval_batch_size: 12
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch Step Training Loss
0.0012 100 -
0.0024 200 -
0.0036 300 -
0.0048 400 -
0.0060 500 0.5331
0.0072 600 -
0.0084 700 -
0.0096 800 -
0.0108 900 -
0.0120 1000 0.3694
0.0132 1100 -
0.0144 1200 -
0.0156 1300 -
0.0168 1400 -
0.0180 1500 0.3141
0.0192 1600 -
0.0204 1700 -
0.0216 1800 -
0.0228 1900 -
0.0240 2000 0.2836
0.0252 2100 -
0.0264 2200 -
0.0276 2300 -
0.0288 2400 -
0.0300 2500 0.2823
0.0312 2600 -
0.0324 2700 -
0.0336 2800 -
0.0348 2900 -
0.0360 3000 0.265
0.0372 3100 -
0.0384 3200 -
0.0396 3300 -
0.0408 3400 -
0.0420 3500 0.2599
0.0432 3600 -
0.0444 3700 -
0.0456 3800 -
0.0468 3900 -
0.0480 4000 0.234
0.0492 4100 -
0.0504 4200 -
0.0516 4300 -
0.0528 4400 -
0.0540 4500 0.1966
0.0552 4600 -
0.0564 4700 -
0.0576 4800 -
0.0588 4900 -
0.0600 5000 0.2204
0.0612 5100 -
0.0624 5200 -
0.0636 5300 -
0.0648 5400 -
0.0660 5500 0.2272
0.0672 5600 -
0.0684 5700 -
0.0696 5800 -
0.0708 5900 -
0.0720 6000 0.2256
0.0732 6100 -
0.0744 6200 -
0.0756 6300 -
0.0768 6400 -
0.0780 6500 0.2071
0.0792 6600 -
0.0804 6700 -
0.0816 6800 -
0.0828 6900 -
0.0840 7000 0.2113
0.0852 7100 -
0.0864 7200 -
0.0876 7300 -
0.0888 7400 -
0.0900 7500 0.2222
0.0912 7600 -
0.0924 7700 -
0.0936 7800 -
0.0948 7900 -
0.0960 8000 0.2186
0.0972 8100 -
0.0984 8200 -
0.0996 8300 -
0.1008 8400 -
0.1020 8500 0.2137
0.1032 8600 -
0.1044 8700 -
0.1056 8800 -
0.1068 8900 -
0.1080 9000 0.1928
0.1092 9100 -
0.1104 9200 -
0.1116 9300 -
0.1128 9400 -
0.1140 9500 0.2117
0.1152 9600 -
0.1164 9700 -
0.1176 9800 -
0.1188 9900 -
0.1200 10000 0.1987
0.1212 10100 -
0.1224 10200 -
0.1236 10300 -
0.1248 10400 -
0.1260 10500 0.2011
0.1272 10600 -
0.1284 10700 -
0.1296 10800 -
0.1308 10900 -
0.1320 11000 0.1775
0.1332 11100 -
0.1344 11200 -
0.1356 11300 -
0.1368 11400 -
0.1380 11500 0.2048
0.1392 11600 -
0.1404 11700 -
0.1416 11800 -
0.1428 11900 -
0.1440 12000 0.2064
0.1452 12100 -
0.1464 12200 -
0.1476 12300 -
0.1488 12400 -
0.1500 12500 0.1883
0.1512 12600 -
0.1524 12700 -
0.1536 12800 -
0.1548 12900 -
0.1560 13000 0.2084
0.1572 13100 -
0.1584 13200 -
0.1596 13300 -
0.1608 13400 -
0.1620 13500 0.2077
0.1632 13600 -
0.1644 13700 -
0.1656 13800 -
0.1668 13900 -
0.1680 14000 0.1866
0.1692 14100 -
0.1704 14200 -
0.1716 14300 -
0.1728 14400 -
0.1740 14500 0.1859
0.1752 14600 -
0.1764 14700 -
0.1776 14800 -
0.1788 14900 -
0.1800 15000 0.1735
0.1812 15100 -
0.1824 15200 -
0.1836 15300 -
0.1848 15400 -
0.1860 15500 0.171
0.1872 15600 -
0.1884 15700 -
0.1896 15800 -
0.1908 15900 -
0.1920 16000 0.1465
0.1932 16100 -
0.1944 16200 -
0.1956 16300 -
0.1968 16400 -
0.1980 16500 0.1921
0.1992 16600 -
0.2004 16700 -
0.2016 16800 -
0.2028 16900 -
0.2040 17000 0.1669
0.2052 17100 -
0.2064 17200 -
0.2076 17300 -
0.2088 17400 -
0.2100 17500 0.1656
0.2112 17600 -
0.2124 17700 -
0.2136 17800 -
0.2148 17900 -
0.2160 18000 0.1952
0.2172 18100 -
0.2184 18200 -
0.2196 18300 -
0.2208 18400 -
0.2220 18500 0.1658
0.2232 18600 -
0.2244 18700 -
0.2256 18800 -
0.2268 18900 -
0.2280 19000 0.1774
0.2292 19100 -
0.2304 19200 -
0.2316 19300 -
0.2328 19400 -
0.2340 19500 0.1802
0.2352 19600 -
0.2364 19700 -
0.2376 19800 -
0.2388 19900 -
0.2400 20000 0.1724
0.2412 20100 -
0.2424 20200 -
0.2436 20300 -
0.2448 20400 -
0.2460 20500 0.1653
0.2472 20600 -
0.2484 20700 -
0.2496 20800 -
0.2508 20900 -
0.2520 21000 0.1484
0.2532 21100 -
0.2544 21200 -
0.2556 21300 -
0.2568 21400 -
0.2580 21500 0.1544
0.2592 21600 -
0.2604 21700 -
0.2616 21800 -
0.2628 21900 -
0.2640 22000 0.174
0.2652 22100 -
0.2664 22200 -
0.2676 22300 -
0.2688 22400 -
0.2700 22500 0.1488
0.2712 22600 -
0.2724 22700 -
0.2736 22800 -
0.2748 22900 -
0.2760 23000 0.1696
0.2772 23100 -
0.2784 23200 -
0.2796 23300 -
0.2808 23400 -
0.2820 23500 0.1468
0.2832 23600 -
0.2844 23700 -
0.2856 23800 -
0.2868 23900 -
0.2880 24000 0.1738
0.2892 24100 -
0.2904 24200 -
0.2916 24300 -
0.2928 24400 -
0.2940 24500 0.1667
0.2952 24600 -
0.2964 24700 -
0.2976 24800 -
0.2988 24900 -
0.3000 25000 0.1562
0.3012 25100 -
0.3024 25200 -
0.3036 25300 -
0.3048 25400 -
0.3060 25500 0.1628
0.3072 25600 -
0.3084 25700 -
0.3096 25800 -
0.3108 25900 -
0.3120 26000 0.1392
0.3132 26100 -
0.3144 26200 -
0.3156 26300 -
0.3168 26400 -
0.3180 26500 0.1507
0.3192 26600 -
0.3204 26700 -
0.3216 26800 -
0.3228 26900 -
0.3240 27000 0.1646
0.3252 27100 -
0.3264 27200 -
0.3276 27300 -
0.3288 27400 -
0.3300 27500 0.1433
0.3312 27600 -
0.3324 27700 -
0.3336 27800 -
0.3348 27900 -
0.3360 28000 0.1689
0.3372 28100 -
0.3384 28200 -
0.3396 28300 -
0.3408 28400 -
0.3420 28500 0.1432
0.3432 28600 -
0.3444 28700 -
0.3456 28800 -
0.3468 28900 -
0.3480 29000 0.1534
0.3492 29100 -
0.3504 29200 -
0.3516 29300 -
0.3528 29400 -
0.3540 29500 0.1487
0.3552 29600 -
0.3564 29700 -
0.3576 29800 -
0.3588 29900 -
0.3600 30000 0.1439
0.3612 30100 -
0.3624 30200 -
0.3636 30300 -
0.3648 30400 -
0.3660 30500 0.1397
0.3672 30600 -
0.3684 30700 -
0.3696 30800 -
0.3708 30900 -
0.3720 31000 0.1542
0.3732 31100 -
0.3744 31200 -
0.3756 31300 -
0.3768 31400 -
0.3780 31500 0.1448
0.3792 31600 -
0.3804 31700 -
0.3816 31800 -
0.3828 31900 -
0.3840 32000 0.1608
0.3852 32100 -
0.3864 32200 -
0.3876 32300 -
0.3888 32400 -
0.3900 32500 0.1486
0.3912 32600 -
0.3924 32700 -
0.3936 32800 -
0.3948 32900 -
0.3960 33000 0.1274
0.3972 33100 -
0.3984 33200 -
0.3996 33300 -
0.4008 33400 -
0.4020 33500 0.1451
0.4032 33600 -
0.4044 33700 -
0.4056 33800 -
0.4068 33900 -
0.4080 34000 0.1316
0.4092 34100 -
0.4104 34200 -
0.4116 34300 -
0.4128 34400 -
0.4140 34500 0.1306
0.4152 34600 -
0.4164 34700 -
0.4176 34800 -
0.4188 34900 -
0.4200 35000 0.1382
0.4212 35100 -
0.4224 35200 -
0.4236 35300 -
0.4248 35400 -
0.4260 35500 0.1322
0.4272 35600 -
0.4284 35700 -
0.4296 35800 -
0.4308 35900 -
0.4320 36000 0.1617
0.4332 36100 -
0.4344 36200 -
0.4356 36300 -
0.4368 36400 -
0.4380 36500 0.14
0.4392 36600 -
0.4404 36700 -
0.4416 36800 -
0.4428 36900 -
0.4440 37000 0.1321
0.4452 37100 -
0.4464 37200 -
0.4476 37300 -
0.4488 37400 -
0.4500 37500 0.1464
0.4512 37600 -
0.4524 37700 -
0.4536 37800 -
0.4548 37900 -
0.4560 38000 0.1236
0.4572 38100 -
0.4584 38200 -
0.4596 38300 -
0.4608 38400 -
0.4620 38500 0.147
0.4632 38600 -
0.4644 38700 -
0.4656 38800 -
0.4668 38900 -
0.4680 39000 0.1376
0.4692 39100 -
0.4704 39200 -
0.4716 39300 -
0.4728 39400 -
0.4740 39500 0.1342
0.4752 39600 -
0.4764 39700 -
0.4776 39800 -
0.4788 39900 -
0.4800 40000 0.123
0.4812 40100 -
0.4824 40200 -
0.4836 40300 -
0.4848 40400 -
0.4860 40500 0.1312
0.4872 40600 -
0.4884 40700 -
0.4896 40800 -
0.4908 40900 -
0.4920 41000 0.1325
0.4932 41100 -
0.4944 41200 -
0.4956 41300 -
0.4968 41400 -
0.4980 41500 0.1203
0.4992 41600 -
0.5004 41700 -
0.5016 41800 -
0.5028 41900 -
0.5040 42000 0.1258
0.5052 42100 -
0.5064 42200 -
0.5076 42300 -
0.5088 42400 -
0.5100 42500 0.141
0.5112 42600 -
0.5124 42700 -
0.5136 42800 -
0.5148 42900 -
0.5160 43000 0.1473
0.5172 43100 -
0.5184 43200 -
0.5196 43300 -
0.5208 43400 -
0.5220 43500 0.1247
0.5232 43600 -
0.5244 43700 -
0.5256 43800 -
0.5268 43900 -
0.5280 44000 0.1259
0.5292 44100 -
0.5304 44200 -
0.5316 44300 -
0.5328 44400 -
0.5340 44500 0.1372
0.5352 44600 -
0.5364 44700 -
0.5376 44800 -
0.5388 44900 -
0.5400 45000 0.1413
0.5412 45100 -
0.5424 45200 -
0.5436 45300 -
0.5448 45400 -
0.5460 45500 0.1157
0.5472 45600 -
0.5484 45700 -
0.5496 45800 -
0.5508 45900 -
0.5520 46000 0.127
0.5532 46100 -
0.5544 46200 -
0.5556 46300 -
0.5568 46400 -
0.5580 46500 0.1202
0.5592 46600 -
0.5604 46700 -
0.5616 46800 -
0.5628 46900 -
0.5640 47000 0.1199
0.5652 47100 -
0.5664 47200 -
0.5676 47300 -
0.5688 47400 -
0.5700 47500 0.1309
0.5712 47600 -
0.5724 47700 -
0.5736 47800 -
0.5748 47900 -
0.5760 48000 0.1276
0.5772 48100 -
0.5784 48200 -
0.5796 48300 -
0.5808 48400 -
0.5820 48500 0.1278
0.5832 48600 -
0.5844 48700 -
0.5856 48800 -
0.5868 48900 -
0.5880 49000 0.1175
0.5892 49100 -
0.5904 49200 -
0.5916 49300 -
0.5928 49400 -
0.5940 49500 0.1327
0.5952 49600 -
0.5964 49700 -
0.5976 49800 -
0.5988 49900 -
0.6000 50000 0.1109
0.6012 50100 -
0.6024 50200 -
0.6036 50300 -
0.6048 50400 -
0.6060 50500 0.1248
0.6072 50600 -
0.6084 50700 -
0.6096 50800 -
0.6108 50900 -
0.6120 51000 0.1296
0.6132 51100 -
0.6144 51200 -
0.6156 51300 -
0.6168 51400 -
0.6180 51500 0.1323
0.6192 51600 -
0.6204 51700 -
0.6216 51800 -
0.6228 51900 -
0.6240 52000 0.1155
0.6252 52100 -
0.6264 52200 -
0.6276 52300 -
0.6288 52400 -
0.6300 52500 0.1245
0.6312 52600 -
0.6324 52700 -
0.6336 52800 -
0.6348 52900 -
0.6360 53000 0.1238
0.6372 53100 -
0.6384 53200 -
0.6396 53300 -
0.6408 53400 -
0.6420 53500 0.12
0.6432 53600 -
0.6444 53700 -
0.6456 53800 -
0.6468 53900 -
0.6480 54000 0.1116
0.6492 54100 -
0.6504 54200 -
0.6516 54300 -
0.6528 54400 -
0.6540 54500 0.1305
0.6552 54600 -
0.6564 54700 -
0.6576 54800 -
0.6588 54900 -
0.6600 55000 0.1355
0.6612 55100 -
0.6624 55200 -
0.6636 55300 -
0.6648 55400 -
0.6660 55500 0.1139
0.6672 55600 -
0.6684 55700 -
0.6696 55800 -
0.6708 55900 -
0.6720 56000 0.1251
0.6732 56100 -
0.6744 56200 -
0.6756 56300 -
0.6768 56400 -
0.6780 56500 0.1211
0.6792 56600 -
0.6804 56700 -
0.6816 56800 -
0.6828 56900 -
0.6840 57000 0.1123
0.6852 57100 -
0.6864 57200 -
0.6876 57300 -
0.6888 57400 -
0.6900 57500 0.1071
0.6912 57600 -
0.6924 57700 -
0.6936 57800 -
0.6948 57900 -
0.6960 58000 0.112
0.6972 58100 -
0.6984 58200 -
0.6996 58300 -
0.7008 58400 -
0.7020 58500 0.1038
0.7032 58600 -
0.7044 58700 -
0.7056 58800 -
0.7068 58900 -
0.7080 59000 0.1238
0.7092 59100 -
0.7104 59200 -
0.7116 59300 -
0.7128 59400 -
0.7140 59500 0.1001
0.7152 59600 -
0.7164 59700 -
0.7176 59800 -
0.7188 59900 -
0.7200 60000 0.0948
0.7212 60100 -
0.7224 60200 -
0.7236 60300 -
0.7248 60400 -
0.7260 60500 0.1271
0.7272 60600 -
0.7284 60700 -
0.7296 60800 -
0.7308 60900 -
0.7320 61000 0.1117
0.7332 61100 -
0.7344 61200 -
0.7356 61300 -
0.7368 61400 -
0.7380 61500 0.1122
0.7392 61600 -
0.7404 61700 -
0.7416 61800 -
0.7428 61900 -
0.7440 62000 0.0972
0.7452 62100 -
0.7464 62200 -
0.7476 62300 -
0.7488 62400 -
0.7500 62500 0.1135
0.7512 62600 -
0.7524 62700 -
0.7536 62800 -
0.7548 62900 -
0.7560 63000 0.1092
0.7572 63100 -
0.7584 63200 -
0.7596 63300 -
0.7608 63400 -
0.7620 63500 0.1155
0.7632 63600 -
0.7644 63700 -
0.7656 63800 -
0.7668 63900 -
0.7680 64000 0.1065
0.7692 64100 -
0.7704 64200 -
0.7716 64300 -
0.7728 64400 -
0.7740 64500 0.1211
0.7752 64600 -
0.7764 64700 -
0.7776 64800 -
0.7788 64900 -
0.7800 65000 0.116
0.7812 65100 -
0.7824 65200 -
0.7836 65300 -
0.7848 65400 -
0.7860 65500 0.1138
0.7872 65600 -
0.7884 65700 -
0.7896 65800 -
0.7908 65900 -
0.7920 66000 0.1155
0.7932 66100 -
0.7944 66200 -
0.7956 66300 -
0.7968 66400 -
0.7980 66500 0.1059
0.7992 66600 -
0.8004 66700 -
0.8016 66800 -
0.8028 66900 -
0.8040 67000 0.1189
0.8052 67100 -
0.8064 67200 -
0.8076 67300 -
0.8088 67400 -
0.8100 67500 0.1089
0.8112 67600 -
0.8124 67700 -
0.8136 67800 -
0.8148 67900 -
0.8160 68000 0.1016
0.8172 68100 -
0.8184 68200 -
0.8196 68300 -
0.8208 68400 -
0.8220 68500 0.121
0.8232 68600 -
0.8244 68700 -
0.8256 68800 -
0.8268 68900 -
0.8280 69000 0.1185
0.8292 69100 -
0.8304 69200 -
0.8316 69300 -
0.8328 69400 -
0.8340 69500 0.1026
0.8352 69600 -
0.8364 69700 -
0.8376 69800 -
0.8388 69900 -
0.8400 70000 0.1209
0.8412 70100 -
0.8424 70200 -
0.8436 70300 -
0.8448 70400 -
0.8460 70500 0.1103
0.8472 70600 -
0.8484 70700 -
0.8496 70800 -
0.8508 70900 -
0.8520 71000 0.1098
0.8532 71100 -
0.8544 71200 -
0.8556 71300 -
0.8568 71400 -
0.8580 71500 0.1055
0.8592 71600 -
0.8604 71700 -
0.8616 71800 -
0.8628 71900 -
0.8640 72000 0.1045
0.8652 72100 -
0.8664 72200 -
0.8676 72300 -
0.8688 72400 -
0.8700 72500 0.1126
0.8712 72600 -
0.8724 72700 -
0.8736 72800 -
0.8748 72900 -
0.8760 73000 0.1058
0.8772 73100 -
0.8784 73200 -
0.8796 73300 -
0.8808 73400 -
0.8820 73500 0.1138
0.8832 73600 -
0.8844 73700 -
0.8856 73800 -
0.8868 73900 -
0.8880 74000 0.1071
0.8892 74100 -
0.8904 74200 -
0.8916 74300 -
0.8928 74400 -
0.8940 74500 0.1091
0.8952 74600 -
0.8964 74700 -
0.8976 74800 -
0.8988 74900 -
0.9000 75000 0.1143
0.9012 75100 -
0.9024 75200 -

Framework Versions

  • Python: 3.12.10
  • Sentence Transformers: 4.1.0
  • Transformers: 4.51.3
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.8.1
  • Datasets: 3.6.0
  • Tokenizers: 0.21.1
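
To set up a comparable environment, the library versions above can be pinned (a sketch; install a PyTorch 2.6.0 build matching your CUDA or CPU setup separately):

pip install "sentence-transformers==4.1.0" "transformers==4.51.3" "accelerate==1.8.1" "datasets==3.6.0" "tokenizers==0.21.1"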

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}