---
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:1000000
  - loss:DenoisingAutoEncoderLoss
base_model: google-bert/bert-base-uncased
widget:
  - source_sentence: >-
      He wound up homeless in the Mission District, playing for change in the
      streets.
    sentences:
      - He wound up homeless, playing in streets
      - It line-up of professional footballers,, firefighters and survivors.
      - >-
        A (Dakota) belonging to the Dutch Air crash-landed near Beswick (Beswick
        Creek now Barunga?
  - source_sentence: >-
      The division remained near Arkhangelsk until the beginning of August, when
      it was shipped across the White Sea to Murmansk.
    sentences:
      - >-
        The division remained near Arkhangelsk until the beginning of August,
        when it was shipped across White Sea to Murmansk.
      - The building is and.
      - Maxim Triesman born October) is politician banker trade union leader.
  - source_sentence: >-
      "Leper," the last song on the album, was left as an instrumental as
      Jourgensen had left the studio earlier than scheduled and did not care to
      write any lyrics.
    sentences:
      - >-
        There produced the viral host cells processes, more suitable environment
        for viral replication transcription.
      - As a the to
      - >-
        Leper, the song on the album was left as an as Jourgensen had left the
        studio scheduled and did care to any lyrics
  - source_sentence: >-
      Prince and princess have given Gerda her their golden coach so she can
      continue her search for Kay.
    sentences:
      - >-
        and princess given Gerda their golden coach so she can her search for
        Kay.
      - handled the cinematography
      - >-
        University Hoekstra was Professor of and Department of Multidisciplinary
        Water.
  - source_sentence: >-
      While the early models stayed close to their original form, eight
      subsequent generations varied substantially in size and styling.
    sentences:
      - >-
        While the stayed close their, eight generations varied substantially in
        size and
      - >-
        Their influence, his's own tradition, his special organization all
        combined to divert the young into a political career
      -  U  cross of the river are a recent
datasets:
  - princeton-nlp/datasets-for-simcse
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
  - pearson_cosine
  - spearman_cosine
co2_eq_emissions:
  emissions: 556.5173349579181
  energy_consumed: 1.4317326253991955
  source: codecarbon
  training_type: fine-tuning
  on_cloud: false
  cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
  ram_total_size: 31.777088165283203
  hours_used: 4.403
  hardware_used: 1 x NVIDIA GeForce RTX 3090
model-index:
  - name: SentenceTransformer based on google-bert/bert-base-uncased
    results:
      - task:
          type: semantic-similarity
          name: Semantic Similarity
        dataset:
          name: sts dev
          type: sts-dev
        metrics:
          - type: pearson_cosine
            value: 0.6732163313155011
            name: Pearson Cosine
          - type: spearman_cosine
            value: 0.6765812652563955
            name: Spearman Cosine
      - task:
          type: semantic-similarity
          name: Semantic Similarity
        dataset:
          name: sts test
          type: sts-test
        metrics:
          - type: pearson_cosine
            value: 0.6424591318281525
            name: Pearson Cosine
          - type: spearman_cosine
            value: 0.6322331484751982
            name: Spearman Cosine
---

SentenceTransformer based on google-bert/bert-base-uncased

This is a sentence-transformers model finetuned from google-bert/bert-base-uncased on the datasets-for-simcse dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: google-bert/bert-base-uncased
  • Maximum Sequence Length: 75 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset: princeton-nlp/datasets-for-simcse

Model Sources

  • Documentation: Sentence Transformers Documentation (https://www.sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 75, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
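
The same two-module stack can be rebuilt by hand with the models submodule. A minimal sketch mirroring the configuration printed above (CLS pooling, 75-token sequence limit):

from sentence_transformers import SentenceTransformer, models

# Transformer module wrapping BertModel, truncating inputs at 75 tokens
word_embedding = models.Transformer("google-bert/bert-base-uncased", max_seq_length=75)
# CLS-token pooling over the 768-dimensional token embeddings
pooling = models.Pooling(word_embedding.get_word_embedding_dimension(), pooling_mode="cls")
model = SentenceTransformer(modules=[word_embedding, pooling])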

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("tomaarsen/bert-base-uncased-stsb-tsdae")
# Run inference
sentences = [
    'While the early models stayed close to their original form, eight subsequent generations varied substantially in size and styling.',
    'While the stayed close their, eight generations varied substantially in size and',
    '“ U ” cross of the river are a recent',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]

Evaluation

Metrics

Semantic Similarity

Metric            sts-dev   sts-test
pearson_cosine    0.6732    0.6425
spearman_cosine   0.6766    0.6322
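
These scores can be recomputed with EmbeddingSimilarityEvaluator. A minimal sketch, assuming the sentence-transformers/stsb dataset on the Hub provides the STS Benchmark splits with sentence1, sentence2, and score columns:

from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SimilarityFunction
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("tomaarsen/bert-base-uncased-stsb-tsdae")
stsb = load_dataset("sentence-transformers/stsb", split="validation")  # use "test" for sts-test

evaluator = EmbeddingSimilarityEvaluator(
    sentences1=stsb["sentence1"],
    sentences2=stsb["sentence2"],
    scores=stsb["score"],
    main_similarity=SimilarityFunction.COSINE,
    name="sts-dev",
)
print(evaluator(model))  # e.g. {'sts-dev_pearson_cosine': ..., 'sts-dev_spearman_cosine': ...}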

Training Details

Training Dataset

datasets-for-simcse

  • Dataset: datasets-for-simcse at e145e8b
  • Size: 1,000,000 training samples
  • Columns: text and noisy
  • Approximate statistics based on the first 1000 samples:
      text:  string, min: 3 tokens, mean: 27.96 tokens, max: 75 tokens
      noisy: string, min: 3 tokens, mean: 17.68 tokens, max: 75 tokens
  • Samples:
      text:  White was born in Iver, England.
      noisy: White was born in Iver,

      text:  The common mangrove plants are "Rhizophora mucronata", "Sonneratia caseolaris", "Avicennia" spp., and "Aegiceras corniculatum".
      noisy: plants are Rhizophora mucronata" "Sonneratia, spp.,".

      text:  H3K9ac and H3K14ac have been shown to be part of the active promoter state.
      noisy: H3K9ac been part of active promoter state.
  • Loss: DenoisingAutoEncoderLoss
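
The noisy column is produced by deleting roughly 60% of the words in text, the input corruption TSDAE trains on. A minimal sketch of that deletion noise, modeled on DenoisingAutoEncoderDataset.delete from Sentence Transformers (the nltk tokenization and detokenization details here are simplifications; del_ratio=0.6 is the library default):

import nltk
import numpy as np

nltk.download("punkt", quiet=True)

def delete_words(text: str, del_ratio: float = 0.6) -> str:
    """Drop each word with probability del_ratio, keeping at least one word."""
    words = nltk.word_tokenize(text)
    if not words:
        return text
    keep = np.random.rand(len(words)) > del_ratio
    if not keep.any():
        keep[np.random.randint(len(words))] = True  # never delete every word
    return " ".join(word for word, kept in zip(words, keep) if kept)

print(delete_words("White was born in Iver, England."))
# e.g. 'White born Iver ,'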

Evaluation Dataset

datasets-for-simcse

  • Dataset: datasets-for-simcse at e145e8b
  • Size: 1,000,000 evaluation samples
  • Columns: text and noisy
  • Approximate statistics based on the first 1000 samples:
      text:  string, min: 3 tokens, mean: 28.12 tokens, max: 75 tokens
      noisy: string, min: 3 tokens, mean: 17.79 tokens, max: 66 tokens
  • Samples:
      text:  Philippe Hervé (born 16 April 1959) is a French water polo player.
      noisy: Philippe Hervé born April 1959 is French

      text:  lies at the very edge of Scottish offshore waters, close to the maritime boundary with Norway.
      noisy: the edge Scottish offshore waters close to maritime boundary with Norway

      text:  The place is an exceptional example of the forced migration of convicts (Vinegar Hill rebels) and the development associated with punishment and reform, particularly convict labour and the associated coal mines.
      noisy: The is an example of forced migration of convicts (Vinegar rebels and the development punishment and reform, particularly convict and the associated coal.
  • Loss: DenoisingAutoEncoderLoss

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • learning_rate: 3e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True
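
Put together, a run with these hyperparameters looks roughly like the sketch below. This is an approximation, not the exact training script: the column order for DenoisingAutoEncoderLoss (damaged sentence first), the single text column in princeton-nlp/datasets-for-simcse, and the evaluation split size are assumptions, and delete_words is the noise function sketched in the dataset section above.

from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    models,
)
from sentence_transformers.losses import DenoisingAutoEncoderLoss

# CLS-pooled BERT encoder, as in the architecture section above
word_embedding = models.Transformer("google-bert/bert-base-uncased", max_seq_length=75)
pooling = models.Pooling(word_embedding.get_word_embedding_dimension(), pooling_mode="cls")
model = SentenceTransformer(modules=[word_embedding, pooling])

# DenoisingAutoEncoderLoss consumes (damaged_sentence, original_sentence) pairs,
# so the noisy column is placed first
dataset = load_dataset("princeton-nlp/datasets-for-simcse", split="train")
dataset = dataset.map(lambda row: {"noisy": delete_words(row["text"])})
dataset = dataset.select_columns(["noisy", "text"])
split = dataset.train_test_split(test_size=10_000, seed=42)  # eval split size is a guess

# Tie the decoder to the encoder weights, the standard TSDAE setup
loss = DenoisingAutoEncoderLoss(model, tie_encoder_decoder=True)

args = SentenceTransformerTrainingArguments(
    output_dir="output/bert-base-uncased-tsdae",
    num_train_epochs=1,
    learning_rate=3e-5,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",
)
trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=split["train"],
    eval_dataset=split["test"],
    loss=loss,
)
trainer.train()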

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 3e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss sts-dev_spearman_cosine sts-test_spearman_cosine
-1 -1 - - 0.3173 -
0.0081 1000 7.5472 - - -
0.0162 2000 6.0196 - - -
0.0242 3000 5.4872 - - -
0.0323 4000 5.1452 - - -
0.0404 5000 4.8099 - - -
0.0485 6000 4.5211 - - -
0.0566 7000 4.2967 - - -
0.0646 8000 4.1411 - - -
0.0727 9000 4.031 - - -
0.0808 10000 3.9636 3.8297 0.7237 -
0.0889 11000 3.9046 - - -
0.0970 12000 3.8138 - - -
0.1051 13000 3.7859 - - -
0.1131 14000 3.7237 - - -
0.1212 15000 3.6881 - - -
0.1293 16000 3.6133 - - -
0.1374 17000 3.5777 - - -
0.1455 18000 3.5285 - - -
0.1535 19000 3.4974 - - -
0.1616 20000 3.4421 3.3523 0.6978 -
0.1697 21000 3.416 - - -
0.1778 22000 3.4143 - - -
0.1859 23000 3.3661 - - -
0.1939 24000 3.3408 - - -
0.2020 25000 3.3079 - - -
0.2101 26000 3.2873 - - -
0.2182 27000 3.2639 - - -
0.2263 28000 3.2323 - - -
0.2343 29000 3.2416 - - -
0.2424 30000 3.2117 3.1015 0.6895 -
0.2505 31000 3.1868 - - -
0.2586 32000 3.1576 - - -
0.2667 33000 3.1619 - - -
0.2747 34000 3.1445 - - -
0.2828 35000 3.1387 - - -
0.2909 36000 3.1159 - - -
0.2990 37000 3.09 - - -
0.3071 38000 3.0771 - - -
0.3152 39000 3.065 - - -
0.3232 40000 3.0589 2.9535 0.6885 -
0.3313 41000 3.0539 - - -
0.3394 42000 3.0211 - - -
0.3475 43000 3.0158 - - -
0.3556 44000 3.0172 - - -
0.3636 45000 2.9912 - - -
0.3717 46000 2.9776 - - -
0.3798 47000 2.9539 - - -
0.3879 48000 2.9753 - - -
0.3960 49000 2.9467 - - -
0.4040 50000 2.9429 2.8288 0.6830 -
0.4121 51000 2.9243 - - -
0.4202 52000 2.9273 - - -
0.4283 53000 2.9118 - - -
0.4364 54000 2.9068 - - -
0.4444 55000 2.8961 - - -
0.4525 56000 2.8621 - - -
0.4606 57000 2.8825 - - -
0.4687 58000 2.8466 - - -
0.4768 59000 2.868 - - -
0.4848 60000 2.8372 2.7335 0.6871 -
0.4929 61000 2.8322 - - -
0.5010 62000 2.8239 - - -
0.5091 63000 2.8148 - - -
0.5172 64000 2.8137 - - -
0.5253 65000 2.8043 - - -
0.5333 66000 2.7973 - - -
0.5414 67000 2.7739 - - -
0.5495 68000 2.7694 - - -
0.5576 69000 2.755 - - -
0.5657 70000 2.7846 2.6422 0.6773 -
0.5737 71000 2.7246 - - -
0.5818 72000 2.7438 - - -
0.5899 73000 2.7314 - - -
0.5980 74000 2.7213 - - -
0.6061 75000 2.7402 - - -
0.6141 76000 2.6955 - - -
0.6222 77000 2.7131 - - -
0.6303 78000 2.6951 - - -
0.6384 79000 2.6812 - - -
0.6465 80000 2.6844 2.5743 0.6827 -
0.6545 81000 2.665 - - -
0.6626 82000 2.6528 - - -
0.6707 83000 2.6819 - - -
0.6788 84000 2.6529 - - -
0.6869 85000 2.6665 - - -
0.6949 86000 2.6554 - - -
0.7030 87000 2.6299 - - -
0.7111 88000 2.659 - - -
0.7192 89000 2.632 - - -
0.7273 90000 2.6209 2.5051 0.6782 -
0.7354 91000 2.6023 - - -
0.7434 92000 2.6226 - - -
0.7515 93000 2.6057 - - -
0.7596 94000 2.601 - - -
0.7677 95000 2.5888 - - -
0.7758 96000 2.5811 - - -
0.7838 97000 2.565 - - -
0.7919 98000 2.5727 - - -
0.8000 99000 2.5863 - - -
0.8081 100000 2.5534 2.4526 0.6799 -
0.8162 101000 2.5423 - - -
0.8242 102000 2.5655 - - -
0.8323 103000 2.5394 - - -
0.8404 104000 2.5217 - - -
0.8485 105000 2.5534 - - -
0.8566 106000 2.5264 - - -
0.8646 107000 2.5481 - - -
0.8727 108000 2.5508 - - -
0.8808 109000 2.5302 - - -
0.8889 110000 2.5223 2.4048 0.6771 -
0.8970 111000 2.5274 - - -
0.9051 112000 2.515 - - -
0.9131 113000 2.5088 - - -
0.9212 114000 2.5035 - - -
0.9293 115000 2.495 - - -
0.9374 116000 2.5066 - - -
0.9455 117000 2.4858 - - -
0.9535 118000 2.4803 - - -
0.9616 119000 2.506 - - -
0.9697 120000 2.4906 2.3738 0.6766 -
0.9778 121000 2.5027 - - -
0.9859 122000 2.4858 - - -
0.9939 123000 2.4928 - - -
-1 -1 - - - 0.6322

Environmental Impact

Carbon emissions were measured using CodeCarbon.

  • Energy Consumed: 1.432 kWh
  • Carbon Emitted: 0.557 kg of CO2
  • Hours Used: 4.403 hours

Training Hardware

  • On Cloud: No
  • GPU Model: 1 x NVIDIA GeForce RTX 3090
  • CPU Model: 13th Gen Intel(R) Core(TM) i7-13700K
  • RAM Size: 31.78 GB

Framework Versions

  • Python: 3.11.6
  • Sentence Transformers: 3.4.0.dev0
  • Transformers: 4.48.0.dev0
  • PyTorch: 2.5.0+cu121
  • Accelerate: 0.35.0.dev0
  • Datasets: 2.20.0
  • Tokenizers: 0.21.0

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

DenoisingAutoEncoderLoss

@inproceedings{wang-2021-TSDAE,
    title = "TSDAE: Using Transformer-based Sequential Denoising Auto-Encoder for Unsupervised Sentence Embedding Learning",
    author = "Wang, Kexin and Reimers, Nils and Gurevych, Iryna",
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
    month = nov,
    year = "2021",
    address = "Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    pages = "671--688",
    url = "https://arxiv.org/abs/2104.06979",
}