SentenceTransformer based on Alibaba-NLP/gte-large-en-v1.5

This is a sentence-transformers model finetuned from Alibaba-NLP/gte-large-en-v1.5. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: Alibaba-NLP/gte-large-en-v1.5
  • Maximum Sequence Length: 8192 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: https://sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NewModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
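
The configuration listed above can also be checked programmatically after loading the model. A minimal sketch, assuming the repository id spl4shedEdu/gte_IAB used in the usage example below and that trust_remote_code is required for the custom GTE ("NewModel") architecture:

from sentence_transformers import SentenceTransformer

# Assumed repository id; trust_remote_code is assumed to be needed for the custom architecture
model = SentenceTransformer("spl4shedEdu/gte_IAB", trust_remote_code=True)

print(model.max_seq_length)                      # 8192
print(model.get_sentence_embedding_dimension())  # 1024
print(model[1].pooling_mode_cls_token)           # True, i.e. CLS pooling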

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
# (trust_remote_code=True is assumed to be needed because the GTE base model uses custom architecture code)
model = SentenceTransformer("spl4shedEdu/gte_IAB", trust_remote_code=True)
# Run inference
sentences = [
    'steinel led floodlight with sensor xled home 1 silver led floodlights anylamp produced by steinel identifiers is 4007841002688 category of toolsandhomeimprovement',
    'steinel led floodlight with sensor xled home 1 silver led lighting anylamp produced by steinel identifiers is 4007841002688 category of toolsandhomeimprovement',
    'desk organizers quillcom durable bookends with reinforced rib designheavygauge steel construction9 height identifiers is 90117bebk category of officeproducts',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
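
Beyond the pairwise similarity matrix, the same embeddings can back a small semantic-search lookup, for example matching a free-text query against the product strings encoded above. A minimal sketch that continues the snippet above; the query string is illustrative:

# Encode an illustrative query and rank the three product strings against it
query_embedding = model.encode(["silver led floodlight with motion sensor"])
scores = model.similarity(query_embedding, embeddings)  # shape (1, 3), cosine similarities
best_index = scores[0].argmax().item()
print(sentences[best_index])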

Training Details

Training Dataset

Unnamed Dataset

  • Size: 281,342 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor: string, min: 24 tokens, mean: 81.17 tokens, max: 941 tokens
    • positive: string, min: 23 tokens, mean: 80.26 tokens, max: 1004 tokens
  • Samples:
    • anchor: ironwood pharmaceuticals inc class a a 1 full quote netdaniacom pharmaceuticals produced by source nasdaq identifiers is isinus46333x1081 category of automotive
      positive: ironwood pharmaceuticals inc class a pharmaceuticals a 1 news netdaniacom produced by source nasdaq identifiers is isinus46333x1081 category of automotive
    • anchor: 873010s21 hp 600gb 12g 10k 25 dp sas hdd null price 873010s2110pack new 873010s21 600gb hdd 10 pack wholesale description10 x 600gb 25inch serial attached scsi sassff digitally signed ds 12g dual portenterprise hotplug 512n 10k hard drivein hpe drive tray as picturedfor g1g7 proliant sas serversgenuine number and firmwaregenuine certified drivepart numbers option part 873010b21 smartbuy 873010s21 produced by hp enterprise identifiers is 873010s2110pack category of computersandaccessories key specifications are specifications category proliant harddrive subcategory 10k generation sas part number 873010s2110pack products id 489761 type hard drive hotswap capacity 600gb interface serial attached scsi spindle speed 10000rpm ports dual port data transfer rate 12gbs bytes per sector 512n
      positive: 873010s21 hp 600gb 12g 10k 25 dp sas hdd null price 873010s21 new 873010s21 600gb hdd wholesale description600gb 25inch serial attached scsi sassff digitally signed ds 12g dual portenterprise hotplug 512n 10k hard drivein hpe drive tray as picturedfor g1g7 proliant sas serversgenuine number and firmwaregenuine certified drivepart numbers option part 873010b21 smartbuy 873010s21 produced by hp enterprise identifiers is 873010s21 category of computersandaccessories key specifications are specifications category proliant harddrive subcategory 10k generation sas part number 873010s21 products id 489758 type hard drive hotswap capacity 600gb interface serial attached scsi spindle speed 10000rpm ports dual port data transfer rate 12gbs bytes per sector 512n
    • anchor: armrest fabric gb 2010 audi a4avant argentina market body middle front pr6e3gb model data prn0ln5fn2en2m gb identifiers is 8k0864207a category of automotive
      positive: armrest fabric gb 2009 audi a5s5 coupesportback south africa market body middle front pr6e3gb model data coupeprn2e gb identifiers is 8k0864207a category of automotive
  • Loss: CachedMultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
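
The loss configuration above can be reconstructed in code. A minimal sketch, assuming the base checkpoint is loaded directly and that trust_remote_code is required for its custom architecture; cosine similarity is the default similarity function for this loss:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss

# Recreate the loss listed above (scale=20.0; similarity_fct defaults to cosine similarity)
model = SentenceTransformer("Alibaba-NLP/gte-large-en-v1.5", trust_remote_code=True)
loss = CachedMultipleNegativesRankingLoss(model, scale=20.0)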
    

Evaluation Dataset

Unnamed Dataset

  • Size: 70,336 evaluation samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor: string, min: 25 tokens, mean: 80.88 tokens, max: 542 tokens
    • positive: string, min: 24 tokens, mean: 79.18 tokens, max: 1004 tokens
  • Samples:
    • anchor: rennline race hook front universal 8 in red each 2000 bmw 323i base wagon chassis panels sheet metal page 3 identifiers is rene01r8 category of automotive
      positive: rennline race hook front universal 8 in red each 2000 bmw 323i base wagon chassis panels sheet metal page 3 identifiers is rene01r8 category of automotive
    • anchor: happy new year 2017 peace patch icon card design vector image patch images over 13 000 happy new year 2017 greeting card design with varsity college typography and stitch patch peace symbol icon as number eps10 vector vector image identifiers is 14478945 category of officeproducts
      positive: happy new year 2017 peace patch icon card design vector image happy new year 2017 greeting card design with varsity college typography and stitch patch peace symbol icon as number eps10 vector download a free preview or high quality adobe illustrator ai eps pdf resolution jpeg versions identifiers is 14478945 category of officeproducts
    • anchor: hp deskjet d4155 cartridges for ink jet printers quillcom yields up to 399 pagessized and priced for occasional printingoriginal hp ink a little less ink at very affordable price identifiers is 901d8j33an category of officeproducts
      positive: hp photosmart c4150 cartridges for ink jet printers quillcom yields up to 399 pagessized and priced for occasional printingoriginal hp ink a little less ink at very affordable price identifiers is 901d8j33an category of officeproducts
  • Loss: CachedMultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • learning_rate: 1e-05
  • num_train_epochs: 2
  • warmup_ratio: 0.1
  • fp16: True
  • auto_find_batch_size: True
  • batch_sampler: no_duplicates
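
Together with the loss above, these non-default values map onto the Sentence Transformers v3 training API roughly as follows. A minimal sketch, not the exact training script: the output directory and CSV data files are illustrative placeholders, and the pair datasets are assumed to expose anchor and positive columns:

from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

# Illustrative placeholders: the card's train/eval pair datasets are unnamed
dataset = load_dataset("csv", data_files={"train": "pairs_train.csv", "validation": "pairs_eval.csv"})

model = SentenceTransformer("Alibaba-NLP/gte-large-en-v1.5", trust_remote_code=True)
loss = CachedMultipleNegativesRankingLoss(model, scale=20.0)

args = SentenceTransformerTrainingArguments(
    output_dir="gte_IAB_finetune",              # illustrative output path
    eval_strategy="steps",
    learning_rate=1e-5,
    num_train_epochs=2,
    warmup_ratio=0.1,
    fp16=True,
    auto_find_batch_size=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],             # columns: anchor, positive
    eval_dataset=dataset["validation"],
    loss=loss,
)
trainer.train()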

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 1e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 2
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: True
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • eval_use_gather_object: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch    Step     Training Loss    Validation Loss
0.1990   7000     0.0057           0.0026
0.3981   14000    0.0019           0.0018
0.5971   21000    0.0016           0.0012
0.7962   28000    0.0010           0.0009
0.9952   35000    0.0010           0.0009
1.1943   42000    0.0007           0.0008
1.3933   49000    0.0004           0.0009
1.5924   56000    0.0003           0.0009
1.7914   63000    0.0002           0.0008

Framework Versions

  • Python: 3.10.13
  • Sentence Transformers: 3.0.1
  • Transformers: 4.44.0
  • PyTorch: 2.2.1
  • Accelerate: 0.33.0
  • Datasets: 2.21.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CachedMultipleNegativesRankingLoss

@misc{gao2021scaling,
    title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup}, 
    author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
    year={2021},
    eprint={2101.06983},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}