
SentenceTransformer based on microsoft/mdeberta-v3-base

This is a sentence-transformers model finetuned from microsoft/mdeberta-v3-base. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: microsoft/mdeberta-v3-base
  • Maximum Sequence Length: 1024 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Model Size: 278M params (F32, Safetensors)

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: DebertaV2Model 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
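
Since the Pooling module is configured with pooling_mode_cls_token=True, the sentence embedding is simply the final hidden state of the first ([CLS]) token. Below is a minimal sketch of the equivalent computation using the transformers library directly; it assumes the transformer weights at the repository root load via AutoModel, and is for illustration only (the SentenceTransformer class handles this for you).

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("BlackBeenie/mdeberta-v3-base-msmarco-v3-bpr")
model = AutoModel.from_pretrained("BlackBeenie/mdeberta-v3-base-msmarco-v3-bpr")

inputs = tokenizer(["definition of stoop"], padding=True, truncation=True,
                   max_length=1024, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# CLS pooling: take the hidden state of the first token as the sentence embedding
embedding = outputs.last_hidden_state[:, 0]
print(embedding.shape)  # torch.Size([1, 768])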

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("BlackBeenie/mdeberta-v3-base-msmarco-v3-bpr")
# Run inference
sentences = [
    'definition of stoop',
    'Define stoop: to bend the body or a part of the body forward and downward sometimes simultaneously bending the knees — stoop in a sentence to bend the body or a part of the body forward and downward sometimes simultaneously bending the knees… See the full definition',
    "Definition of stoop written for English Language Learners from the Merriam-Webster Learner's Dictionary with audio pronunciations, usage examples, and count/noncount noun labels. Learner's Dictionary mobile search",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
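
For semantic search, encode the query and the corpus separately and rank passages by similarity. A small sketch with a hypothetical mini-corpus:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BlackBeenie/mdeberta-v3-base-msmarco-v3-bpr")

query = "what are disaccharides"
corpus = [
    "Disaccharides are formed when two monosaccharides are joined together.",
    "The average national cost to paint a home interior is $1,671.",
    "Corporate tax returns are due on the 15th day of the third month.",
]

query_emb = model.encode([query])
corpus_emb = model.encode(corpus)

# Rank corpus passages by similarity to the query (highest first)
scores = model.similarity(query_emb, corpus_emb)[0]
for idx in scores.argsort(descending=True):
    print(f"{scores[idx]:.4f}  {corpus[idx]}")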

Training Details

Training Dataset

Unnamed Dataset (presumably MS MARCO passage triplets, given the model name)

  • Size: 498,970 training samples
  • Columns: sentence_0, sentence_1, and sentence_2
  • Approximate statistics based on the first 1000 samples:

    column      type    min        mean          max
    sentence_0  string  4 tokens   10.61 tokens  40 tokens
    sentence_1  string  17 tokens  96.41 tokens  259 tokens
    sentence_2  string  14 tokens  92.21 tokens  250 tokens
  • Samples:

    sentence_0: how much does it cost to paint a interior house
    sentence_1: Interior House Painting Cost Factors. Generally, it will take a minimum of two gallons of paint to cover a room. At the highest end, paint will cost anywhere between $30 and $60 per gallon and come in three different finishes: flat, semi-gloss or high-gloss. Flat finishes are the least shiny and are best suited for areas requiring frequent cleaning. Provide a few details about your project and receive competitive quotes from local pros. The average national cost to paint a home interior is $1,671, with most homeowners spending between $966 and $2,426.
    sentence_2: How Much to Charge to Paint the Interior of a House (and how much not to charge) Let me give you an example - stay with me here. Imagine you drop all of your painting estimates by 20% to win more jobs. Maybe you'll close $10,000 in sales instead of $6,000 (because you had a better price - you landed an extra job)...

    sentence_0: when is s corp taxes due
    sentence_1: If you form a corporate entity for your small business, regardless of whether it's taxed as a C or S corporation, a tax return must be filed with the Internal Revenue Service on its due date each year. Corporate tax returns are always due on the 15th day of the third month following the close of the tax year. The actual day that the tax return filing deadline falls on, however, isn't the same for every corporation.
    sentence_2: In Summary. 1 S-corporations are pass-through entities. 2 Form 1120S is the form used for an S-corp’s annual tax return. 3 Shareholders do not have to pay self-employment tax on their share of an S-corp’s profits.

    sentence_0: what are disaccharides
    sentence_1: Disaccharides are formed when two monosaccharides are joined together and a molecule of water is removed, a process known as dehydration reaction. For example; milk sugar (lactose) is made from glucose and galactose whereas the sugar from sugar cane and sugar beets (sucrose) is made from glucose and fructose. Maltose, another notable disaccharide, is made up of two glucose molecules. The two monosaccharides are bonded via a dehydration reaction (also called a condensation reaction or dehydration synthesis) that leads to the loss of a molecule of water and formation of a glycosidic bond.
    sentence_2: No. Sugars and starches are types of carbohydrates (ex: monosaccharides, disaccharides). Lipids are much different.
  • Loss: beir.losses.bpr_loss.BPRLoss
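
BPRLoss here refers to the loss from Binary Passage Retrieval (Yamada et al., 2021), which trains embeddings to be useful both as continuous vectors and as binary hash codes. Below is a simplified sketch of its two components; it is an illustration based on the paper, and the exact beir.losses.bpr_loss.BPRLoss implementation may differ in details such as scaling and negative handling.

import torch
import torch.nn.functional as F

def bpr_loss_sketch(query_emb, pos_emb, neg_emb, margin=0.1):
    """Illustrative BPR-style loss for a batch of (query, positive, negative) triplets."""
    cand = torch.cat([pos_emb, neg_emb], dim=0)              # [2B, D] candidate passages
    labels = torch.arange(query_emb.size(0), device=query_emb.device)

    # Hash embeddings toward binary codes; tanh is the usual differentiable
    # surrogate for sign() during training.
    hash_q = torch.tanh(query_emb)
    hash_c = torch.tanh(cand)

    # 1) Candidate-generation term: hashed query vs hashed passages;
    #    the positive should outscore every other candidate by `margin`.
    bin_scores = hash_q @ hash_c.T                           # [B, 2B]
    pos_scores = bin_scores[labels, labels].unsqueeze(1)     # [B, 1]
    mask = torch.ones_like(bin_scores)
    mask[labels, labels] = 0.0                               # ignore positive vs itself
    loss_cand = (F.relu(margin - pos_scores + bin_scores) * mask).sum() / mask.sum()

    # 2) Re-ranking term: continuous query vs hashed passages,
    #    cross-entropy over all in-batch candidates.
    dense_scores = query_emb @ hash_c.T                      # [B, 2B]
    loss_rerank = F.cross_entropy(dense_scores, labels)

    return loss_cand + loss_rerank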

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • num_train_epochs: 15
  • fp16: True
  • multi_dataset_batch_sampler: round_robin
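
As a rough sketch, the non-default values above map onto the Sentence Transformers training arguments as follows (output_dir is a hypothetical placeholder, since the card does not specify one; everything else follows the list above):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import MultiDatasetBatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # hypothetical; not specified in this card
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=15,
    fp16=True,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)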

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 15
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • eval_use_gather_object: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch Step Training Loss
0.0321 500 7.0196
0.0641 1000 2.0193
0.0962 1500 1.4466
0.1283 2000 1.1986
0.1603 2500 1.0912
0.1924 3000 1.0179
0.2245 3500 0.9659
0.2565 4000 0.9229
0.2886 4500 0.9034
0.3207 5000 0.871
0.3527 5500 0.8474
0.3848 6000 0.8247
0.4169 6500 0.8377
0.4489 7000 0.8119
0.4810 7500 0.8042
0.5131 8000 0.7831
0.5451 8500 0.7667
0.5772 9000 0.7653
0.6092 9500 0.7502
0.6413 10000 0.7615
0.6734 10500 0.7435
0.7054 11000 0.7346
0.7375 11500 0.718
0.7696 12000 0.711
0.8016 12500 0.6963
0.8337 13000 0.6969
0.8658 13500 0.6937
0.8978 14000 0.6721
0.9299 14500 0.6902
0.9620 15000 0.6783
0.9940 15500 0.6669
1.0 15593 -
1.0261 16000 0.689
1.0582 16500 0.6549
1.0902 17000 0.6354
1.1223 17500 0.6013
1.1544 18000 0.6091
1.1864 18500 0.5907
1.2185 19000 0.5979
1.2506 19500 0.5724
1.2826 20000 0.5718
1.3147 20500 0.5851
1.3468 21000 0.5716
1.3788 21500 0.5568
1.4109 22000 0.5502
1.4430 22500 0.5591
1.4750 23000 0.5688
1.5071 23500 0.5484
1.5392 24000 0.531
1.5712 24500 0.5445
1.6033 25000 0.5269
1.6353 25500 0.55
1.6674 26000 0.537
1.6995 26500 0.5259
1.7315 27000 0.5153
1.7636 27500 0.5184
1.7957 28000 0.5154
1.8277 28500 0.5279
1.8598 29000 0.5267
1.8919 29500 0.4938
1.9239 30000 0.5088
1.9560 30500 0.516
1.9881 31000 0.4998
2.0 31186 -
2.0201 31500 0.5252
2.0522 32000 0.4998
2.0843 32500 0.484
2.1163 33000 0.4612
2.1484 33500 0.4617
2.1805 34000 0.4441
2.2125 34500 0.4653
2.2446 35000 0.4592
2.2767 35500 0.4347
2.3087 36000 0.4557
2.3408 36500 0.4401
2.3729 37000 0.436
2.4049 37500 0.4315
2.4370 38000 0.4447
2.4691 38500 0.4258
2.5011 39000 0.4275
2.5332 39500 0.4142
2.5653 40000 0.434
2.5973 40500 0.4222
2.6294 41000 0.4284
2.6615 41500 0.4187
2.6935 42000 0.4156
2.7256 42500 0.4054
2.7576 43000 0.4182
2.7897 43500 0.4142
2.8218 44000 0.4152
2.8538 44500 0.421
2.8859 45000 0.403
2.9180 45500 0.4003
2.9500 46000 0.4032
2.9821 46500 0.4072
3.0 46779 -
3.0142 47000 0.4137
3.0462 47500 0.4151
3.0783 48000 0.3959
3.1104 48500 0.3808
3.1424 49000 0.3701
3.1745 49500 0.3716
3.2066 50000 0.387
3.2386 50500 0.3747
3.2707 51000 0.3488
3.3028 51500 0.3795
3.3348 52000 0.3511
3.3669 52500 0.3469
3.3990 53000 0.3475
3.4310 53500 0.3669
3.4631 54000 0.3428
3.4952 54500 0.3597
3.5272 55000 0.3525
3.5593 55500 0.3502
3.5914 56000 0.3446
3.6234 56500 0.3563
3.6555 57000 0.34
3.6876 57500 0.3385
3.7196 58000 0.335
3.7517 58500 0.3344
3.7837 59000 0.3361
3.8158 59500 0.3285
3.8479 60000 0.3429
3.8799 60500 0.3162
3.9120 61000 0.3279
3.9441 61500 0.3448
3.9761 62000 0.322
4.0 62372 -
4.0082 62500 0.3356
4.0403 63000 0.3416
4.0723 63500 0.3195
4.1044 64000 0.3033
4.1365 64500 0.2957
4.1685 65000 0.312
4.2006 65500 0.3135
4.2327 66000 0.3193
4.2647 66500 0.2919
4.2968 67000 0.3078
4.3289 67500 0.302
4.3609 68000 0.2973
4.3930 68500 0.2725
4.4251 69000 0.3013
4.4571 69500 0.2936
4.4892 70000 0.3009
4.5213 70500 0.2941
4.5533 71000 0.2957
4.5854 71500 0.288
4.6175 72000 0.3032
4.6495 72500 0.2919
4.6816 73000 0.2843
4.7137 73500 0.2862
4.7457 74000 0.2789
4.7778 74500 0.2843
4.8099 75000 0.2816
4.8419 75500 0.2813
4.8740 76000 0.2839
4.9060 76500 0.2619
4.9381 77000 0.2877
4.9702 77500 0.2693
5.0 77965 -
5.0022 78000 0.2738
5.0343 78500 0.286
5.0664 79000 0.2754
5.0984 79500 0.2561
5.1305 80000 0.2498
5.1626 80500 0.2563
5.1946 81000 0.2618
5.2267 81500 0.265
5.2588 82000 0.245
5.2908 82500 0.2551
5.3229 83000 0.2653
5.3550 83500 0.2453
5.3870 84000 0.24
5.4191 84500 0.2478
5.4512 85000 0.2444
5.4832 85500 0.2464
5.5153 86000 0.2327
5.5474 86500 0.2376
5.5794 87000 0.2469
5.6115 87500 0.2488
5.6436 88000 0.2467
5.6756 88500 0.2409
5.7077 89000 0.2287
5.7398 89500 0.2288
5.7718 90000 0.2399
5.8039 90500 0.2341
5.8360 91000 0.2352
5.8680 91500 0.2196
5.9001 92000 0.2196
5.9321 92500 0.2246
5.9642 93000 0.2411
5.9963 93500 0.2279
6.0 93558 -
6.0283 94000 0.2489
6.0604 94500 0.2339
6.0925 95000 0.224
6.1245 95500 0.209
6.1566 96000 0.2262
6.1887 96500 0.2221
6.2207 97000 0.214
6.2528 97500 0.21
6.2849 98000 0.2072
6.3169 98500 0.2204
6.3490 99000 0.2041
6.3811 99500 0.2067
6.4131 100000 0.2102
6.4452 100500 0.2031
6.4773 101000 0.2107
6.5093 101500 0.2009
6.5414 102000 0.2057
6.5735 102500 0.1979
6.6055 103000 0.1994
6.6376 103500 0.2065
6.6697 104000 0.1958
6.7017 104500 0.2074
6.7338 105000 0.1941
6.7659 105500 0.2035
6.7979 106000 0.2003
6.8300 106500 0.2083
6.8621 107000 0.1921
6.8941 107500 0.1893
6.9262 108000 0.2014
6.9583 108500 0.192
6.9903 109000 0.1921
7.0 109151 -
7.0224 109500 0.2141
7.0544 110000 0.1868
7.0865 110500 0.1815
7.1186 111000 0.1793
7.1506 111500 0.1812
7.1827 112000 0.1853
7.2148 112500 0.1922
7.2468 113000 0.179
7.2789 113500 0.1707
7.3110 114000 0.1829
7.3430 114500 0.1743
7.3751 115000 0.1787
7.4072 115500 0.1815
7.4392 116000 0.1776
7.4713 116500 0.1773
7.5034 117000 0.1753
7.5354 117500 0.1816
7.5675 118000 0.1795
7.5996 118500 0.178
7.6316 119000 0.177
7.6637 119500 0.175
7.6958 120000 0.1701
7.7278 120500 0.1686
7.7599 121000 0.1727
7.7920 121500 0.1733
7.8240 122000 0.1707
7.8561 122500 0.1729
7.8882 123000 0.1569
7.9202 123500 0.1657
7.9523 124000 0.1773
7.9844 124500 0.1625
8.0 124744 -
8.0164 125000 0.1824
8.0485 125500 0.1852
8.0805 126000 0.1701
8.1126 126500 0.1573
8.1447 127000 0.1614
8.1767 127500 0.1624
8.2088 128000 0.1575
8.2409 128500 0.1481
8.2729 129000 0.1537
8.3050 129500 0.1616
8.3371 130000 0.1544
8.3691 130500 0.1511
8.4012 131000 0.1569
8.4333 131500 0.1535
8.4653 132000 0.1489
8.4974 132500 0.1593
8.5295 133000 0.1552
8.5615 133500 0.1578
8.5936 134000 0.1501
8.6257 134500 0.156
8.6577 135000 0.1455
8.6898 135500 0.1524
8.7219 136000 0.1344
8.7539 136500 0.1513
8.7860 137000 0.141
8.8181 137500 0.1518
8.8501 138000 0.1468
8.8822 138500 0.1416
8.9143 139000 0.1434
8.9463 139500 0.1495
8.9784 140000 0.1364
9.0 140337 -
9.0105 140500 0.1507
9.0425 141000 0.1496
9.0746 141500 0.1475
9.1067 142000 0.1348
9.1387 142500 0.1282
9.1708 143000 0.1362
9.2028 143500 0.1364
9.2349 144000 0.1385
9.2670 144500 0.1309
9.2990 145000 0.1324
9.3311 145500 0.1354
9.3632 146000 0.1283
9.3952 146500 0.1239
9.4273 147000 0.126
9.4594 147500 0.1232
9.4914 148000 0.1269
9.5235 148500 0.1269
9.5556 149000 0.1299
9.5876 149500 0.1367
9.6197 150000 0.1354
9.6518 150500 0.1239
9.6838 151000 0.1311
9.7159 151500 0.1235
9.7480 152000 0.129
9.7800 152500 0.1244
9.8121 153000 0.1201
9.8442 153500 0.1332
9.8762 154000 0.1189
9.9083 154500 0.1221
9.9404 155000 0.1228
9.9724 155500 0.1173
10.0 155930 -
10.0045 156000 0.1347
10.0366 156500 0.1384
10.0686 157000 0.1402
10.1007 157500 0.1161
10.1328 158000 0.1141
10.1648 158500 0.1199
10.1969 159000 0.1328
10.2289 159500 0.1263
10.2610 160000 0.1143
10.2931 160500 0.1207
10.3251 161000 0.1119
10.3572 161500 0.114
10.3893 162000 0.114
10.4213 162500 0.1118
10.4534 163000 0.1228
10.4855 163500 0.1209
10.5175 164000 0.1153
10.5496 164500 0.118
10.5817 165000 0.1118
10.6137 165500 0.1206
10.6458 166000 0.1108
10.6779 166500 0.1084
10.7099 167000 0.1127
10.7420 167500 0.1001
10.7741 168000 0.1073
10.8061 168500 0.1174
10.8382 169000 0.1143
10.8703 169500 0.1158
10.9023 170000 0.1099
10.9344 170500 0.0998
10.9665 171000 0.1009
10.9985 171500 0.1167
11.0 171523 -
11.0306 172000 0.1161
11.0627 172500 0.1126
11.0947 173000 0.1046
11.1268 173500 0.1054
11.1589 174000 0.1063
11.1909 174500 0.1136
11.2230 175000 0.108
11.2551 175500 0.1014
11.2871 176000 0.1036
11.3192 176500 0.1043
11.3512 177000 0.0973
11.3833 177500 0.0934
11.4154 178000 0.095
11.4474 178500 0.1032
11.4795 179000 0.1089
11.5116 179500 0.098
11.5436 180000 0.099
11.5757 180500 0.1007
11.6078 181000 0.096
11.6398 181500 0.0986
11.6719 182000 0.1033
11.7040 182500 0.0899
11.7360 183000 0.0946
11.7681 183500 0.0943
11.8002 184000 0.0954
11.8322 184500 0.0955
11.8643 185000 0.0924
11.8964 185500 0.0847
11.9284 186000 0.0914
11.9605 186500 0.0918
11.9926 187000 0.099
12.0 187116 -
12.0246 187500 0.1029
12.0567 188000 0.1032
12.0888 188500 0.0864
12.1208 189000 0.0921
12.1529 189500 0.0959
12.1850 190000 0.0846
12.2170 190500 0.0924
12.2491 191000 0.0897
12.2812 191500 0.0858
12.3132 192000 0.0851
12.3453 192500 0.0925
12.3773 193000 0.0963
12.4094 193500 0.0867
12.4415 194000 0.0929
12.4735 194500 0.0904
12.5056 195000 0.0854
12.5377 195500 0.0876
12.5697 196000 0.0899
12.6018 196500 0.09
12.6339 197000 0.0921
12.6659 197500 0.0829
12.6980 198000 0.0952
12.7301 198500 0.087
12.7621 199000 0.086
12.7942 199500 0.0836
12.8263 200000 0.0845
12.8583 200500 0.0808
12.8904 201000 0.0771
12.9225 201500 0.0815
12.9545 202000 0.0901
12.9866 202500 0.0871
13.0 202709 -
13.0187 203000 0.088
13.0507 203500 0.089
13.0828 204000 0.081
13.1149 204500 0.0739
13.1469 205000 0.0825
13.1790 205500 0.0855
13.2111 206000 0.0788
13.2431 206500 0.0769
13.2752 207000 0.0706
13.3073 207500 0.0821
13.3393 208000 0.0752
13.3714 208500 0.0746
13.4035 209000 0.066
13.4355 209500 0.0779
13.4676 210000 0.0755
13.4996 210500 0.0829
13.5317 211000 0.0731
13.5638 211500 0.086
13.5958 212000 0.078
13.6279 212500 0.0724
13.6600 213000 0.0696
13.6920 213500 0.0789
13.7241 214000 0.0657
13.7562 214500 0.0767
13.7882 215000 0.0728
13.8203 215500 0.071
13.8524 216000 0.0733
13.8844 216500 0.0621
13.9165 217000 0.0677
13.9486 217500 0.0761
13.9806 218000 0.0669
14.0 218302 -
14.0127 218500 0.0848
14.0448 219000 0.0647
14.0768 219500 0.0717
14.1089 220000 0.0653
14.1410 220500 0.0615
14.1730 221000 0.0711
14.2051 221500 0.0674
14.2372 222000 0.0674
14.2692 222500 0.0657
14.3013 223000 0.0727
14.3334 223500 0.0709
14.3654 224000 0.061
14.3975 224500 0.0638
14.4296 225000 0.0704
14.4616 225500 0.0623
14.4937 226000 0.065
14.5257 226500 0.0657
14.5578 227000 0.0634
14.5899 227500 0.0555
14.6219 228000 0.0647
14.6540 228500 0.0616
14.6861 229000 0.0645
14.7181 229500 0.0649
14.7502 230000 0.0612
14.7823 230500 0.0646
14.8143 231000 0.0571
14.8464 231500 0.0561
14.8785 232000 0.0598
14.9105 232500 0.0634
14.9426 233000 0.0657
14.9747 233500 0.0644
15.0 233895 -

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.1.1
  • Transformers: 4.44.2
  • PyTorch: 2.4.1+cu121
  • Accelerate: 0.34.2
  • Datasets: 3.0.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}