CalebR84 committed on
Commit
a22f7b6
·
verified ·
1 Parent(s): 90762da

Add new SentenceTransformer model

Files changed (2)
  1. README.md +291 -123
  2. model.safetensors +1 -1
README.md CHANGED
@@ -10,38 +10,34 @@ tags:
  - loss:OnlineContrastiveLoss
  base_model: sentence-transformers/stsb-distilbert-base
  widget:
- - source_sentence: How much does James Bond earn?
  sentences:
- - What are the top three things that influenced the way women treat men in their
- life?
- - Why does James Bond say his country is England?
- - Which Royal Enfield bike is best?
- - source_sentence: What are the safety precautions on handling shotguns proposed by
- the NRA in Idaho?
  sentences:
- - What are the safety precautions on handling shotguns ?
- - Do Muslims believe in the second coming of Jesus?
- - How do I go from 20% body fat to 10% body fat in 3 months?
- - source_sentence: Which is the best song you ever listened?
  sentences:
- - What happens if a tourist can't pay the bill of giving birth in the USA and Canada?
- - What is the best song to listen to after a stressful day at work?
- - My best friend is very unhappy and I really want to help her. She recently told
- me that she has never been happy in her entire life. What can I do?
- - source_sentence: Does the medical profession know the cure to cancer?
  sentences:
- - Can we cure cancer?
- - What is the history of the Glassboro train station, and how does it compare to
- Seaforth station?
- - Is Angular JS SEO friendly?
- - source_sentence: Which equation in general relativity predicted the existence of
- black-holes ?
  sentences:
- - What would be the best way to build a gaming laptop?
- - How the black money be recovered by simultaneously demonetising 500, 1000 notes
- and introducing 500 , 2000 notes?
- - Why do a lot of theists and agnostics confuse mainstream atheistic thought with
- "positive atheism"?
  datasets:
  - sentence-transformers/quora-duplicates
  pipeline_tag: sentence-similarity
@@ -89,25 +85,25 @@ model-index:
  value: 0.866
  name: Cosine Accuracy
  - type: cosine_accuracy_threshold
- value: 0.828514814376831
  name: Cosine Accuracy Threshold
  - type: cosine_f1
- value: 0.8169398907103825
  name: Cosine F1
  - type: cosine_f1_threshold
- value: 0.8073546290397644
  name: Cosine F1 Threshold
  - type: cosine_precision
- value: 0.7931034482758621
  name: Cosine Precision
  - type: cosine_recall
- value: 0.8422535211267606
  name: Cosine Recall
  - type: cosine_ap
- value: 0.8557930397486502
  name: Cosine Ap
  - type: cosine_mcc
- value: 0.7122120043040523
  name: Cosine Mcc
  - task:
  type: paraphrase-mining
@@ -117,19 +113,19 @@ model-index:
  type: quora-duplicates-dev
  metrics:
  - type: average_precision
- value: 0.5405378071329516
  name: Average Precision
  - type: f1
- value: 0.552654246548372
  name: F1
  - type: precision
- value: 0.5446255335661622
  name: Precision
  - type: recall
- value: 0.560923215267023
  name: Recall
  - type: threshold
- value: 0.8690851628780365
  name: Threshold
  - task:
  type: information-retrieval
@@ -139,49 +135,49 @@
  type: unknown
  metrics:
  - type: cosine_accuracy@1
- value: 0.9266
  name: Cosine Accuracy@1
  - type: cosine_accuracy@3
- value: 0.9708
  name: Cosine Accuracy@3
  - type: cosine_accuracy@5
- value: 0.9794
  name: Cosine Accuracy@5
  - type: cosine_accuracy@10
- value: 0.9864
  name: Cosine Accuracy@10
  - type: cosine_precision@1
- value: 0.9266
  name: Cosine Precision@1
  - type: cosine_precision@3
- value: 0.4151333333333333
  name: Cosine Precision@3
  - type: cosine_precision@5
- value: 0.2672
  name: Cosine Precision@5
  - type: cosine_precision@10
- value: 0.14192000000000002
  name: Cosine Precision@10
  - type: cosine_recall@1
- value: 0.7976201631538398
  name: Cosine Recall@1
  - type: cosine_recall@3
- value: 0.9330406134486428
  name: Cosine Recall@3
  - type: cosine_recall@5
- value: 0.9575996539310363
  name: Cosine Recall@5
  - type: cosine_recall@10
- value: 0.976244254249019
  name: Cosine Recall@10
  - type: cosine_ndcg@10
- value: 0.9511213058217592
  name: Cosine Ndcg@10
  - type: cosine_mrr@10
- value: 0.9498832539682537
  name: Cosine Mrr@10
  - type: cosine_map@100
- value: 0.9385461511529752
  name: Cosine Map@100
  ---

@@ -235,9 +231,9 @@ from sentence_transformers import SentenceTransformer
  model = SentenceTransformer("CalebR84/stsb-distilbert-base-ocl")
  # Run inference
  sentences = [
- 'Which equation in general relativity predicted the existence of black-holes ?',
- 'Why do a lot of theists and agnostics confuse mainstream atheistic thought with "positive atheism"?',
- 'How the black money be recovered by simultaneously demonetising 500, 1000 notes and introducing 500 , 2000 notes?',
  ]
  embeddings = model.encode(sentences)
  print(embeddings.shape)
@@ -285,29 +281,29 @@ You can finetune this model on your own dataset.
  | Metric | Value |
  |:--------------------------|:-----------|
  | cosine_accuracy | 0.866 |
- | cosine_accuracy_threshold | 0.8285 |
- | cosine_f1 | 0.8169 |
- | cosine_f1_threshold | 0.8074 |
- | cosine_precision | 0.7931 |
- | cosine_recall | 0.8423 |
- | **cosine_ap** | **0.8558** |
- | cosine_mcc | 0.7122 |

  #### Paraphrase Mining

  * Dataset: `quora-duplicates-dev`
  * Evaluated with [<code>ParaphraseMiningEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.ParaphraseMiningEvaluator) with these parameters:
  ```json
- {'add_transitive_closure': <function ParaphraseMiningEvaluator.add_transitive_closure at 0x00000134CA2AF380>, 'max_pairs': 500000, 'top_k': 100}
  ```

  | Metric | Value |
  |:----------------------|:-----------|
- | **average_precision** | **0.5405** |
- | f1 | 0.5527 |
- | precision | 0.5446 |
- | recall | 0.5609 |
- | threshold | 0.8691 |

  #### Information Retrieval

@@ -315,21 +311,21 @@ You can finetune this model on your own dataset.

  | Metric | Value |
  |:--------------------|:-----------|
- | cosine_accuracy@1 | 0.9266 |
- | cosine_accuracy@3 | 0.9708 |
- | cosine_accuracy@5 | 0.9794 |
- | cosine_accuracy@10 | 0.9864 |
- | cosine_precision@1 | 0.9266 |
- | cosine_precision@3 | 0.4151 |
- | cosine_precision@5 | 0.2672 |
- | cosine_precision@10 | 0.1419 |
- | cosine_recall@1 | 0.7976 |
- | cosine_recall@3 | 0.933 |
- | cosine_recall@5 | 0.9576 |
- | cosine_recall@10 | 0.9762 |
- | **cosine_ndcg@10** | **0.9511** |
- | cosine_mrr@10 | 0.9499 |
- | cosine_map@100 | 0.9385 |

  <!--
  ## Bias, Risks and Limitations
@@ -353,16 +349,16 @@ You can finetune this model on your own dataset.
  * Size: 100,000 training samples
  * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
  * Approximate statistics based on the first 1000 samples:
- | | sentence1 | sentence2 | label |
- |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:------------------------------------------------|
- | type | string | string | int |
- | details | <ul><li>min: 5 tokens</li><li>mean: 15.79 tokens</li><li>max: 57 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 16.14 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>0: ~63.80%</li><li>1: ~36.20%</li></ul> |
  * Samples:
- | sentence1 | sentence2 | label |
- |:-----------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:---------------|
- | <code>I heard there is a website where I can get an EIN number for free. Which website is it?</code> | <code>How do you find an EIN number fora company?</code> | <code>0</code> |
- | <code>How can I get funding from a venture capital firm?</code> | <code>Chamath Palihapitiya: How do I get venture capital or crowd funding for my startup?</code> | <code>1</code> |
- | <code>I'm working on my debut novel now. What is the best way to make it reach the people?</code> | <code>What is the best way to describe eyes in a novel?</code> | <code>0</code> |
  * Loss: [<code>OnlineContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#onlinecontrastiveloss)

  ### Evaluation Dataset
@@ -376,13 +372,13 @@ You can finetune this model on your own dataset.
  | | sentence1 | sentence2 | label |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------|
  | type | string | string | int |
- | details | <ul><li>min: 6 tokens</li><li>mean: 15.31 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.9 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>0: ~64.50%</li><li>1: ~35.50%</li></ul> |
  * Samples:
- | sentence1 | sentence2 | label |
- |:---------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------|:---------------|
- | <code>Local authentication versus jira user management?</code> | <code>How do I deal with shitty friend?</code> | <code>0</code> |
- | <code>Who invented the glass?</code> | <code>Who invented the hour glass?</code> | <code>0</code> |
- | <code>What are some good gifts I can give my parents on their 25th anniversary?</code> | <code>What are some of the little things that we can do/plan for our parents on their 25th Marriage anniversary?</code> | <code>0</code> |
  * Loss: [<code>OnlineContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#onlinecontrastiveloss)

  ### Training Hyperparameters
@@ -391,7 +387,7 @@ You can finetune this model on your own dataset.
  - `eval_strategy`: steps
  - `per_device_train_batch_size`: 64
  - `per_device_eval_batch_size`: 64
- - `num_train_epochs`: 1
  - `warmup_ratio`: 0.1
  - `fp16`: True
  - `batch_sampler`: no_duplicates
@@ -416,7 +412,7 @@ You can finetune this model on your own dataset.
  - `adam_beta2`: 0.999
  - `adam_epsilon`: 1e-08
  - `max_grad_norm`: 1.0
- - `num_train_epochs`: 1
  - `max_steps`: -1
  - `lr_scheduler_type`: linear
  - `lr_scheduler_kwargs`: {}
@@ -517,28 +513,200 @@ You can finetune this model on your own dataset.
  </details>

  ### Training Logs
- | Epoch | Step | Training Loss | Validation Loss | quora-duplicates_cosine_ap | quora-duplicates-dev_average_precision | cosine_ndcg@10 |
- |:------:|:----:|:-------------:|:---------------:|:--------------------------:|:--------------------------------------:|:--------------:|
- | 0 | 0 | - | - | 0.7296 | 0.4200 | 0.9409 |
- | 0.0640 | 100 | 2.5264 | - | - | - | - |
- | 0.1280 | 200 | 2.1982 | - | - | - | - |
- | 0.1599 | 250 | - | 1.9589 | 0.8052 | 0.4491 | 0.9369 |
- | 0.1919 | 300 | 1.9158 | - | - | - | - |
- | 0.2559 | 400 | 1.8314 | - | - | - | - |
- | 0.3199 | 500 | 1.7882 | 1.8040 | 0.8232 | 0.4747 | 0.9352 |
- | 0.3839 | 600 | 1.7782 | - | - | - | - |
- | 0.4479 | 700 | 1.7213 | - | - | - | - |
- | 0.4798 | 750 | - | 1.7173 | 0.8403 | 0.5178 | 0.9431 |
- | 0.5118 | 800 | 1.7254 | - | - | - | - |
- | 0.5758 | 900 | 1.7004 | - | - | - | - |
- | 0.6398 | 1000 | 1.6938 | 1.6084 | 0.8520 | 0.5187 | 0.9488 |
- | 0.7038 | 1100 | 1.6818 | - | - | - | - |
- | 0.7678 | 1200 | 1.6595 | - | - | - | - |
- | 0.7997 | 1250 | - | 1.5768 | 0.8558 | 0.5308 | 0.9473 |
- | 0.8317 | 1300 | 1.5933 | - | - | - | - |
- | 0.8957 | 1400 | 1.5947 | - | - | - | - |
- | 0.9597 | 1500 | 1.6612 | 1.5573 | 0.8558 | 0.5405 | 0.9511 |

  ### Framework Versions
  - Python: 3.12.9
  - loss:OnlineContrastiveLoss
  base_model: sentence-transformers/stsb-distilbert-base
  widget:
+ - source_sentence: Can I retrieve my deleted text messages on my LG phone?
  sentences:
+ - Why do we sleep?
+ - How do I recover a deleted text message from my phone without a computer?
+ - What are subjects to study in upsc?
+ - source_sentence: How can I prepare for IPS?
  sentences:
+ - What should I prepare for ips?
+ - I am trying to find a meaning to life, to give a purpose to my life. Is there
+ any book that can help me find my answer, or at least give me the tools?
+ - What are the health benefits of Turmeric?
+ - source_sentence: Which is the best game development laptop for ₹60,000 to ₹70,000
+ INR?
  sentences:
+ - Why doesn't Palestine appear on Google Maps as of 2016?
+ - Which is the best laptop for game development under ₹70,000 INR?
+ - What is meant by judicial review in the context of the Indian Judiciary?
+ - source_sentence: Although light beam bouncing between two plates inside a clock
+ is often used to explain time dilation, how can other practical cases be explained?
  sentences:
+ - Is Run Ze Cao's falsification of Einstein's relativity valid?
+ - If India denies Pakistan water, will Pakistan give up its nuclear weapons?
+ - How do I revise class 12 syllabus in 1 month?
+ - source_sentence: How can I lose weight quickly? Need serious help.
  sentences:
+ - Which is the best romantic movie?
+ - Why are there so many half-built, abandoned buildings in Mexico?
+ - How can you lose weight really quick?
  datasets:
  - sentence-transformers/quora-duplicates
  pipeline_tag: sentence-similarity
 
  value: 0.866
  name: Cosine Accuracy
  - type: cosine_accuracy_threshold
+ value: 0.7860240340232849
  name: Cosine Accuracy Threshold
  - type: cosine_f1
+ value: 0.8320802005012532
  name: Cosine F1
  - type: cosine_f1_threshold
+ value: 0.7848798036575317
  name: Cosine F1 Threshold
  - type: cosine_precision
+ value: 0.7811764705882352
  name: Cosine Precision
  - type: cosine_recall
+ value: 0.8900804289544236
  name: Cosine Recall
  - type: cosine_ap
+ value: 0.8772887253419398
  name: Cosine Ap
  - type: cosine_mcc
+ value: 0.7256385093029618
  name: Cosine Mcc
  - task:
  type: paraphrase-mining
 
  type: quora-duplicates-dev
  metrics:
  - type: average_precision
+ value: 0.6392503009812087
  name: Average Precision
  - type: f1
+ value: 0.6435291762586327
  name: F1
  - type: precision
+ value: 0.644658344613225
  name: Precision
  - type: recall
+ value: 0.6424039566368587
  name: Recall
  - type: threshold
+ value: 0.8726956844329834
  name: Threshold
  - task:
  type: information-retrieval
 
  type: unknown
  metrics:
  - type: cosine_accuracy@1
+ value: 0.9172
  name: Cosine Accuracy@1
  - type: cosine_accuracy@3
+ value: 0.9588
  name: Cosine Accuracy@3
  - type: cosine_accuracy@5
+ value: 0.9672
  name: Cosine Accuracy@5
  - type: cosine_accuracy@10
+ value: 0.9762
  name: Cosine Accuracy@10
  - type: cosine_precision@1
+ value: 0.9172
  name: Cosine Precision@1
  - type: cosine_precision@3
+ value: 0.4102
  name: Cosine Precision@3
  - type: cosine_precision@5
+ value: 0.2644
  name: Cosine Precision@5
  - type: cosine_precision@10
+ value: 0.14058
  name: Cosine Precision@10
  - type: cosine_recall@1
+ value: 0.7868590910037675
  name: Cosine Recall@1
  - type: cosine_recall@3
+ value: 0.91981069059372
  name: Cosine Recall@3
  - type: cosine_recall@5
+ value: 0.9442488336402158
  name: Cosine Recall@5
  - type: cosine_recall@10
+ value: 0.9641439212486859
  name: Cosine Recall@10
  - type: cosine_ndcg@10
+ value: 0.9388257874901692
  name: Cosine Ndcg@10
  - type: cosine_mrr@10
+ value: 0.9393049206349205
  name: Cosine Mrr@10
  - type: cosine_map@100
+ value: 0.9258332306777016
  name: Cosine Map@100
  ---

  model = SentenceTransformer("CalebR84/stsb-distilbert-base-ocl")
  # Run inference
  sentences = [
+ 'How can I lose weight quickly? Need serious help.',
+ 'How can you lose weight really quick?',
+ 'Why are there so many half-built, abandoned buildings in Mexico?',
  ]
  embeddings = model.encode(sentences)
  print(embeddings.shape)
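
A minimal follow-up sketch (not part of the card's own snippet) for scoring these sentences against each other; it assumes sentence-transformers >= 3.0, where `SentenceTransformer.similarity` defaults to cosine similarity:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("CalebR84/stsb-distilbert-base-ocl")
embeddings = model.encode([
    "How can I lose weight quickly? Need serious help.",
    "How can you lose weight really quick?",
    "Why are there so many half-built, abandoned buildings in Mexico?",
])
# similarity() defaults to cosine similarity in sentence-transformers >= 3.0.
similarities = model.similarity(embeddings, embeddings)
print(similarities)  # 3x3 matrix; the duplicate pair should score highest off-diagonal
```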
 
  | Metric | Value |
  |:--------------------------|:-----------|
  | cosine_accuracy | 0.866 |
+ | cosine_accuracy_threshold | 0.786 |
+ | cosine_f1 | 0.8321 |
+ | cosine_f1_threshold | 0.7849 |
+ | cosine_precision | 0.7812 |
+ | cosine_recall | 0.8901 |
+ | **cosine_ap** | **0.8773** |
+ | cosine_mcc | 0.7256 |
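
For reference, a hedged sketch of how numbers like these can be reproduced with `BinaryClassificationEvaluator`; the `pair-class` subset name and the 1,000-pair slice are assumptions, not taken from the card:

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import BinaryClassificationEvaluator

model = SentenceTransformer("CalebR84/stsb-distilbert-base-ocl")

# Assumed subset/split: the card only names sentence-transformers/quora-duplicates.
pairs = load_dataset("sentence-transformers/quora-duplicates", "pair-class", split="train").select(range(1000))

evaluator = BinaryClassificationEvaluator(
    sentences1=pairs["sentence1"],
    sentences2=pairs["sentence2"],
    labels=pairs["label"],  # 1 = duplicate, 0 = not
    name="quora-duplicates",
)
# In sentence-transformers >= 3 this returns a dict of accuracy/F1/precision/
# recall/AP computed at tuned cosine-similarity thresholds.
print(evaluator(model))
```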
 
  #### Paraphrase Mining

  * Dataset: `quora-duplicates-dev`
  * Evaluated with [<code>ParaphraseMiningEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.ParaphraseMiningEvaluator) with these parameters:
  ```json
+ {'add_transitive_closure': <function ParaphraseMiningEvaluator.add_transitive_closure at 0x00000219B2FE09A0>, 'max_pairs': 500000, 'top_k': 100}
  ```
 
  | Metric | Value |
  |:----------------------|:-----------|
+ | **average_precision** | **0.6393** |
+ | f1 | 0.6435 |
+ | precision | 0.6447 |
+ | recall | 0.6424 |
+ | threshold | 0.8727 |
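
For orientation, a minimal sketch of `ParaphraseMiningEvaluator` on a hypothetical toy corpus (the ids and sentences below are illustrative, not the `quora-duplicates-dev` data):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import ParaphraseMiningEvaluator

model = SentenceTransformer("CalebR84/stsb-distilbert-base-ocl")

# Hypothetical corpus: id -> sentence, plus the known duplicate id pairs.
sentences_map = {
    "q1": "How can I lose weight quickly? Need serious help.",
    "q2": "How can you lose weight really quick?",
    "q3": "Which is the best romantic movie?",
}
duplicates_list = [("q1", "q2")]

evaluator = ParaphraseMiningEvaluator(sentences_map, duplicates_list, name="toy-dev")
# Mines candidate pairs by cosine similarity and reports AP/F1/precision/recall
# at the best-scoring threshold, as in the table above.
print(evaluator(model))
```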
 
  #### Information Retrieval
 
 
  | Metric | Value |
  |:--------------------|:-----------|
+ | cosine_accuracy@1 | 0.9172 |
+ | cosine_accuracy@3 | 0.9588 |
+ | cosine_accuracy@5 | 0.9672 |
+ | cosine_accuracy@10 | 0.9762 |
+ | cosine_precision@1 | 0.9172 |
+ | cosine_precision@3 | 0.4102 |
+ | cosine_precision@5 | 0.2644 |
+ | cosine_precision@10 | 0.1406 |
+ | cosine_recall@1 | 0.7869 |
+ | cosine_recall@3 | 0.9198 |
+ | cosine_recall@5 | 0.9442 |
+ | cosine_recall@10 | 0.9641 |
+ | **cosine_ndcg@10** | **0.9388** |
+ | cosine_mrr@10 | 0.9393 |
+ | cosine_map@100 | 0.9258 |
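
Likewise, a hedged sketch of `InformationRetrievalEvaluator` on hypothetical toy data (a real run would use a Quora query/corpus split, which the card does not spell out):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("CalebR84/stsb-distilbert-base-ocl")

# Hypothetical ids: queries and corpus map id -> text; relevant_docs maps each
# query id to the set of corpus ids that count as hits.
queries = {"q1": "How can I lose weight quickly?"}
corpus = {
    "d1": "How can you lose weight really quick?",
    "d2": "Why are there so many half-built, abandoned buildings in Mexico?",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="toy-ir")
# Reports accuracy@k, precision@k, recall@k, NDCG@10, MRR@10 and MAP@100.
print(evaluator(model))
```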
 
  <!--
  ## Bias, Risks and Limitations
 
  * Size: 100,000 training samples
  * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
  * Approximate statistics based on the first 1000 samples:
+ | | sentence1 | sentence2 | label |
+ |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------|
+ | type | string | string | int |
+ | details | <ul><li>min: 6 tokens</li><li>mean: 15.56 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.73 tokens</li><li>max: 84 tokens</li></ul> | <ul><li>0: ~63.20%</li><li>1: ~36.80%</li></ul> |
  * Samples:
+ | sentence1 | sentence2 | label |
+ |:--------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------|:---------------|
+ | <code>What are some of the greatest books not adapted into film yet?</code> | <code>What book should be made into a movie?</code> | <code>0</code> |
+ | <code>How can I increase my communication skills?</code> | <code>How we improve our communication skills?</code> | <code>1</code> |
+ | <code>Heymen I have a note5 it give me this message when a turn it on and shout down (custom pinary are blocked by frp lock) I try odin and kies butnot work?</code> | <code>Setup dubbing studio with very less budget in India?</code> | <code>0</code> |
  * Loss: [<code>OnlineContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#onlinecontrastiveloss)
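
A minimal sketch of loading this dataset and attaching the loss the card names; the `pair-class` subset name is an assumption:

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import OnlineContrastiveLoss

model = SentenceTransformer("sentence-transformers/stsb-distilbert-base")

# (sentence1, sentence2, label) rows; label 1 = duplicate, 0 = not a duplicate.
# Subset name assumed; the card only lists sentence-transformers/quora-duplicates.
train_dataset = load_dataset(
    "sentence-transformers/quora-duplicates", "pair-class", split="train"
).select(range(100_000))

# Contrastive loss that, per the linked docs, only backpropagates through the
# hard positive and hard negative pairs within each batch.
loss = OnlineContrastiveLoss(model)
```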

  ### Evaluation Dataset

  | | sentence1 | sentence2 | label |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------|
  | type | string | string | int |
+ | details | <ul><li>min: 3 tokens</li><li>mean: 15.37 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.63 tokens</li><li>max: 78 tokens</li></ul> | <ul><li>0: ~62.70%</li><li>1: ~37.30%</li></ul> |
  * Samples:
+ | sentence1 | sentence2 | label |
+ |:------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------|:---------------|
+ | <code>Which is the best book to learn data structures and algorithms?</code> | <code>Which book is the best book for algorithm and datastructure?</code> | <code>1</code> |
+ | <code>Does modafinil shows up on a drug test? Because my urine smells a lot of medicine?</code> | <code>Can Modafinil come out in a drug test?</code> | <code>0</code> |
+ | <code>Does the size of a penis matter?</code> | <code>Does penis size matters for girls?</code> | <code>1</code> |
  * Loss: [<code>OnlineContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#onlinecontrastiveloss)

  ### Training Hyperparameters

  - `eval_strategy`: steps
  - `per_device_train_batch_size`: 64
  - `per_device_eval_batch_size`: 64
+ - `num_train_epochs`: 10
  - `warmup_ratio`: 0.1
  - `fp16`: True
  - `batch_sampler`: no_duplicates

  - `adam_beta2`: 0.999
  - `adam_epsilon`: 1e-08
  - `max_grad_norm`: 1.0
+ - `num_train_epochs`: 10
  - `max_steps`: -1
  - `lr_scheduler_type`: linear
  - `lr_scheduler_kwargs`: {}
 
  </details>
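
Taken together, the non-default values above correspond roughly to the following training setup. This is a hedged sketch (the `output_dir`, the `pair-class` subset name, and the train/eval split are assumptions), not the author's exact script:

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import OnlineContrastiveLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("sentence-transformers/stsb-distilbert-base")
dataset = load_dataset("sentence-transformers/quora-duplicates", "pair-class", split="train")
train_dataset = dataset.select(range(100_000))          # size per the card
eval_dataset = dataset.select(range(100_000, 101_000))  # held-out slice, assumed
loss = OnlineContrastiveLoss(model)

args = SentenceTransformerTrainingArguments(
    output_dir="stsb-distilbert-base-ocl",  # assumed
    num_train_epochs=10,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",  # requires an eval_dataset or evaluator
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # the card's `no_duplicates` sampler
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
```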
 
  ### Training Logs
+ <details><summary>Click to expand</summary>
+
+ | Epoch | Step | Training Loss | Validation Loss | quora-duplicates_cosine_ap | quora-duplicates-dev_average_precision | cosine_ndcg@10 |
+ |:------:|:-----:|:-------------:|:---------------:|:--------------------------:|:--------------------------------------:|:--------------:|
+ | 0 | 0 | - | - | 0.6905 | 0.4200 | 0.9397 |
+ | 0.0640 | 100 | 2.6402 | - | - | - | - |
+ | 0.1280 | 200 | 2.4398 | - | - | - | - |
+ | 0.1599 | 250 | - | 2.4217 | 0.7392 | 0.4765 | 0.9426 |
+ | 0.1919 | 300 | 2.2461 | - | - | - | - |
+ | 0.2559 | 400 | 2.1433 | - | - | - | - |
+ | 0.3199 | 500 | 2.0417 | 2.1120 | 0.7970 | 0.4566 | 0.9429 |
+ | 0.3839 | 600 | 2.0441 | - | - | - | - |
+ | 0.4479 | 700 | 1.8907 | - | - | - | - |
+ | 0.4798 | 750 | - | 2.0011 | 0.8229 | 0.4820 | 0.9468 |
+ | 0.5118 | 800 | 1.8985 | - | - | - | - |
+ | 0.5758 | 900 | 1.7521 | - | - | - | - |
+ | 0.6398 | 1000 | 1.8888 | 1.8010 | 0.8382 | 0.4925 | 0.9425 |
+ | 0.7038 | 1100 | 1.8524 | - | - | - | - |
+ | 0.7678 | 1200 | 1.6956 | - | - | - | - |
+ | 0.7997 | 1250 | - | 1.8004 | 0.8438 | 0.4283 | 0.9336 |
+ | 0.8317 | 1300 | 1.7519 | - | - | - | - |
+ | 0.8957 | 1400 | 1.7515 | - | - | - | - |
+ | 0.9597 | 1500 | 1.7288 | 1.7434 | 0.8352 | 0.5050 | 0.9428 |
+ | 1.0237 | 1600 | 1.533 | - | - | - | - |
+ | 1.0877 | 1700 | 1.2543 | - | - | - | - |
+ | 1.1196 | 1750 | - | 1.7109 | 0.8514 | 0.5299 | 0.9415 |
+ | 1.1516 | 1800 | 1.3201 | - | - | - | - |
+ | 1.2156 | 1900 | 1.3309 | - | - | - | - |
+ | 1.2796 | 2000 | 1.3256 | 1.7111 | 0.8528 | 0.5138 | 0.9393 |
+ | 1.3436 | 2100 | 1.2865 | - | - | - | - |
+ | 1.4075 | 2200 | 1.2659 | - | - | - | - |
+ | 1.4395 | 2250 | - | 1.7974 | 0.8468 | 0.5320 | 0.9390 |
+ | 1.4715 | 2300 | 1.2601 | - | - | - | - |
+ | 1.5355 | 2400 | 1.3337 | - | - | - | - |
+ | 1.5995 | 2500 | 1.3319 | 1.6922 | 0.8575 | 0.5399 | 0.9416 |
+ | 1.6635 | 2600 | 1.3232 | - | - | - | - |
+ | 1.7274 | 2700 | 1.3684 | - | - | - | - |
+ | 1.7594 | 2750 | - | 1.5772 | 0.8581 | 0.5592 | 0.9484 |
+ | 1.7914 | 2800 | 1.2706 | - | - | - | - |
+ | 1.8554 | 2900 | 1.3186 | - | - | - | - |
+ | 1.9194 | 3000 | 1.2336 | 1.5423 | 0.8656 | 0.5749 | 0.9433 |
+ | 1.9834 | 3100 | 1.2193 | - | - | - | - |
+ | 2.0473 | 3200 | 0.868 | - | - | - | - |
+ | 2.0793 | 3250 | - | 1.6575 | 0.8632 | 0.5735 | 0.9395 |
+ | 2.1113 | 3300 | 0.6411 | - | - | - | - |
+ | 2.1753 | 3400 | 0.7127 | - | - | - | - |
+ | 2.2393 | 3500 | 0.7044 | 1.5778 | 0.8718 | 0.5823 | 0.9387 |
+ | 2.3033 | 3600 | 0.6299 | - | - | - | - |
+ | 2.3672 | 3700 | 0.7162 | - | - | - | - |
+ | 2.3992 | 3750 | - | 1.6300 | 0.8595 | 0.5936 | 0.9414 |
+ | 2.4312 | 3800 | 0.6642 | - | - | - | - |
+ | 2.4952 | 3900 | 0.6902 | - | - | - | - |
+ | 2.5592 | 4000 | 0.7959 | 1.6070 | 0.8637 | 0.6006 | 0.9363 |
+ | 2.6232 | 4100 | 0.7588 | - | - | - | - |
+ | 2.6871 | 4200 | 0.6925 | - | - | - | - |
+ | 2.7191 | 4250 | - | 1.6787 | 0.8682 | 0.6006 | 0.9411 |
+ | 2.7511 | 4300 | 0.7226 | - | - | - | - |
+ | 2.8151 | 4400 | 0.7507 | - | - | - | - |
+ | 2.8791 | 4500 | 0.7563 | 1.6040 | 0.8658 | 0.6061 | 0.9416 |
+ | 2.9431 | 4600 | 0.7737 | - | - | - | - |
+ | 3.0070 | 4700 | 0.6525 | - | - | - | - |
+ | 3.0390 | 4750 | - | 1.6782 | 0.8652 | 0.5983 | 0.9401 |
+ | 3.0710 | 4800 | 0.3831 | - | - | - | - |
+ | 3.1350 | 4900 | 0.297 | - | - | - | - |
+ | 3.1990 | 5000 | 0.3725 | 1.7229 | 0.8588 | 0.6175 | 0.9418 |
+ | 3.2630 | 5100 | 0.4142 | - | - | - | - |
+ | 3.3269 | 5200 | 0.4415 | - | - | - | - |
+ | 3.3589 | 5250 | - | 1.6564 | 0.8635 | 0.6026 | 0.9379 |
+ | 3.3909 | 5300 | 0.3729 | - | - | - | - |
+ | 3.4549 | 5400 | 0.4164 | - | - | - | - |
+ | 3.5189 | 5500 | 0.3668 | 1.5964 | 0.8677 | 0.6105 | 0.9358 |
+ | 3.5829 | 5600 | 0.4184 | - | - | - | - |
+ | 3.6468 | 5700 | 0.4311 | - | - | - | - |
+ | 3.6788 | 5750 | - | 1.6523 | 0.8680 | 0.6130 | 0.9365 |
+ | 3.7108 | 5800 | 0.4222 | - | - | - | - |
+ | 3.7748 | 5900 | 0.4302 | - | - | - | - |
+ | 3.8388 | 6000 | 0.428 | 1.6625 | 0.8674 | 0.6163 | 0.9370 |
+ | 3.9028 | 6100 | 0.3898 | - | - | - | - |
+ | 3.9667 | 6200 | 0.4255 | - | - | - | - |
+ | 3.9987 | 6250 | - | 1.6145 | 0.8680 | 0.6118 | 0.9347 |
+ | 4.0307 | 6300 | 0.3456 | - | - | - | - |
+ | 4.0947 | 6400 | 0.2265 | - | - | - | - |
+ | 4.1587 | 6500 | 0.1913 | 1.7208 | 0.8595 | 0.6339 | 0.9433 |
+ | 4.2226 | 6600 | 0.2258 | - | - | - | - |
+ | 4.2866 | 6700 | 0.2484 | - | - | - | - |
+ | 4.3186 | 6750 | - | 1.6286 | 0.8600 | 0.6313 | 0.9394 |
+ | 4.3506 | 6800 | 0.1977 | - | - | - | - |
+ | 4.4146 | 6900 | 0.2013 | - | - | - | - |
+ | 4.4786 | 7000 | 0.2351 | 1.6910 | 0.8651 | 0.6193 | 0.9401 |
+ | 4.5425 | 7100 | 0.2356 | - | - | - | - |
+ | 4.6065 | 7200 | 0.2542 | - | - | - | - |
+ | 4.6385 | 7250 | - | 1.6955 | 0.8643 | 0.6129 | 0.9357 |
+ | 4.6705 | 7300 | 0.2592 | - | - | - | - |
+ | 4.7345 | 7400 | 0.2585 | - | - | - | - |
+ | 4.7985 | 7500 | 0.2375 | 1.7593 | 0.8647 | 0.6143 | 0.9325 |
+ | 4.8624 | 7600 | 0.2506 | - | - | - | - |
+ | 4.9264 | 7700 | 0.2394 | - | - | - | - |
+ | 4.9584 | 7750 | - | 1.6051 | 0.8720 | 0.6213 | 0.9350 |
+ | 4.9904 | 7800 | 0.2374 | - | - | - | - |
+ | 5.0544 | 7900 | 0.1675 | - | - | - | - |
+ | 5.1184 | 8000 | 0.131 | 1.5864 | 0.8673 | 0.6201 | 0.9377 |
+ | 5.1823 | 8100 | 0.1308 | - | - | - | - |
+ | 5.2463 | 8200 | 0.1483 | - | - | - | - |
+ | 5.2783 | 8250 | - | 1.5976 | 0.8698 | 0.6136 | 0.9359 |
+ | 5.3103 | 8300 | 0.1413 | - | - | - | - |
+ | 5.3743 | 8400 | 0.1392 | - | - | - | - |
+ | 5.4383 | 8500 | 0.1464 | 1.5980 | 0.8661 | 0.6267 | 0.9346 |
+ | 5.5022 | 8600 | 0.1781 | - | - | - | - |
+ | 5.5662 | 8700 | 0.151 | - | - | - | - |
+ | 5.5982 | 8750 | - | 1.5343 | 0.8756 | 0.6245 | 0.9352 |
+ | 5.6302 | 8800 | 0.1568 | - | - | - | - |
+ | 5.6942 | 8900 | 0.1702 | - | - | - | - |
+ | 5.7582 | 9000 | 0.1362 | 1.7121 | 0.8675 | 0.6230 | 0.9362 |
+ | 5.8221 | 9100 | 0.1371 | - | - | - | - |
+ | 5.8861 | 9200 | 0.1381 | - | - | - | - |
+ | 5.9181 | 9250 | - | 1.6326 | 0.8671 | 0.6122 | 0.9302 |
+ | 5.9501 | 9300 | 0.1691 | - | - | - | - |
+ | 6.0141 | 9400 | 0.1701 | - | - | - | - |
+ | 6.0781 | 9500 | 0.0935 | 1.5705 | 0.8709 | 0.6066 | 0.9293 |
+ | 6.1420 | 9600 | 0.0852 | - | - | - | - |
+ | 6.2060 | 9700 | 0.0874 | - | - | - | - |
+ | 6.2380 | 9750 | - | 1.5643 | 0.8724 | 0.6061 | 0.9307 |
+ | 6.2700 | 9800 | 0.0889 | - | - | - | - |
+ | 6.3340 | 9900 | 0.0972 | - | - | - | - |
+ | 6.3980 | 10000 | 0.1011 | 1.5622 | 0.8736 | 0.6153 | 0.9328 |
+ | 6.4619 | 10100 | 0.0962 | - | - | - | - |
+ | 6.5259 | 10200 | 0.1259 | - | - | - | - |
+ | 6.5579 | 10250 | - | 1.5406 | 0.8687 | 0.6293 | 0.9373 |
+ | 6.5899 | 10300 | 0.0925 | - | - | - | - |
+ | 6.6539 | 10400 | 0.1138 | - | - | - | - |
+ | 6.7179 | 10500 | 0.0788 | 1.5450 | 0.8658 | 0.6226 | 0.9349 |
+ | 6.7818 | 10600 | 0.1112 | - | - | - | - |
+ | 6.8458 | 10700 | 0.0922 | - | - | - | - |
+ | 6.8778 | 10750 | - | 1.5063 | 0.8736 | 0.6245 | 0.9370 |
+ | 6.9098 | 10800 | 0.1173 | - | - | - | - |
+ | 6.9738 | 10900 | 0.1141 | - | - | - | - |
+ | 7.0377 | 11000 | 0.0637 | 1.5007 | 0.8741 | 0.6270 | 0.9379 |
+ | 7.1017 | 11100 | 0.0713 | - | - | - | - |
+ | 7.1657 | 11200 | 0.0754 | - | - | - | - |
+ | 7.1977 | 11250 | - | 1.5081 | 0.8725 | 0.6273 | 0.9376 |
+ | 7.2297 | 11300 | 0.04 | - | - | - | - |
+ | 7.2937 | 11400 | 0.0695 | - | - | - | - |
+ | 7.3576 | 11500 | 0.034 | 1.5598 | 0.8710 | 0.6179 | 0.9350 |
+ | 7.4216 | 11600 | 0.0513 | - | - | - | - |
+ | 7.4856 | 11700 | 0.0749 | - | - | - | - |
+ | 7.5176 | 11750 | - | 1.6118 | 0.8694 | 0.6264 | 0.9380 |
+ | 7.5496 | 11800 | 0.0708 | - | - | - | - |
+ | 7.6136 | 11900 | 0.0939 | - | - | - | - |
+ | 7.6775 | 12000 | 0.059 | 1.6282 | 0.8708 | 0.6271 | 0.9354 |
+ | 7.7415 | 12100 | 0.0847 | - | - | - | - |
+ | 7.8055 | 12200 | 0.0521 | - | - | - | - |
+ | 7.8375 | 12250 | - | 1.5478 | 0.8683 | 0.6359 | 0.9388 |
+ | 7.8695 | 12300 | 0.0394 | - | - | - | - |
+ | 7.9335 | 12400 | 0.0619 | - | - | - | - |
+ | 7.9974 | 12500 | 0.0593 | 1.5440 | 0.8771 | 0.6387 | 0.9393 |
+ | 8.0614 | 12600 | 0.0292 | - | - | - | - |
+ | 8.1254 | 12700 | 0.0267 | - | - | - | - |
+ | 8.1574 | 12750 | - | 1.5419 | 0.8773 | 0.6290 | 0.9388 |
+ | 8.1894 | 12800 | 0.0334 | - | - | - | - |
+ | 8.2534 | 12900 | 0.05 | - | - | - | - |
+ | 8.3173 | 13000 | 0.0439 | 1.5589 | 0.8740 | 0.6322 | 0.9384 |
+ | 8.3813 | 13100 | 0.0409 | - | - | - | - |
+ | 8.4453 | 13200 | 0.03 | - | - | - | - |
+ | 8.4773 | 13250 | - | 1.5472 | 0.8730 | 0.6347 | 0.9398 |
+ | 8.5093 | 13300 | 0.0373 | - | - | - | - |
+ | 8.5733 | 13400 | 0.0404 | - | - | - | - |
+ | 8.6372 | 13500 | 0.0357 | 1.5332 | 0.8749 | 0.6327 | 0.9404 |
+ | 8.7012 | 13600 | 0.023 | - | - | - | - |
+ | 8.7652 | 13700 | 0.0256 | - | - | - | - |
+ | 8.7972 | 13750 | - | 1.5154 | 0.8781 | 0.6337 | 0.9379 |
+ | 8.8292 | 13800 | 0.0563 | - | - | - | - |
+ | 8.8932 | 13900 | 0.029 | - | - | - | - |
+ | 8.9571 | 14000 | 0.0395 | 1.5503 | 0.8771 | 0.6344 | 0.9390 |
+ | 9.0211 | 14100 | 0.0296 | - | - | - | - |
+ | 9.0851 | 14200 | 0.0308 | - | - | - | - |
+ | 9.1171 | 14250 | - | 1.5385 | 0.8771 | 0.6363 | 0.9391 |
+ | 9.1491 | 14300 | 0.035 | - | - | - | - |
+ | 9.2131 | 14400 | 0.0217 | - | - | - | - |
+ | 9.2770 | 14500 | 0.0192 | 1.5592 | 0.8777 | 0.6373 | 0.9393 |
+ | 9.3410 | 14600 | 0.0369 | - | - | - | - |
+ | 9.4050 | 14700 | 0.0186 | - | - | - | - |
+ | 9.4370 | 14750 | - | 1.5626 | 0.8771 | 0.6368 | 0.9389 |
+ | 9.4690 | 14800 | 0.0303 | - | - | - | - |
+ | 9.5329 | 14900 | 0.0181 | - | - | - | - |
+ | 9.5969 | 15000 | 0.0217 | 1.5466 | 0.8782 | 0.6387 | 0.9390 |
+ | 9.6609 | 15100 | 0.0463 | - | - | - | - |
+ | 9.7249 | 15200 | 0.0211 | - | - | - | - |
+ | 9.7569 | 15250 | - | 1.5440 | 0.8772 | 0.6401 | 0.9395 |
+ | 9.7889 | 15300 | 0.0216 | - | - | - | - |
+ | 9.8528 | 15400 | 0.0328 | - | - | - | - |
+ | 9.9168 | 15500 | 0.0154 | 1.5399 | 0.8773 | 0.6393 | 0.9388 |
+ | 9.9808 | 15600 | 0.0263 | - | - | - | - |

+ </details>

  ### Framework Versions
  - Python: 3.12.9
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c2f1edd3eb901cc4e858d3737bba31bc6509a8485a635fafd90cd21c7b74a8cc
+ oid sha256:ca632dc6a8785edfec19926fc59b3ac105f42543a7657f0164574f74e1ead708
  size 265462608