tomaarsen HF Staff committed on
Commit d5e08e2 · verified · 1 Parent(s): 8f03292

Add new SentenceTransformer model
0_StaticEmbedding/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:05608df8034bd53700b9f2187e453372ab7d0e1237d9b1e0e21af21c63382bda
+ size 131068000
0_StaticEmbedding/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
README.md ADDED
@@ -0,0 +1,1020 @@
+ ---
+ language:
+ - en
+ license: apache-2.0
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:3002496
+ - loss:MatryoshkaLoss
+ - loss:MultipleNegativesRankingLoss
+ widget:
+ - source_sentence: how to sign legal documents as power of attorney?
+   sentences:
+   - 'After the principal''s name, write “by” and then sign your own name. Under or after the signature line, indicate your status as POA by including any of the following identifiers: as POA, as Agent, as Attorney in Fact or as Power of Attorney.'
+   - '[''From the Home screen, swipe left to Apps.'', ''Tap Transfer my Data.'', ''Tap Menu (...).'', ''Tap Export to SD card.'']'
+   - Ginger Dank Nugs (Grape) - 350mg. Feast your eyes on these unique and striking gourmet chocolates; Coco Nugs created by Ginger Dank. Crafted to resemble perfect nugs of cannabis, each of the 10 buds contains 35mg of THC. ... This is a perfect product for both cannabis and chocolate lovers, who appreciate a little twist.
+ - source_sentence: how to delete vdom in fortigate?
+   sentences:
+   - Go to System -> VDOM -> VDOM2 and select 'Delete'. This VDOM is now successfully removed from the configuration.
+   - 'Both combination birth control pills and progestin-only pills may cause headaches as a side effect. Additional side effects of birth control pills may include: breast tenderness. nausea.'
+   - White cheese tends to show imperfections more readily and as consumers got more used to yellow-orange cheese, it became an expected option. Today, many cheddars are yellow. While most cheesemakers use annatto, some use an artificial coloring agent instead, according to Sachs.
+ - source_sentence: where are earthquakes most likely to occur on earth?
+   sentences:
+   - Zelle in the Bank of the America app is a fast, safe, and easy way to send and receive money with family and friends who have a bank account in the U.S., all with no fees. Money moves in minutes directly between accounts that are already enrolled with Zelle.
+   - It takes about 3 days for a spacecraft to reach the Moon. During that time a spacecraft travels at least 240,000 miles (386,400 kilometers) which is the distance between Earth and the Moon.
+   - Most earthquakes occur along the edge of the oceanic and continental plates. The earth's crust (the outer layer of the planet) is made up of several pieces, called plates. The plates under the oceans are called oceanic plates and the rest are continental plates.
+ - source_sentence: fix iphone is disabled connect to itunes without itunes?
+   sentences:
+   - To fix a disabled iPhone or iPad without iTunes, you have to erase your device. Click on the "Erase iPhone" option and confirm your selection. Wait for a while as the "Find My iPhone" feature will remotely erase your iOS device. Needless to say, it will also disable its lock.
+   - How Māui brought fire to the world. One evening, after eating a hearty meal, Māui lay beside his fire staring into the flames. ... In the middle of the night, while everyone was sleeping, Māui went from village to village and extinguished all the fires until not a single fire burned in the world.
+   - Angry Orchard makes a variety of year-round craft cider styles, including Angry Orchard Crisp Apple, a fruit-forward hard cider that balances the sweetness of culinary apples with dryness and bright acidity of bittersweet apples for a complex, refreshing taste.
+ - source_sentence: how to reverse a video on tiktok that's not yours?
+   sentences:
+   - '[''Tap "Effects" at the bottom of your screen — it\''s an icon that looks like a clock. Open the Effects menu. ... '', ''At the end of the new list that appears, tap "Time." Select "Time" at the end. ... '', ''Select "Reverse" — you\''ll then see a preview of your new, reversed video appear on the screen.'']'
+   - Franchise Facts Poke Bar has a franchise fee of up to $30,000, with a total initial investment range of $157,800 to $438,000. The initial cost of a franchise includes several fees -- Unlock this franchise to better understand the costs such as training and territory fees.
+   - Relative age is the age of a rock layer (or the fossils it contains) compared to other layers. It can be determined by looking at the position of rock layers. Absolute age is the numeric age of a layer of rocks or fossils. Absolute age can be determined by using radiometric dating.
+ datasets:
+ - sentence-transformers/gooaq
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ metrics:
+ - cosine_accuracy@1
+ - cosine_accuracy@3
+ - cosine_accuracy@5
+ - cosine_accuracy@10
+ - cosine_precision@1
+ - cosine_precision@3
+ - cosine_precision@5
+ - cosine_precision@10
+ - cosine_recall@1
+ - cosine_recall@3
+ - cosine_recall@5
+ - cosine_recall@10
+ - cosine_ndcg@10
+ - cosine_mrr@10
+ - cosine_map@100
+ co2_eq_emissions:
+   emissions: 9.99163724446148
+   energy_consumed: 0.025705134639033192
+   source: codecarbon
+   training_type: fine-tuning
+   on_cloud: false
+   cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
+   ram_total_size: 31.777088165283203
+   hours_used: 0.173
+   hardware_used: 1 x NVIDIA GeForce RTX 3090
+ model-index:
+ - name: Static Embeddings with BEE-spoke-data/wordpiece-tokenizer-32k-en_code-msp tokenizer finetuned on GooAQ pairs
+   results:
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: gooaq 1024 dev
+       type: gooaq-1024-dev
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6374
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.8431
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.9006
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.9474
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.6374
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.2810333333333333
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.18012000000000003
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.09474000000000002
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.6374
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.8431
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.9006
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.9474
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.7978862529985916
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.749247539682533
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.7516276441525049
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: gooaq 512 dev
+       type: gooaq-512-dev
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6324
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.8382
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.899
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.9452
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.6324
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.2794
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.17980000000000002
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.09452000000000002
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.6324
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.8382
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.899
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.9452
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.7945424013494014
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.7454914682539613
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.7479805815427163
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: gooaq 256 dev
+       type: gooaq-256-dev
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6273
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.8299
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.8919
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.9414
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.6273
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.2766333333333333
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.17838
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.09414000000000002
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.6273
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.8299
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.8919
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.9414
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.7887635975389343
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.7392410317460252
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.7417892083520236
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: gooaq 128 dev
+       type: gooaq-128-dev
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6009
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.8112
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.874
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.9296
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.6009
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.2704
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.1748
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.09296000000000001
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.6009
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.8112
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.874
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.9296
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.7687659876727208
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.7167269444444387
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.7196420328190362
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: gooaq 64 dev
+       type: gooaq-64-dev
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.5589
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.7641
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.8367
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.8995
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.5589
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.2547
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.16734000000000002
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.08995
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.5589
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.7641
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.8367
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.8995
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.729110593943954
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.6743490476190415
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.6781847101023228
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: gooaq 32 dev
+       type: gooaq-32-dev
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.4745
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.6696
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.7508
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.8302
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.4745
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.2232
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.15016
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.08302000000000001
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.4745
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.6696
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.7508
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.8302
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.6483028649168259
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.5904344444444417
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.596236275120416
+       name: Cosine Map@100
+ ---
424
+
+ # Static Embeddings with BEE-spoke-data/wordpiece-tokenizer-32k-en_code-msp tokenizer finetuned on GooAQ pairs
+
+ This is a [sentence-transformers](https://www.SBERT.net) model trained on the [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ <!-- - **Base model:** [Unknown](https://huggingface.co/unknown) -->
+ - **Maximum Sequence Length:** inf tokens
+ - **Output Dimensionality:** 1024 dimensions
+ - **Similarity Function:** Cosine Similarity
+ - **Training Dataset:**
+     - [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq)
+ - **Language:** en
+ - **License:** apache-2.0
+
+ ### Model Sources
+
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+
+ ### Full Model Architecture
+
+ ```
+ SentenceTransformer(
+   (0): StaticEmbedding(
+     (embedding): EmbeddingBag(31999, 1024, mode='mean')
+   )
+ )
+ ```
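The `EmbeddingBag(..., mode='mean')` layer above means a sentence embedding is simply the average of its token embeddings: there is no attention or positional information, which is why static models are so fast. A minimal NumPy sketch of that pooling, with a random stand-in table rather than this model's actual weights:

```python
import numpy as np

# Random stand-in for the model's (31999, 1024) embedding table.
vocab_size, dim = 31999, 1024
rng = np.random.default_rng(0)
embedding_table = rng.standard_normal((vocab_size, dim)).astype(np.float32)

def encode(token_ids):
    """Mean-pool the embedding vectors of the given token IDs,
    mimicking EmbeddingBag(mode='mean')."""
    return embedding_table[token_ids].mean(axis=0)

# Hypothetical token IDs; the real model gets these from its WordPiece tokenizer.
sentence_embedding = encode([12, 7, 4051, 30000])
print(sentence_embedding.shape)  # (1024,)
```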
457
+
+ ## Usage
+
+ ### Direct Usage (Sentence Transformers)
+
+ First install the Sentence Transformers library:
+
+ ```bash
+ pip install -U sentence-transformers
+ ```
+
+ Then you can load this model and run inference.
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Download from the 🤗 Hub
+ model = SentenceTransformer("tomaarsen/static-BEE-spoke-data-tokenizer-v1-gooaq")
+ # Run inference
+ sentences = [
+     "how to reverse a video on tiktok that's not yours?",
+     '[\'Tap "Effects" at the bottom of your screen — it\\\'s an icon that looks like a clock. Open the Effects menu. ... \', \'At the end of the new list that appears, tap "Time." Select "Time" at the end. ... \', \'Select "Reverse" — you\\\'ll then see a preview of your new, reversed video appear on the screen.\']',
+     'Relative age is the age of a rock layer (or the fossils it contains) compared to other layers. It can be determined by looking at the position of rock layers. Absolute age is the numeric age of a layer of rocks or fossils. Absolute age can be determined by using radiometric dating.',
+ ]
+ embeddings = model.encode(sentences)
+ print(embeddings.shape)
+ # [3, 1024]
+
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(embeddings, embeddings)
+ print(similarities.shape)
+ # [3, 3]
+ ```
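Because the model was trained with MatryoshkaLoss, the 1024-dimensional embeddings can also be truncated to 512, 256, 128, 64, or 32 dimensions and re-normalized, trading a little retrieval quality for less memory and faster search; Sentence Transformers does this for you via the `truncate_dim` argument. As a sketch of what the truncation itself amounts to, using random stand-in embeddings instead of real `model.encode(...)` output:

```python
import numpy as np

def truncate_and_normalize(embeddings, dim):
    """Keep the first `dim` Matryoshka dimensions and re-normalize to unit length."""
    truncated = embeddings[:, :dim]
    return truncated / np.linalg.norm(truncated, axis=1, keepdims=True)

# Stand-in for embeddings = model.encode(sentences)
rng = np.random.default_rng(0)
full = rng.standard_normal((3, 1024)).astype(np.float32)

small = truncate_and_normalize(full, 256)
similarities = small @ small.T  # cosine similarity, since rows are unit vectors
print(small.shape)         # (3, 256)
print(similarities.shape)  # (3, 3)
```

With the real model you can get the same effect directly, e.g. `SentenceTransformer("tomaarsen/static-BEE-spoke-data-tokenizer-v1-gooaq", truncate_dim=256)`.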
489
+
+ <!--
+ ### Direct Usage (Transformers)
+
+ <details><summary>Click to see the direct usage in Transformers</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Downstream Usage (Sentence Transformers)
+
+ You can finetune this model on your own dataset.
+
+ <details><summary>Click to expand</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Out-of-Scope Use
+
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+
+ ## Evaluation
+
+ ### Metrics
+
+ #### Information Retrieval
+
+ * Dataset: `gooaq-1024-dev`
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
+   ```json
+   {
+       "truncate_dim": 1024
+   }
+   ```
+
+ | Metric              | Value      |
+ |:--------------------|:-----------|
+ | cosine_accuracy@1   | 0.6374     |
+ | cosine_accuracy@3   | 0.8431     |
+ | cosine_accuracy@5   | 0.9006     |
+ | cosine_accuracy@10  | 0.9474     |
+ | cosine_precision@1  | 0.6374     |
+ | cosine_precision@3  | 0.281      |
+ | cosine_precision@5  | 0.1801     |
+ | cosine_precision@10 | 0.0947     |
+ | cosine_recall@1     | 0.6374     |
+ | cosine_recall@3     | 0.8431     |
+ | cosine_recall@5     | 0.9006     |
+ | cosine_recall@10    | 0.9474     |
+ | **cosine_ndcg@10**  | **0.7979** |
+ | cosine_mrr@10       | 0.7492     |
+ | cosine_map@100      | 0.7516     |
+
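`cosine_ndcg@10` is the headline metric in these tables. Since GooAQ pairs each question with exactly one relevant answer, NDCG@10 reduces to a simple rank discount per query, averaged over all queries. A toy illustration of that reduction (not the evaluator's actual internals):

```python
import numpy as np

def ndcg_at_10(rank_of_positive):
    """NDCG@10 for a query with exactly one relevant document.

    `rank_of_positive` is the 1-based rank of the correct answer in the
    retrieved list, or None if it falls outside the top 10.
    """
    if rank_of_positive is None or rank_of_positive > 10:
        return 0.0
    # DCG of one relevant hit at that rank; the ideal DCG is a hit at rank 1,
    # which has gain 1 / log2(1 + 1) = 1, so no further normalization is needed.
    return 1.0 / np.log2(rank_of_positive + 1)

print(ndcg_at_10(1))  # 1.0: correct answer ranked first
print(ndcg_at_10(3))  # 0.5: discounted by 1 / log2(4)
```

The dataset-level value (e.g. 0.7979 at 1024 dimensions) is the mean of this per-query score over the 10,000 dev questions.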
546
+ #### Information Retrieval
+
+ * Dataset: `gooaq-512-dev`
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
+   ```json
+   {
+       "truncate_dim": 512
+   }
+   ```
+
+ | Metric              | Value      |
+ |:--------------------|:-----------|
+ | cosine_accuracy@1   | 0.6324     |
+ | cosine_accuracy@3   | 0.8382     |
+ | cosine_accuracy@5   | 0.899      |
+ | cosine_accuracy@10  | 0.9452     |
+ | cosine_precision@1  | 0.6324     |
+ | cosine_precision@3  | 0.2794     |
+ | cosine_precision@5  | 0.1798     |
+ | cosine_precision@10 | 0.0945     |
+ | cosine_recall@1     | 0.6324     |
+ | cosine_recall@3     | 0.8382     |
+ | cosine_recall@5     | 0.899      |
+ | cosine_recall@10    | 0.9452     |
+ | **cosine_ndcg@10**  | **0.7945** |
+ | cosine_mrr@10       | 0.7455     |
+ | cosine_map@100      | 0.748      |
+
+ #### Information Retrieval
+
+ * Dataset: `gooaq-256-dev`
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
+   ```json
+   {
+       "truncate_dim": 256
+   }
+   ```
+
+ | Metric              | Value      |
+ |:--------------------|:-----------|
+ | cosine_accuracy@1   | 0.6273     |
+ | cosine_accuracy@3   | 0.8299     |
+ | cosine_accuracy@5   | 0.8919     |
+ | cosine_accuracy@10  | 0.9414     |
+ | cosine_precision@1  | 0.6273     |
+ | cosine_precision@3  | 0.2766     |
+ | cosine_precision@5  | 0.1784     |
+ | cosine_precision@10 | 0.0941     |
+ | cosine_recall@1     | 0.6273     |
+ | cosine_recall@3     | 0.8299     |
+ | cosine_recall@5     | 0.8919     |
+ | cosine_recall@10    | 0.9414     |
+ | **cosine_ndcg@10**  | **0.7888** |
+ | cosine_mrr@10       | 0.7392     |
+ | cosine_map@100      | 0.7418     |
+
+ #### Information Retrieval
+
+ * Dataset: `gooaq-128-dev`
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
+   ```json
+   {
+       "truncate_dim": 128
+   }
+   ```
+
+ | Metric              | Value      |
+ |:--------------------|:-----------|
+ | cosine_accuracy@1   | 0.6009     |
+ | cosine_accuracy@3   | 0.8112     |
+ | cosine_accuracy@5   | 0.874      |
+ | cosine_accuracy@10  | 0.9296     |
+ | cosine_precision@1  | 0.6009     |
+ | cosine_precision@3  | 0.2704     |
+ | cosine_precision@5  | 0.1748     |
+ | cosine_precision@10 | 0.093      |
+ | cosine_recall@1     | 0.6009     |
+ | cosine_recall@3     | 0.8112     |
+ | cosine_recall@5     | 0.874      |
+ | cosine_recall@10    | 0.9296     |
+ | **cosine_ndcg@10**  | **0.7688** |
+ | cosine_mrr@10       | 0.7167     |
+ | cosine_map@100      | 0.7196     |
+
+ #### Information Retrieval
+
+ * Dataset: `gooaq-64-dev`
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
+   ```json
+   {
+       "truncate_dim": 64
+   }
+   ```
+
+ | Metric              | Value      |
+ |:--------------------|:-----------|
+ | cosine_accuracy@1   | 0.5589     |
+ | cosine_accuracy@3   | 0.7641     |
+ | cosine_accuracy@5   | 0.8367     |
+ | cosine_accuracy@10  | 0.8995     |
+ | cosine_precision@1  | 0.5589     |
+ | cosine_precision@3  | 0.2547     |
+ | cosine_precision@5  | 0.1673     |
+ | cosine_precision@10 | 0.09       |
+ | cosine_recall@1     | 0.5589     |
+ | cosine_recall@3     | 0.7641     |
+ | cosine_recall@5     | 0.8367     |
+ | cosine_recall@10    | 0.8995     |
+ | **cosine_ndcg@10**  | **0.7291** |
+ | cosine_mrr@10       | 0.6743     |
+ | cosine_map@100      | 0.6782     |
+
+ #### Information Retrieval
+
+ * Dataset: `gooaq-32-dev`
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
+   ```json
+   {
+       "truncate_dim": 32
+   }
+   ```
+
+ | Metric              | Value      |
+ |:--------------------|:-----------|
+ | cosine_accuracy@1   | 0.4745     |
+ | cosine_accuracy@3   | 0.6696     |
+ | cosine_accuracy@5   | 0.7508     |
+ | cosine_accuracy@10  | 0.8302     |
+ | cosine_precision@1  | 0.4745     |
+ | cosine_precision@3  | 0.2232     |
+ | cosine_precision@5  | 0.1502     |
+ | cosine_precision@10 | 0.083      |
+ | cosine_recall@1     | 0.4745     |
+ | cosine_recall@3     | 0.6696     |
+ | cosine_recall@5     | 0.7508     |
+ | cosine_recall@10    | 0.8302     |
+ | **cosine_ndcg@10**  | **0.6483** |
+ | cosine_mrr@10       | 0.5904     |
+ | cosine_map@100      | 0.5962     |
+
686
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Dataset
+
+ #### gooaq
+
+ * Dataset: [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
+ * Size: 3,002,496 training samples
+ * Columns: <code>question</code> and <code>answer</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | question                                                                                        | answer                                                                                            |
+   |:--------|:------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|
+   | type    | string                                                                                           | string                                                                                             |
+   | details | <ul><li>min: 18 characters</li><li>mean: 43.23 characters</li><li>max: 96 characters</li></ul>   | <ul><li>min: 55 characters</li><li>mean: 253.36 characters</li><li>max: 371 characters</li></ul>   |
+ * Samples:
+   | question | answer |
+   |:---------|:-------|
+   | <code>what is the difference between broilers and layers?</code> | <code>An egg laying poultry is called egger or layer whereas broilers are reared for obtaining meat. So a layer should be able to produce more number of large sized eggs, without growing too much. On the other hand, a broiler should yield more meat and hence should be able to grow well.</code> |
+   | <code>what is the difference between chronological order and spatial order?</code> | <code>As a writer, you should always remember that unlike chronological order and the other organizational methods for data, spatial order does not take into account the time. Spatial order is primarily focused on the location. All it does is take into account the location of objects and not the time.</code> |
+   | <code>is kamagra same as viagra?</code> | <code>Kamagra is thought to contain the same active ingredient as Viagra, sildenafil citrate. In theory, it should work in much the same way as Viagra, taking about 45 minutes to take effect, and lasting for around 4-6 hours. However, this will vary from person to person.</code> |
+ * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
+   ```json
+   {
+       "loss": "MultipleNegativesRankingLoss",
+       "matryoshka_dims": [
+           1024,
+           512,
+           256,
+           128,
+           64,
+           32
+       ],
+       "matryoshka_weights": [
+           1,
+           1,
+           1,
+           1,
+           1,
+           1
+       ],
+       "n_dims_per_step": -1
+   }
+   ```
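The wrapped `MultipleNegativesRankingLoss` treats each question's own answer as the positive and every other answer in the batch as a negative, i.e. cross-entropy over a scaled similarity matrix; MatryoshkaLoss then applies it once per dimension in `matryoshka_dims`. A minimal NumPy sketch of the inner loss, using random stand-in embeddings (not the library's implementation):

```python
import numpy as np

def mnr_loss(query_emb, answer_emb, scale=20.0):
    """In-batch-negatives loss: cross-entropy over scaled cosine similarities,
    where the correct (question, answer) pairs lie on the diagonal."""
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    a = answer_emb / np.linalg.norm(answer_emb, axis=1, keepdims=True)
    scores = scale * (q @ a.T)                       # (batch, batch)
    scores -= scores.max(axis=1, keepdims=True)      # numerical stability
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))              # -log P(correct answer)

rng = np.random.default_rng(0)
queries = rng.standard_normal((8, 64))
answers = queries + 0.1 * rng.standard_normal((8, 64))  # positives close to queries

loss_matched = mnr_loss(queries, answers)
loss_random = mnr_loss(queries, rng.standard_normal((8, 64)))
print(loss_matched, loss_random)
```

Matched pairs should score a far lower loss than random pairings, which is exactly the signal that pulls questions toward their answers during training.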
741
+
742
+ ### Evaluation Dataset
743
+
744
+ #### gooaq
745
+
746
+ * Dataset: [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
747
+ * Size: 10,000 evaluation samples
748
+ * Columns: <code>question</code> and <code>answer</code>
749
+ * Approximate statistics based on the first 1000 samples:
+ | | question | answer |
+ |:--------|:-----------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|
+ | type | string | string |
+ | details | <ul><li>min: 18 characters</li><li>mean: 43.17 characters</li><li>max: 98 characters</li></ul> | <ul><li>min: 51 characters</li><li>mean: 254.12 characters</li><li>max: 360 characters</li></ul> |
+ * Samples:
+ | question | answer |
+ |:-----------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+ | <code>how do i program my directv remote with my tv?</code> | <code>['Press MENU on your remote.', 'Select Settings & Help > Settings > Remote Control > Program Remote.', 'Choose the device (TV, audio, DVD) you wish to program. ... ', 'Follow the on-screen prompts to complete programming.']</code> |
+ | <code>are rodrigues fruit bats nocturnal?</code> | <code>Before its numbers were threatened by habitat destruction, storms, and hunting, some of those groups could number 500 or more members. Sunrise, sunset. Rodrigues fruit bats are most active at dawn, at dusk, and at night.</code> |
+ | <code>why does your heart rate increase during exercise bbc bitesize?</code> | <code>During exercise there is an increase in physical activity and muscle cells respire more than they do when the body is at rest. The heart rate increases during exercise. The rate and depth of breathing increases - this makes sure that more oxygen is absorbed into the blood, and more carbon dioxide is removed from it.</code> |
+ * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
+ ```json
+ {
+     "loss": "MultipleNegativesRankingLoss",
+     "matryoshka_dims": [
+         1024,
+         512,
+         256,
+         128,
+         64,
+         32
+     ],
+     "matryoshka_weights": [
+         1,
+         1,
+         1,
+         1,
+         1,
+         1
+     ],
+     "n_dims_per_step": -1
+ }
+ ```
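The `matryoshka_dims` above mean the loss supervises prefixes of the full 1024-dimensional embedding, so downstream users can truncate embeddings to 512, 256, 128, 64, or 32 dimensions. A minimal stdlib sketch of that truncate-and-renormalize step (illustrative only; in practice `sentence_transformers` exposes this via the `truncate_dim` argument):

```python
import math

# Matryoshka dimensions from the loss config above; each smaller dim is a
# prefix of the full 1024-dimensional embedding.
MATRYOSHKA_DIMS = [1024, 512, 256, 128, 64, 32]

def truncate_and_normalize(embedding, dim):
    """Keep the first `dim` values and re-normalize, so cosine similarity
    at the smaller dimensionality reduces to a plain dot product."""
    head = embedding[:dim]
    norm = math.sqrt(sum(x * x for x in head)) or 1.0
    return [x / norm for x in head]

# Toy 4-d "embedding" truncated to 2 dims (illustration only).
vec = [3.0, 4.0, 1.0, 2.0]
small = truncate_and_normalize(vec, 2)
print(small)  # [0.6, 0.8]
```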
+ 
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+ 
+ - `eval_strategy`: steps
+ - `per_device_train_batch_size`: 2048
+ - `per_device_eval_batch_size`: 2048
+ - `learning_rate`: 0.2
+ - `num_train_epochs`: 1
+ - `warmup_ratio`: 0.1
+ - `bf16`: True
+ - `batch_sampler`: no_duplicates
+ 
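With `warmup_ratio: 0.1` and `lr_scheduler_type: linear`, the learning rate ramps from 0 up to 0.2 over the first 10% of steps, then decays linearly back to 0. A rough stdlib sketch of that schedule (an approximation, not the exact Hugging Face scheduler implementation):

```python
def linear_schedule_lr(step, total_steps, base_lr=0.2, warmup_ratio=0.1):
    """Linear warmup to base_lr over the first warmup_ratio of training,
    then linear decay to 0 (approximation of the HF linear scheduler)."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# With 1000 total steps the peak (base) learning rate is hit at step 100.
print(round(linear_schedule_lr(100, 1000), 3))  # 0.2
```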
+ #### All Hyperparameters
+ <details><summary>Click to expand</summary>
+ 
+ - `overwrite_output_dir`: False
+ - `do_predict`: False
+ - `eval_strategy`: steps
+ - `prediction_loss_only`: True
+ - `per_device_train_batch_size`: 2048
+ - `per_device_eval_batch_size`: 2048
+ - `per_gpu_train_batch_size`: None
+ - `per_gpu_eval_batch_size`: None
+ - `gradient_accumulation_steps`: 1
+ - `eval_accumulation_steps`: None
+ - `torch_empty_cache_steps`: None
+ - `learning_rate`: 0.2
+ - `weight_decay`: 0.0
+ - `adam_beta1`: 0.9
+ - `adam_beta2`: 0.999
+ - `adam_epsilon`: 1e-08
+ - `max_grad_norm`: 1.0
+ - `num_train_epochs`: 1
+ - `max_steps`: -1
+ - `lr_scheduler_type`: linear
+ - `lr_scheduler_kwargs`: {}
+ - `warmup_ratio`: 0.1
+ - `warmup_steps`: 0
+ - `log_level`: passive
+ - `log_level_replica`: warning
+ - `log_on_each_node`: True
+ - `logging_nan_inf_filter`: True
+ - `save_safetensors`: True
+ - `save_on_each_node`: False
+ - `save_only_model`: False
+ - `restore_callback_states_from_checkpoint`: False
+ - `no_cuda`: False
+ - `use_cpu`: False
+ - `use_mps_device`: False
+ - `seed`: 42
+ - `data_seed`: None
+ - `jit_mode_eval`: False
+ - `use_ipex`: False
+ - `bf16`: True
+ - `fp16`: False
+ - `fp16_opt_level`: O1
+ - `half_precision_backend`: auto
+ - `bf16_full_eval`: False
+ - `fp16_full_eval`: False
+ - `tf32`: None
+ - `local_rank`: 0
+ - `ddp_backend`: None
+ - `tpu_num_cores`: None
+ - `tpu_metrics_debug`: False
+ - `debug`: []
+ - `dataloader_drop_last`: False
+ - `dataloader_num_workers`: 0
+ - `dataloader_prefetch_factor`: None
+ - `past_index`: -1
+ - `disable_tqdm`: False
+ - `remove_unused_columns`: True
+ - `label_names`: None
+ - `load_best_model_at_end`: False
+ - `ignore_data_skip`: False
+ - `fsdp`: []
+ - `fsdp_min_num_params`: 0
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+ - `fsdp_transformer_layer_cls_to_wrap`: None
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+ - `deepspeed`: None
+ - `label_smoothing_factor`: 0.0
+ - `optim`: adamw_torch
+ - `optim_args`: None
+ - `adafactor`: False
+ - `group_by_length`: False
+ - `length_column_name`: length
+ - `ddp_find_unused_parameters`: None
+ - `ddp_bucket_cap_mb`: None
+ - `ddp_broadcast_buffers`: False
+ - `dataloader_pin_memory`: True
+ - `dataloader_persistent_workers`: False
+ - `skip_memory_metrics`: True
+ - `use_legacy_prediction_loop`: False
+ - `push_to_hub`: False
+ - `resume_from_checkpoint`: None
+ - `hub_model_id`: None
+ - `hub_strategy`: every_save
+ - `hub_private_repo`: None
+ - `hub_always_push`: False
+ - `gradient_checkpointing`: False
+ - `gradient_checkpointing_kwargs`: None
+ - `include_inputs_for_metrics`: False
+ - `include_for_metrics`: []
+ - `eval_do_concat_batches`: True
+ - `fp16_backend`: auto
+ - `push_to_hub_model_id`: None
+ - `push_to_hub_organization`: None
+ - `mp_parameters`: 
+ - `auto_find_batch_size`: False
+ - `full_determinism`: False
+ - `torchdynamo`: None
+ - `ray_scope`: last
+ - `ddp_timeout`: 1800
+ - `torch_compile`: False
+ - `torch_compile_backend`: None
+ - `torch_compile_mode`: None
+ - `dispatch_batches`: None
+ - `split_batches`: None
+ - `include_tokens_per_second`: False
+ - `include_num_input_tokens_seen`: False
+ - `neftune_noise_alpha`: None
+ - `optim_target_modules`: None
+ - `batch_eval_metrics`: False
+ - `eval_on_start`: False
+ - `use_liger_kernel`: False
+ - `eval_use_gather_object`: False
+ - `average_tokens_across_devices`: False
+ - `prompts`: None
+ - `batch_sampler`: no_duplicates
+ - `multi_dataset_batch_sampler`: proportional
+ 
+ </details>
916
+
917
+ ### Training Logs
918
+ | Epoch | Step | Training Loss | Validation Loss | gooaq-1024-dev_cosine_ndcg@10 | gooaq-512-dev_cosine_ndcg@10 | gooaq-256-dev_cosine_ndcg@10 | gooaq-128-dev_cosine_ndcg@10 | gooaq-64-dev_cosine_ndcg@10 | gooaq-32-dev_cosine_ndcg@10 |
919
+ |:------:|:----:|:-------------:|:---------------:|:-----------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:---------------------------:|:---------------------------:|
920
+ | -1 | -1 | - | - | 0.2283 | 0.2131 | 0.1847 | 0.1395 | 0.0746 | 0.0334 |
921
+ | 0.0007 | 1 | 44.3995 | - | - | - | - | - | - | - |
922
+ | 0.0682 | 100 | 20.4944 | - | - | - | - | - | - | - |
923
+ | 0.1363 | 200 | 5.7432 | - | - | - | - | - | - | - |
924
+ | 0.1704 | 250 | - | 1.6135 | 0.7337 | 0.7307 | 0.7204 | 0.7006 | 0.6527 | 0.5522 |
925
+ | 0.2045 | 300 | 4.6818 | - | - | - | - | - | - | - |
926
+ | 0.2727 | 400 | 4.237 | - | - | - | - | - | - | - |
927
+ | 0.3408 | 500 | 3.9465 | 1.3375 | 0.7628 | 0.7601 | 0.7544 | 0.7340 | 0.6917 | 0.6024 |
928
+ | 0.4090 | 600 | 3.724 | - | - | - | - | - | - | - |
929
+ | 0.4772 | 700 | 3.5496 | - | - | - | - | - | - | - |
930
+ | 0.5112 | 750 | - | 1.2214 | 0.7782 | 0.7764 | 0.7676 | 0.7492 | 0.7075 | 0.6208 |
931
+ | 0.5453 | 800 | 3.4443 | - | - | - | - | - | - | - |
932
+ | 0.6135 | 900 | 3.3312 | - | - | - | - | - | - | - |
933
+ | 0.6817 | 1000 | 3.2537 | 1.1280 | 0.7877 | 0.7841 | 0.7768 | 0.7582 | 0.7195 | 0.6336 |
934
+ | 0.7498 | 1100 | 3.1613 | - | - | - | - | - | - | - |
935
+ | 0.8180 | 1200 | 3.0985 | - | - | - | - | - | - | - |
936
+ | 0.8521 | 1250 | - | 1.0693 | 0.7955 | 0.7922 | 0.7858 | 0.7663 | 0.7267 | 0.6434 |
937
+ | 0.8862 | 1300 | 3.0416 | - | - | - | - | - | - | - |
938
+ | 0.9543 | 1400 | 3.0249 | - | - | - | - | - | - | - |
939
+ | -1 | -1 | - | - | 0.7979 | 0.7945 | 0.7888 | 0.7688 | 0.7291 | 0.6483 |
940
+
941
+
942
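The `cosine_ndcg@10` columns above report normalized discounted cumulative gain over the top 10 cosine-ranked candidates. A minimal sketch of NDCG@10 for a single query with binary relevance (illustrative; not the evaluator's exact code):

```python
import math

def ndcg_at_k(ranked_relevances, k=10):
    """NDCG@k for one query. ranked_relevances[i] is the (binary) relevance
    of the document ranked at position i by the model."""
    def dcg(rels):
        # Log-discounted gain: position 0 is divided by log2(2), etc.
        return sum(rel / math.log2(i + 2) for i, rel in enumerate(rels[:k]))
    ideal = dcg(sorted(ranked_relevances, reverse=True))
    return dcg(ranked_relevances) / ideal if ideal > 0 else 0.0

# Single relevant document ranked 2nd among 10 candidates:
print(round(ndcg_at_k([0, 1, 0, 0, 0, 0, 0, 0, 0, 0]), 4))  # 0.6309
```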
+ ### Environmental Impact
+ Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).
+ - **Energy Consumed**: 0.026 kWh
+ - **Carbon Emitted**: 0.010 kg of CO2
+ - **Hours Used**: 0.173 hours
+ 
+ ### Training Hardware
+ - **On Cloud**: No
+ - **GPU Model**: 1 x NVIDIA GeForce RTX 3090
+ - **CPU Model**: 13th Gen Intel(R) Core(TM) i7-13700K
+ - **RAM Size**: 31.78 GB
+ 
+ ### Framework Versions
+ - Python: 3.11.6
+ - Sentence Transformers: 4.1.0.dev0
+ - Transformers: 4.49.0
+ - PyTorch: 2.6.0+cu124
+ - Accelerate: 1.5.1
+ - Datasets: 3.3.2
+ - Tokenizers: 0.21.0
+ 
+ ## Citation
+ 
+ ### BibTeX
+ 
+ #### Sentence Transformers
+ ```bibtex
+ @inproceedings{reimers-2019-sentence-bert,
+     title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+     author = "Reimers, Nils and Gurevych, Iryna",
+     booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+     month = "11",
+     year = "2019",
+     publisher = "Association for Computational Linguistics",
+     url = "https://arxiv.org/abs/1908.10084",
+ }
+ ```
+ 
+ #### MatryoshkaLoss
+ ```bibtex
+ @misc{kusupati2024matryoshka,
+     title={Matryoshka Representation Learning},
+     author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
+     year={2024},
+     eprint={2205.13147},
+     archivePrefix={arXiv},
+     primaryClass={cs.LG}
+ }
+ ```
+ 
+ #### MultipleNegativesRankingLoss
+ ```bibtex
+ @misc{henderson2017efficient,
+     title={Efficient Natural Language Response Suggestion for Smart Reply},
+     author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
+     year={2017},
+     eprint={1705.00652},
+     archivePrefix={arXiv},
+     primaryClass={cs.CL}
+ }
+ ```
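The MultipleNegativesRankingLoss cited above treats every other answer in the batch as a negative: each question should score highest against its own answer under a cross-entropy over the batch. A toy stdlib sketch operating on a precomputed (already scaled) similarity matrix, where row i's positive is column i (illustrative only, not the library implementation):

```python
import math

def mnrl(sim_matrix):
    """Mean in-batch cross-entropy: for each row, the diagonal entry is
    the positive pair and all other columns are in-batch negatives."""
    total = 0.0
    for i, row in enumerate(sim_matrix):
        log_z = math.log(sum(math.exp(s) for s in row))  # log softmax denominator
        total += log_z - row[i]                          # -log p(positive)
    return total / len(sim_matrix)

# 2 questions x 2 answers; the diagonal entries are the true pairs, so the
# loss is small when diagonal similarities dominate their rows.
sims = [[4.0, 0.0],
        [0.5, 3.5]]
print(round(mnrl(sims), 4))
```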
+ 
+ <!--
+ ## Glossary
+ 
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+ 
+ <!--
+ ## Model Card Authors
+ 
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+ 
+ <!--
+ ## Model Card Contact
+ 
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+     "__version__": {
+         "sentence_transformers": "4.1.0.dev0",
+         "transformers": "4.49.0",
+         "pytorch": "2.6.0+cu124"
+     },
+     "prompts": {},
+     "default_prompt_name": null,
+     "similarity_fn_name": "cosine"
+ }
modules.json ADDED
@@ -0,0 +1,8 @@
+ [
+     {
+         "idx": 0,
+         "name": "0",
+         "path": "0_StaticEmbedding",
+         "type": "sentence_transformers.models.StaticEmbedding"
+     }
+ ]