Jrinky committed · verified
Commit 20e43fa · 1 Parent(s): 9c8f5b0

Add new SentenceTransformer model
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "word_embedding_dimension": 1024,
+   "pooling_mode_cls_token": true,
+   "pooling_mode_mean_tokens": false,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false,
+   "pooling_mode_weightedmean_tokens": false,
+   "pooling_mode_lasttoken": false,
+   "include_prompt": true
+ }
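The flags above select CLS-token pooling: of all token embeddings the encoder produces, only the first ([CLS]) vector is kept as the sentence embedding, and this model then L2-normalizes it (the `Normalize` module). A minimal NumPy sketch of that pooling step, with illustrative shapes rather than the library's internals:

```python
import numpy as np

def cls_pooling(token_embeddings: np.ndarray) -> np.ndarray:
    """Select the first ([CLS]) token vector per sequence.

    token_embeddings: (batch, seq_len, dim) encoder outputs.
    Returns: (batch, dim) sentence embeddings.
    """
    return token_embeddings[:, 0, :]

def l2_normalize(x: np.ndarray) -> np.ndarray:
    # Mirrors the Normalize() module: unit-length rows, so a
    # dot product of two embeddings equals their cosine similarity.
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# Toy batch: 2 sequences of 4 tokens, dim 1024 as in this config
tokens = np.random.default_rng(0).normal(size=(2, 4, 1024))
emb = l2_normalize(cls_pooling(tokens))
print(emb.shape)  # (2, 1024)
```

Because `pooling_mode_mean_tokens` and the other modes are all false, no averaging over the sequence happens; the sentence representation is entirely the [CLS] position.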
README.md ADDED
@@ -0,0 +1,522 @@
+ ---
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:20816
+ - loss:Infonce
+ base_model: BAAI/bge-m3
+ widget:
+ - source_sentence: What do studies show about the configurations of eukaryotic polysomes
+   sentences:
+   - "Eukaryotic\n\nIn cells \nin situ (in cell) studies have shown that eukaryotic\
+     \ polysomes exhibit linear configurations. Densely packed 3-D helices and planar\
+     \ double-row polysomes were found with variable packing including “top-to-top”\
+     \ contacts similar to prokaryotic polysomes."
+   - Carlo Dante Rota (born 17 April 1961) is a British-born Canadian actor. He has
+     appeared in Little Mosque on the Prairie and as systems analyst Morris O'Brian
+     on the Fox series 24.
+   - 'Ronnie & Jo Wood Still ‘Close Friends’ Despite Joint Property Auction
+
+     Celebrity auctioneer Darren Julien is gearing up for a massive sale of over 600
+     items belonging to Rolling Stones guitarist Ronnie Wood’s and his ex-wife Jo Wood.
+     Much like many of Julian’s Auctions past collections, this auction has created
+     some controversy because Ronnie has recently come out as opposed to the sale of
+     his personal belongings, denying his involvement in the ‘joint’ sale. In response
+     to those recent statements coming out Ronnie Wood’s camp saying he’s “shocked
+     and disappointed” at the auctioning off his personal belongings, and that the
+     auction has been “misrepresented as a joint sale,” Julien claims Ronnie has known
+     about the auction since its start.'
+ - source_sentence: What was Mike Holober's role at the BMI Jazz Composer’s Workshop
+     from 2007 to 2015
+   sentences:
+   - '- Establishing a named ''link’ person within an organisation with a liaison role
+     between service users and the organisation. This can help to reduce the problems
+     that can occur with personnel changes or restructuring.'
+   - 'A professor of obstetrics from 1895 at Kraków''s Jagiellonian University, Jordan
+     became best known for organizing children’s playgrounds, called "Jordan''s parks"
+     after him. Life
+
+     Henryk Jordan was born into an impoverished noble family from the village of Zakliczyn,
+     which over time moved to other places in Polish Galicia (for example Przemyśl).
+     His father, Bonifacy Jordan, gave private lessons. His mother, Salomea Wędrychowska,
+     was a homemaker. Jordan received his high-school education in Tarnopol and Tarnów.
+     In 1861, however, he took part in pro-Polish demonstrations for which he was threatened
+     with expulsion from school. In 1862 he moved to Trieste and a year later passed
+     his high-school examinations, in Italian, with honors. Jordan began his university
+     studies in Vienna, and from 1863 continued them at Kraków''s Jagiellonian University.
+     He passed his science examinations in 1867 but did not receive his master''s degree
+     due to pneumonia.'
+   - "From 2007 - 2015 he served as Associate Director of the BMI Jazz Composer’s Workshop,\
+     \ where he taught with Musical Director Jim McNeely. Discography \n The Mike Holober\
+     \ Quintet, Canyon (Sons of Sound, 2003)\n The Gotham Jazz Orchestra, T Thought\
+     \ Trains (Sons of Sound, 2004)\n The Mike Holober Quintet, Wish List (Sons of\
+     \ Sound, 2006)\n The Gotham Jazz Orchestra, Quake (Sunnyside, 2009)\n Mike Holober\
+     \ & Balancing Act, Balancing Act (Palmetto, 2015)\n The Gotham Jazz Orchestra,\
+     \ Hiding Out (Zoho Music, 2019)\n\nReferences\n\nExternal links \n Artist's official\
+     \ website\n Sons of Sound, Label page for Mike Holober\n Manhattan School of Music\
+     \ faculty profile\n CCNY Stuart Katz Professorship announcement\n Interview with\
+     \ WBGO's Gary Walker\n\nVideos\n Westchester Jazz Orchestra - promotional video\
+     \ written and directed by Darryl Estrine 2013\n \"Oh No\" - hr-Bigband plays Frank\
+     \ Zappa; Deutsches Jazzfestival Frankfurt 2015\n \"We Are Not Alone\" - hr-Bigband\
+     \ plays Frank Zappa; Deutsches Jazzfestival Frankfurt 2015\n \"G-Spot Tornado\
+     \ - hr-Bigband plays Frank Zappa; Deutsches Jazzfestival Frankfurt 2015\n\n\"\
+     Star of Jupiter\" - Kurt Rosenwinkel & hr-Bigband; Kurt Rosenwinkel & hr-Bigband\
+     \ im hr-Sendesaal 12.06.2015\n \"Heavenly Bodies\" - Kurt Rosenwinkel & hr-Bigband;\
+     \ Kurt Rosenwinkel & hr-Bigband im hr-Sendesaal 12.06.2015\n \"East Coast Love\
+     \ Affair\" - Kurt Rosenwinkel & hr-Bigband; Kurt Rosenwinkel & hr-Bigband im hr-Sendesaal\
+     \ 12.06.2015\n \"Brooklyn Sometimes\" - Kurt Rosenwinkel & hr-Bigband; Kurt Rosenwinkel\
+     \ & hr-Bigband im hr-Sendesaal 12.06.2015\n Al Foster feat. by WDR BIG BAND -\
+     \ Douglas (Rehearsal) - WDR rehearsal featuring Al Foster; 04.14.2016\n Al Foster\
+     \ feat."
+ - source_sentence: What problems does Alice encounter due to her roommate Merv's TV
+     watching habits
+   sentences:
+   - Roommate from hell Merv (Jeremy Strong) is an unrepentant yogurt-pilferer and,
+     far worse, the kind of TV addict who likes to "interact" by loudly critiquing
+     the very junk he's mainlining. The overwhelming blaring of the television rankles
+     Alice (Katie Kreisler), who starts out musing about a part of Vermont that's cut
+     off from TV -- and then ends up furiously plotting Merv's ouster.
+   - And it does help a bit in public places--there are a few people who will hold
+     open doors for me, or offer me other courtesies, as a result of my using the cane.
+     It's a real ego-killer to occasionally catch sight of myself, reflected in a plate-glass
+     window, stumping along with the cane and lurching from side-to-side.
+   - 'That''s an important step in literacy development. Why you''ll like it: I love
+     reading this book aloud at story hours.'
+ - source_sentence: What was the role of the Sri Lankan High Commissioner in Pretoria,
+     South Africa
+   sentences:
+   - "As the Sri Lankan High Commissioner, he functioned as the executive head of the\
+     \ Sri Lankan diplomatic mission in Pretoria, South Africa. Secretary to the Prime\
+     \ Minister \nFollowing the Appointment of the new prime minister D.M."
+   - xv + 191 pp. + 1 plate.
+   - "Winters are generally mild in Alabama, as they are throughout most of the southeastern\
+     \ United States, with average January low temperatures around in Mobile, around\
+     \ in Huntsville, around in Montgomery, and around in Birmingham. Extremes\n\
+     \nPrecipitation\nThe amount of precipitation is greatest along the coast (62 inches/1,574 mm)\
+     \ and evenly distributed through the rest of the state (about 52 inches/1,320 mm).\
+     \ Much of the rainfall is produced by thunderstorms and, occasionally, by hurricanes\
+     \ and other tropical disturbances. In central and northern Alabama, average monthly\
+     \ precipitation amounts are highest from November to April, typically peaking\
+     \ in December or March, as at Huntsville (December maximum) or Birmingham (March\
+     \ maximum), with August to October the driest months. Along the coast, summer\
+     \ thunderstorm rains are markedly more frequent and tropical weather systems are\
+     \ a threat from July to October. Accordingly, at Mobile, virtually the wettest\
+     \ city annually anywhere in the eastern United States (wetter than even Miami,\
+     \ FL with its drier winters), monthly average precipitation peaks in July and\
+     \ August, but virtually the entire year is wet, with October a slightly drier\
+     \ month. Although snow is a rare event in much of Alabama, areas of the state\
+     \ north of Montgomery may receive a dusting of snow a few times every winter,\
+     \ with an occasional moderately heavy snowfall every few years. Historic heavy\
+     \ snowfall events include the New Year's Eve 1963 snowstorm and the 1993 Storm\
+     \ of the Century. The annual average snowfall for the Birmingham area is per\
+     \ year. In the southern Gulf coast, snowfall is less frequent, sometimes going\
+     \ several years without any snowfall. El Niño and La Niña\nDuring El Niño, Alabama\
+     \ receives colder than average winter temperatures with wetter than average conditions\
+     \ along the southern parts of the state and drier than average conditions in the\
+     \ northern parts. La Niña brings warmer than average temperatures with the drier\
+     \ weather in the southern parts of the state due to a northern storm track. Hazards\n\
+     \nAlabama is also prone to tropical storms and even hurricanes. Areas of the state\
+     \ far away from the Gulf are not immune to the effects of the storms, which often\
+     \ dump tremendous amounts of rain as they move inland and weaken. Thunderstorms\
+     \ are common during the summer throughout Alabama and also occur during other\
+     \ times of the year including winter. South Alabama reports many thunderstorms.\
+     \ The Gulf Coast, around Mobile Bay, averages between 100 and 110 days per year\
+     \ with thunder reported, which eastern and northwest Alabama have 70 to 80 thunderstorm\
+     \ days per year. Occasionally, thunderstorms are severe with frequent lightning\
+     \ and large hail – the central and northern parts of the state are most vulnerable\
+     \ to this type of storm, the northern and central regions of Alabama are especially\
+     \ prone to tornadoes. Alabama ranks seventh in the number of deaths from lightning\
+     \ and ninth in the number of deaths from lightning strikes per capita. Tornadoes\
+     \ occur frequently in Alabama during the spring and fall months, these tornadoes\
+     \ can be devastating and even deadly.– these are common throughout the state,\
+     \ although the peak season for tornadoes varies from the northern to southern\
+     \ parts of the state. Alabama, along with Kansas, has the most reported F5/EF5\
+     \ tornadoes than any other state – according to statistics from the National Climatic\
+     \ Data Center for the period January 1, 1950, to October 31, 2006. An F5 tornado\
+     \ is the most powerful of its kind. Several long – tracked F5 tornadoes have contributed\
+     \ to Alabama reporting more tornado fatalities than any other state except for\
+     \ Texas and Mississippi. The Super Outbreaks of April 1974 and April 2011 both\
+     \ badly affected Alabama. The northern part of the state – along the Tennessee\
+     \ Valley – is one of the areas in the US most vulnerable to violent tornadoes\
+     \ . The area of Alabama and Mississippi most affected by tornadoes is sometimes\
+     \ referred to as Dixie Alley, as distinct from the Tornado Alley of the Southern\
+     \ Plains. Alabama is one of the few places in the world that has a secondary tornado\
+     \ season (November and December) along with the spring severe weather season.\
+     \ See also\nClimate change in Alabama\n\nReferences\n\n \nGeography of Alabama"
+ - source_sentence: What is the significance of the first written mention of Metylovice,
+     and in which year did it occur
+   sentences:
+   - Users could also get discounts when they bought the coins in bulk and earn coins
+     through certain apps on the Appstore. In 2014, with the release of the Fire Phone,
+     Amazon offered app developers 500,000 Amazon Coins for each paid app or app with
+     in-app purchasing developed and optimized for the Fire Phone.
+   - 'Contents
+
+     Hard Times moves the Traveller universe forward into a time where the galaxy is
+     riven by economic stagnation and collapse of the empire. Rick Swan wrote, "Planets
+     are gasping for life like guppies flung from a fish bowl, and the luckless survivors
+     face a future of staggering adversity."'
+   - 'The Olešná Stream flows through the municipality. History
+
+     The first written mention of Metylovice is in a deed of Bishop Dětřich from 1299.
+     From the second half of the 17th century, tanning developed in the village, thanks
+     to which the originally agricultural village began to prosper and grow. Brick
+     houses began to replace the original wooden ones and the education and cultural
+     life of the inhabitants increased. Sights
+
+     The most important monument is the Church of All Saints.'
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ ---
+
+ # SentenceTransformer based on BAAI/bge-m3
+
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ - **Base model:** [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) <!-- at revision 5617a9f61b028005a4858fdac845db406aefb181 -->
+ - **Maximum Sequence Length:** 1024 tokens
+ - **Output Dimensionality:** 1024 dimensions
+ - **Similarity Function:** Cosine Similarity
+ <!-- - **Training Dataset:** Unknown -->
+ <!-- - **Language:** Unknown -->
+ <!-- - **License:** Unknown -->
+
+ ### Model Sources
+
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+
+ ### Full Model Architecture
+
+ ```
+ SentenceTransformer(
+   (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
+   (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+   (2): Normalize()
+ )
+ ```
+
+ ## Usage
+
+ ### Direct Usage (Sentence Transformers)
+
+ First install the Sentence Transformers library:
+
+ ```bash
+ pip install -U sentence-transformers
+ ```
+
+ Then you can load this model and run inference.
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Download from the 🤗 Hub
+ model = SentenceTransformer("Jrinky/model4")
+ # Run inference
+ sentences = [
+     'What is the significance of the first written mention of Metylovice, and in which year did it occur',
+     'The Olešná Stream flows through the municipality. History\nThe first written mention of Metylovice is in a deed of Bishop Dětřich from 1299. From the second half of the 17th century, tanning developed in the village, thanks to which the originally agricultural village began to prosper and grow. Brick houses began to replace the original wooden ones and the education and cultural life of the inhabitants increased. Sights\nThe most important monument is the Church of All Saints.',
+     'Users could also get discounts when they bought the coins in bulk and earn coins through certain apps on the Appstore. In 2014, with the release of the Fire Phone, Amazon offered app developers 500,000 Amazon Coins for each paid app or app with in-app purchasing developed and optimized for the Fire Phone.',
+ ]
+ embeddings = model.encode(sentences)
+ print(embeddings.shape)
+ # [3, 1024]
+
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(embeddings, embeddings)
+ print(similarities.shape)
+ # [3, 3]
+ ```
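Since this model's final module is `Normalize()`, the cosine similarity that `model.similarity` computes by default reduces to a plain dot product of the embedding matrix with itself. A toy NumPy illustration of that computation, using random unit vectors rather than real model outputs:

```python
import numpy as np

rng = np.random.default_rng(42)
embeddings = rng.normal(size=(3, 1024))
# Unit-normalize rows, as the model's Normalize() module does
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

# Cosine similarity of unit vectors is just a matrix product
similarities = embeddings @ embeddings.T
print(similarities.shape)  # (3, 3)
```

The diagonal is each sentence's similarity with itself (exactly 1 for unit vectors), and the matrix is symmetric, matching the `[3, 3]` shape printed above.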
+
+ <!--
+ ### Direct Usage (Transformers)
+
+ <details><summary>Click to see the direct usage in Transformers</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Downstream Usage (Sentence Transformers)
+
+ You can finetune this model on your own dataset.
+
+ <details><summary>Click to expand</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Out-of-Scope Use
+
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Dataset
+
+ #### Unnamed Dataset
+
+ * Size: 20,816 training samples
+ * Columns: <code>anchor</code> and <code>positive</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | anchor | positive |
+   |:--------|:-------|:---------|
+   | type    | string | string   |
+   | details | <ul><li>min: 6 tokens</li><li>mean: 17.92 tokens</li><li>max: 42 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 168.82 tokens</li><li>max: 1024 tokens</li></ul> |
+ * Samples:
+   | anchor | positive |
+   |:-------|:---------|
+   | <code>What was the birth date and place of Helena Binder, also known as Blanche Blotto</code> | <code>Born June 13, 1955 in Batavia, New York. Helena Binder, aka Blanche Blotto (keyboards, vocals; 1978-1980).</code> |
+   | <code>What incidents involving Israeli soldiers occurred in the occupied West Bank on Tuesday</code> | <code>Also Tuesday, Israeli soldiers fired a barrage of gas bombs and concussion grenades at a Palestinian home in the Masafer Yatta area, south of Hebron, in the southern part of the occupied West Bank, wounding an entire family, including children. On Tuesday evening, Israeli soldiers invaded the al-Maghayir village northeast of Ramallah, in the central West Bank, after many illegal colonizers attacked Palestinian cars. In related news, the soldiers shot three Palestinian construction workers near the illegal Annexation Wall, west of Hebron, in the southern part of the occupied West Bank, and abducted them.</code> |
+   | <code>How was the Mosbrucher Maar formed, and when did it occur</code> | <code>The Mosbrucher Weiher, also called the Mosbrucher Maar, is a silted up maar east of the municipal boundary of the village of Mosbruch in the county Vulkaneifel in Germany. It is located immediately at the foot of the 675-metre-high Hochkelberg, a former volcano. The floor of the maar is in the shape of an elongated oval and is about 700×500 metres in size, its upper boundary has a diameter of about 1,300 × 1,050 metres. This makes the Mosbrucher Maar the third largest of the maars in the western Eifel region. The Üßbach stream flows past and close to the Mosbrucher Weiher. Origin <br>According to pollen analysis studies, the crater was formed about 11,000 years ago by a volcanic eruption. In the area around the maar there are very few volcanic tuffs in comparison to other Eifel maars; only in two places are there greater accumulations of tuff; the rest of the surrounding area is covered only by a thin layer.</code> |
+ * Loss: <code>selfloss.Infonce</code> with these parameters:
+   ```json
+   {
+       "scale": 20.0,
+       "similarity_fct": "cos_sim"
+   }
+   ```
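The Infonce loss above treats each (anchor, positive) pair in a batch as a classification problem: every anchor must pick out its own positive among all positives in the batch (in-batch negatives), with cosine similarities multiplied by `scale` (20.0 here) used as softmax logits. A NumPy sketch of the standard InfoNCE formulation under those assumptions, not the repository's actual `selfloss` code:

```python
import numpy as np

def infonce_loss(anchors: np.ndarray, positives: np.ndarray,
                 scale: float = 20.0) -> float:
    """InfoNCE with in-batch negatives: row i of the scaled cosine-similarity
    matrix is a softmax classification whose correct class is column i."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = scale * (a @ p.T)                   # (batch, batch): scale * cos_sim
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_softmax)))  # -log p(correct pair)

# With positives identical to anchors, each row's best match is its own
# column, so the loss is close to zero
rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 32))
loss = infonce_loss(anchors, anchors.copy())
print(loss)  # small positive value
```

The scale of 20 sharpens the softmax, so even modest gaps in cosine similarity translate into confident predictions; a small per-device batch size of 2, as used here, means each anchor sees only one in-batch negative per step.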
+
+ ### Evaluation Dataset
+
+ #### Unnamed Dataset
+
+ * Size: 1,096 evaluation samples
+ * Columns: <code>anchor</code> and <code>positive</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | anchor | positive |
+   |:--------|:-------|:---------|
+   | type    | string | string   |
+   | details | <ul><li>min: 6 tokens</li><li>mean: 18.26 tokens</li><li>max: 574 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 189.9 tokens</li><li>max: 1024 tokens</li></ul> |
+ * Samples:
+   | anchor | positive |
+   |:-------|:---------|
+   | <code>What architectural features are present on the front and southern sides of the Martínez Adobe house</code> | <code>The front and southern sides of the house have wooden wrap-around porches at each level. Wood shingles of either cedar or redwood originally covered the roof. The Martínez Adobe is now part of the John Muir National Historic Site and is open to the public. See also<br>California Historical Landmarks in Contra Costa County<br>National Register of Historic Places listings in Contra Costa County, California<br><br>References<br><br>Further reading<br>Feasibility Report John Muir Home and Vicente Martinez Adobe, Martinez, California. (1963). United States: National Park Service, U.S. Department of the Interior. Western Regional Office. Vincent, G., Mariotti, J., Rubin, J. (2009). Pinole. United States: Arcadia Publishing.</code> |
+   | <code>What are the cognitive aspects being assessed in relation to TBI, and how do they impact the rehabilitation services for individuals, including warfighters with hearing problems</code> | <code>“Within AASC, we’ve been very proactive as part of interdisciplinary teams assessing TBI. Another area we’re looking at involves cognitive aspects associated with TBI and mild TBI and the best approach to providing rehabilitative services.”<br>As with warfighters who return to duty – including combat – with prosthetic feet or legs, many with hearing problems also want to continue serving rather than accept medical discharges.</code> |
+   | <code>What are the benefits mentioned by BIO President & CEO Jim Greenwood regarding the energy title programs in rural America</code> | <code>BIO President & CEO Jim Greenwood said, “The important energy title programs authorized and funded in this bill are just beginning to have a positive impact in revitalizing rural America, fueling economic growth and creating well-paying opportunities where we need it most -- in manufacturing, energy, agriculture and forestry. These programs can also help meet our responsibilities to revitalize rural areas, reduce dependence on foreign oil, and renew economic growth.</code> |
+ * Loss: <code>selfloss.Infonce</code> with these parameters:
+   ```json
+   {
+       "scale": 20.0,
+       "similarity_fct": "cos_sim"
+   }
+   ```
+
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+
+ - `eval_strategy`: steps
+ - `per_device_train_batch_size`: 2
+ - `per_device_eval_batch_size`: 2
+ - `learning_rate`: 2e-05
+ - `num_train_epochs`: 5
+ - `warmup_ratio`: 0.1
+ - `fp16`: True
+ - `batch_sampler`: no_duplicates
+
+ #### All Hyperparameters
+ <details><summary>Click to expand</summary>
+
+ - `overwrite_output_dir`: False
+ - `do_predict`: False
+ - `eval_strategy`: steps
+ - `prediction_loss_only`: True
+ - `per_device_train_batch_size`: 2
+ - `per_device_eval_batch_size`: 2
+ - `per_gpu_train_batch_size`: None
+ - `per_gpu_eval_batch_size`: None
+ - `gradient_accumulation_steps`: 1
+ - `eval_accumulation_steps`: None
+ - `learning_rate`: 2e-05
+ - `weight_decay`: 0.0
+ - `adam_beta1`: 0.9
+ - `adam_beta2`: 0.999
+ - `adam_epsilon`: 1e-08
+ - `max_grad_norm`: 1.0
+ - `num_train_epochs`: 5
+ - `max_steps`: -1
+ - `lr_scheduler_type`: linear
+ - `lr_scheduler_kwargs`: {}
+ - `warmup_ratio`: 0.1
+ - `warmup_steps`: 0
+ - `log_level`: passive
+ - `log_level_replica`: warning
+ - `log_on_each_node`: True
+ - `logging_nan_inf_filter`: True
+ - `save_safetensors`: True
+ - `save_on_each_node`: False
+ - `save_only_model`: False
+ - `restore_callback_states_from_checkpoint`: False
+ - `no_cuda`: False
+ - `use_cpu`: False
+ - `use_mps_device`: False
+ - `seed`: 42
+ - `data_seed`: None
+ - `jit_mode_eval`: False
+ - `use_ipex`: False
+ - `bf16`: False
+ - `fp16`: True
+ - `fp16_opt_level`: O1
+ - `half_precision_backend`: auto
+ - `bf16_full_eval`: False
+ - `fp16_full_eval`: False
+ - `tf32`: None
+ - `local_rank`: 0
+ - `ddp_backend`: None
+ - `tpu_num_cores`: None
+ - `tpu_metrics_debug`: False
+ - `debug`: []
+ - `dataloader_drop_last`: False
+ - `dataloader_num_workers`: 0
+ - `dataloader_prefetch_factor`: None
+ - `past_index`: -1
+ - `disable_tqdm`: False
+ - `remove_unused_columns`: True
+ - `label_names`: None
+ - `load_best_model_at_end`: False
+ - `ignore_data_skip`: False
+ - `fsdp`: []
+ - `fsdp_min_num_params`: 0
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+ - `fsdp_transformer_layer_cls_to_wrap`: None
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+ - `deepspeed`: None
+ - `label_smoothing_factor`: 0.0
+ - `optim`: adamw_torch
+ - `optim_args`: None
+ - `adafactor`: False
+ - `group_by_length`: False
+ - `length_column_name`: length
+ - `ddp_find_unused_parameters`: None
+ - `ddp_bucket_cap_mb`: None
+ - `ddp_broadcast_buffers`: False
+ - `dataloader_pin_memory`: True
+ - `dataloader_persistent_workers`: False
+ - `skip_memory_metrics`: True
+ - `use_legacy_prediction_loop`: False
+ - `push_to_hub`: False
+ - `resume_from_checkpoint`: None
+ - `hub_model_id`: None
+ - `hub_strategy`: every_save
+ - `hub_private_repo`: False
+ - `hub_always_push`: False
+ - `gradient_checkpointing`: False
+ - `gradient_checkpointing_kwargs`: None
+ - `include_inputs_for_metrics`: False
+ - `eval_do_concat_batches`: True
+ - `fp16_backend`: auto
+ - `push_to_hub_model_id`: None
+ - `push_to_hub_organization`: None
+ - `mp_parameters`:
+ - `auto_find_batch_size`: False
+ - `full_determinism`: False
+ - `torchdynamo`: None
+ - `ray_scope`: last
+ - `ddp_timeout`: 1800
+ - `torch_compile`: False
+ - `torch_compile_backend`: None
+ - `torch_compile_mode`: None
+ - `dispatch_batches`: None
+ - `split_batches`: None
+ - `include_tokens_per_second`: False
+ - `include_num_input_tokens_seen`: False
+ - `neftune_noise_alpha`: None
+ - `optim_target_modules`: None
+ - `batch_eval_metrics`: False
+ - `eval_on_start`: False
+ - `prompts`: None
+ - `batch_sampler`: no_duplicates
+ - `multi_dataset_batch_sampler`: proportional
+
+ </details>
+
+ ### Training Logs
+ | Epoch | Step | Training Loss | Validation Loss |
+ |:------:|:----:|:-------------:|:---------------:|
+ | 0.0961 | 100 | 0.2849 | 0.0915 |
+ | 0.1921 | 200 | 0.0963 | 0.0511 |
+ | 0.2882 | 300 | 0.069 | 0.0459 |
+ | 0.3842 | 400 | 0.0622 | 0.0445 |
+ | 0.4803 | 500 | 0.0544 | 0.0441 |
+ | 0.5764 | 600 | 0.0615 | 0.0418 |
+ | 0.6724 | 700 | 0.0573 | 0.0416 |
+ | 0.7685 | 800 | 0.0524 | 0.0435 |
+ | 0.8646 | 900 | 0.0523 | 0.0398 |
+
+
+ ### Framework Versions
+ - Python: 3.12.3
+ - Sentence Transformers: 3.4.0
+ - Transformers: 4.42.4
+ - PyTorch: 2.2.0+cu121
+ - Accelerate: 1.3.0
+ - Datasets: 3.2.0
+ - Tokenizers: 0.19.1
+
+ ## Citation
+
+ ### BibTeX
+
+ #### Sentence Transformers
+ ```bibtex
+ @inproceedings{reimers-2019-sentence-bert,
+     title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+     author = "Reimers, Nils and Gurevych, Iryna",
+     booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+     month = "11",
+     year = "2019",
+     publisher = "Association for Computational Linguistics",
+     url = "https://arxiv.org/abs/1908.10084",
+ }
+ ```
+
+ #### Infonce
+ ```bibtex
+ @misc{henderson2017efficient,
+     title={Efficient Natural Language Response Suggestion for Smart Reply},
+     author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
+     year={2017},
+     eprint={1705.00652},
+     archivePrefix={arXiv},
+     primaryClass={cs.CL}
+ }
+ ```
+
+ <!--
+ ## Glossary
+
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+
+ <!--
+ ## Model Card Authors
+
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+
+ <!--
+ ## Model Card Contact
+
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
config.json ADDED
@@ -0,0 +1,28 @@
+ {
+ "_name_or_path": "/root/autodl-tmp/work_dir/models/model3/checkpoint-900",
+ "architectures": [
+ "XLMRobertaModel"
+ ],
+ "attention_probs_dropout_prob": 0.1,
+ "bos_token_id": 0,
+ "classifier_dropout": null,
+ "eos_token_id": 2,
+ "hidden_act": "gelu",
+ "hidden_dropout_prob": 0.1,
+ "hidden_size": 1024,
+ "initializer_range": 0.02,
+ "intermediate_size": 4096,
+ "layer_norm_eps": 1e-05,
+ "max_position_embeddings": 8194,
+ "model_type": "xlm-roberta",
+ "num_attention_heads": 16,
+ "num_hidden_layers": 24,
+ "output_past": true,
+ "pad_token_id": 1,
+ "position_embedding_type": "absolute",
+ "torch_dtype": "float32",
+ "transformers_version": "4.42.4",
+ "type_vocab_size": 1,
+ "use_cache": true,
+ "vocab_size": 250002
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+ "__version__": {
+ "sentence_transformers": "3.4.0",
+ "transformers": "4.42.4",
+ "pytorch": "2.2.0+cu121"
+ },
+ "prompts": {},
+ "default_prompt_name": null,
+ "similarity_fn_name": "cosine"
+ }
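The `similarity_fn_name` above selects cosine similarity as the comparison between embeddings. A minimal self-contained sketch of that computation (the short illustrative vectors stand in for the model's 1024-dimensional embeddings; they are not real model output):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: dot product of the vectors divided by their norms."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Illustrative 4-d vectors standing in for 1024-d sentence embeddings.
u = np.array([1.0, 0.0, 1.0, 0.0])
v = np.array([1.0, 1.0, 0.0, 0.0])
print(cosine_similarity(u, v))  # 0.5 for these two vectors
```

Because the model pipeline ends with a Normalize module, embeddings are already unit-length, so cosine similarity reduces to a plain dot product at query time.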
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:61afc05145a53259487171b5d77d32c2d951f8fe278a2bc8ce6df58a275cf850
+ size 2271064456
modules.json ADDED
@@ -0,0 +1,20 @@
+ [
+ {
+ "idx": 0,
+ "name": "0",
+ "path": "",
+ "type": "sentence_transformers.models.Transformer"
+ },
+ {
+ "idx": 1,
+ "name": "1",
+ "path": "1_Pooling",
+ "type": "sentence_transformers.models.Pooling"
+ },
+ {
+ "idx": 2,
+ "name": "2",
+ "path": "2_Normalize",
+ "type": "sentence_transformers.models.Normalize"
+ }
+ ]
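The three modules above chain a Transformer encoder, CLS-token pooling (`pooling_mode_cls_token: true` in `1_Pooling/config.json`), and L2 normalization. A small numpy sketch of the pooling and normalization stages, using made-up token embeddings in place of real encoder output:

```python
import numpy as np

def cls_pool(token_embeddings: np.ndarray) -> np.ndarray:
    """CLS pooling: keep only the first token's embedding."""
    return token_embeddings[0]

def l2_normalize(v: np.ndarray) -> np.ndarray:
    """The Normalize module scales the pooled vector to unit length."""
    return v / np.linalg.norm(v)

# Made-up (seq_len=3, hidden=4) token embeddings standing in for
# the Transformer module's output (the real hidden size is 1024).
tokens = np.array([[3.0, 4.0, 0.0, 0.0],
                   [1.0, 1.0, 1.0, 1.0],
                   [0.5, 0.5, 0.5, 0.5]])
sentence_embedding = l2_normalize(cls_pool(tokens))
print(sentence_embedding)  # [0.6, 0.8, 0.0, 0.0]
```

`SentenceTransformer.encode` runs these three stages in `idx` order, which is why the resulting embeddings can be compared with a plain dot product.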
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+ "max_seq_length": 1024,
+ "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
+ {
+ "bos_token": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "cls_token": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "mask_token": {
+ "content": "<mask>",
+ "lstrip": true,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "sep_token": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6e3b8957de04e3a4ed42b1a11381556f9adad8d0d502b9dd071c75f626b28f40
+ size 17083053
tokenizer_config.json ADDED
@@ -0,0 +1,62 @@
+ {
+ "added_tokens_decoder": {
+ "0": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "1": {
+ "content": "<pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "2": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "3": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "250001": {
+ "content": "<mask>",
+ "lstrip": true,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<s>",
+ "clean_up_tokenization_spaces": true,
+ "cls_token": "<s>",
+ "eos_token": "</s>",
+ "mask_token": "<mask>",
+ "max_length": 1024,
+ "model_max_length": 1024,
+ "pad_to_multiple_of": null,
+ "pad_token": "<pad>",
+ "pad_token_type_id": 0,
+ "padding_side": "right",
+ "sep_token": "</s>",
+ "sp_model_kwargs": {},
+ "stride": 0,
+ "tokenizer_class": "XLMRobertaTokenizer",
+ "truncation_side": "right",
+ "truncation_strategy": "longest_first",
+ "unk_token": "<unk>"
+ }
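The tokenizer config above fixes the XLM-R special-token ids (`<s>`=0, `<pad>`=1, `</s>`=2), right-side padding, and right-side truncation at `model_max_length` 1024. A simplified sketch of how those settings shape a batch (the token ids 10, 11, 12, 20 are made up; the real tokenizer also handles subword splitting, which is omitted here):

```python
# Special-token ids from tokenizer_config.json.
BOS, PAD, EOS = 0, 1, 2
MAX_LEN = 1024  # model_max_length

def frame(token_ids, max_len=MAX_LEN):
    # truncation_side = "right": keep leading tokens,
    # reserving two slots for the <s> and </s> framing.
    return [BOS] + list(token_ids)[: max_len - 2] + [EOS]

def pad_batch(sequences):
    # padding_side = "right": append <pad> ids up to the longest sequence.
    width = max(len(s) for s in sequences)
    return [s + [PAD] * (width - len(s)) for s in sequences]

batch = pad_batch([frame([10, 11, 12]), frame([20])])
print(batch)  # [[0, 10, 11, 12, 2], [0, 20, 2, 1, 1]]
```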