Add new SentenceTransformer model.
- .gitattributes +1 -0
- 1_Pooling/config.json +7 -0
- 2_Dense/config.json +1 -0
- 2_Dense/pytorch_model.bin +3 -0
- README.md +87 -0
- config.json +31 -0
- config_sentence_transformers.json +7 -0
- eval/similarity_evaluation_results.csv +121 -0
- modules.json +20 -0
- pytorch_model.bin +3 -0
- sentence_bert_config.json +4 -0
- similarity_evaluation_sts-test_results.csv +2 -0
- special_tokens_map.json +1 -0
- tokenizer.json +0 -0
- tokenizer_config.json +1 -0
- vocab.txt +0 -0
.gitattributes
CHANGED
@@ -25,3 +25,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zstandard filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+pytorch_model.bin filter=lfs diff=lfs merge=lfs -text
1_Pooling/config.json
ADDED
@@ -0,0 +1,7 @@
+{
+  "word_embedding_dimension": 768,
+  "pooling_mode_cls_token": false,
+  "pooling_mode_mean_tokens": true,
+  "pooling_mode_max_tokens": false,
+  "pooling_mode_mean_sqrt_len_tokens": false
+}
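The Pooling config above enables mean pooling only: the 768-dimensional token embeddings are averaged, with padding positions masked out. The sketch below is illustrative rather than taken from this repository (sentence-transformers performs the equivalent step internally); the tensors are made up for the example.

```python
import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average the 768-d token vectors, ignoring padding positions."""
    # attention_mask: (batch, seq_len) with 1 for real tokens, 0 for padding
    mask = attention_mask.unsqueeze(-1).float()      # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)    # (batch, 768)
    counts = mask.sum(dim=1).clamp(min=1e-9)         # avoid division by zero
    return summed / counts                           # (batch, 768)

# Toy example: batch of 2 sequences, 4 tokens each, hidden size 768
tokens = torch.randn(2, 4, 768)
mask = torch.tensor([[1, 1, 1, 0], [1, 1, 0, 0]])
print(mean_pool(tokens, mask).shape)  # torch.Size([2, 768])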
2_Dense/config.json
ADDED
@@ -0,0 +1 @@
+{"in_features": 768, "out_features": 256, "bias": true, "activation_function": "torch.nn.modules.activation.Tanh"}
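This Dense config describes a single linear projection from the 768-d pooled vector down to 256 dimensions followed by Tanh. A rough plain-PyTorch equivalent, shown only as a sketch (weights would have to be loaded from 2_Dense/pytorch_model.bin separately):

```python
import torch
from torch import nn

# Stand-in for the 2_Dense module: 768 -> 256 with Tanh
dense_head = nn.Sequential(nn.Linear(768, 256, bias=True), nn.Tanh())

pooled = torch.randn(2, 768)               # output of the mean-pooling step
sentence_embeddings = dense_head(pooled)
print(sentence_embeddings.shape)           # torch.Size([2, 256])
```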
2_Dense/pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:675e15721cc8b8525d04a31f2a921c9df14d176c021516685b73dec1cff7ddd2
+size 788519
README.md
ADDED
@@ -0,0 +1,87 @@
+---
+pipeline_tag: sentence-similarity
+tags:
+- sentence-transformers
+- feature-extraction
+- sentence-similarity
+---
+
+# {MODEL_NAME}
+
+This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 256 dimensional dense vector space and can be used for tasks like clustering or semantic search.
+
+<!--- Describe your model here -->
+
+## Usage (Sentence-Transformers)
+
+Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
+
+```
+pip install -U sentence-transformers
+```
+
+Then you can use the model like this:
+
+```python
+from sentence_transformers import SentenceTransformer
+sentences = ["This is an example sentence", "Each sentence is converted"]
+
+model = SentenceTransformer('{MODEL_NAME}')
+embeddings = model.encode(sentences)
+print(embeddings)
+```
+
+
+
+## Evaluation Results
+
+<!--- Describe how your model was evaluated -->
+
+For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
+
+
+## Training
+The model was trained with the parameters:
+
+**DataLoader**:
+
+`torch.utils.data.dataloader.DataLoader` of length 11 with parameters:
+```
+{'batch_size': 15, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
+```
+
+**Loss**:
+
+`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
+
+Parameters of the fit()-Method:
+```
+{
+    "epochs": 10,
+    "evaluation_steps": 1,
+    "evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
+    "max_grad_norm": 1,
+    "optimizer_class": "<class 'transformers.optimization.AdamW'>",
+    "optimizer_params": {
+        "lr": 2e-05
+    },
+    "scheduler": "WarmupLinear",
+    "steps_per_epoch": null,
+    "warmup_steps": 11,
+    "weight_decay": 0.01
+}
+```
+
+
+## Full Model Architecture
+```
+SentenceTransformer(
+  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel
+  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
+  (2): Dense({'in_features': 768, 'out_features': 256, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
+)
+```
+
+## Citing & Authors
+
+<!--- Describe where people can find more information -->
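As a follow-up to the usage snippet in the README, the 256-d embeddings can be compared with cosine similarity. The sketch below is illustrative; the model path is a placeholder, and it uses the `util.cos_sim` helper from sentence-transformers:

```python
from sentence_transformers import SentenceTransformer, util

# Placeholder path; substitute the actual model id or local directory
model = SentenceTransformer("path/to/this-model")

sentences = ["This is an example sentence", "Each sentence is converted"]
embeddings = model.encode(sentences, convert_to_tensor=True)  # shape (2, 256)

# Cosine similarity between the two sentence vectors
similarity = util.cos_sim(embeddings[0], embeddings[1])
print(float(similarity))
```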
config.json
ADDED
@@ -0,0 +1,31 @@
+{
+  "_name_or_path": "bert-base-multilingual-uncased",
+  "architectures": [
+    "BertModel"
+  ],
+  "attention_probs_dropout_prob": 0.1,
+  "classifier_dropout": null,
+  "directionality": "bidi",
+  "hidden_act": "gelu",
+  "hidden_dropout_prob": 0.1,
+  "hidden_size": 768,
+  "initializer_range": 0.02,
+  "intermediate_size": 3072,
+  "layer_norm_eps": 1e-12,
+  "max_position_embeddings": 512,
+  "model_type": "bert",
+  "num_attention_heads": 12,
+  "num_hidden_layers": 12,
+  "pad_token_id": 0,
+  "pooler_fc_size": 768,
+  "pooler_num_attention_heads": 12,
+  "pooler_num_fc_layers": 3,
+  "pooler_size_per_head": 128,
+  "pooler_type": "first_token_transform",
+  "position_embedding_type": "absolute",
+  "torch_dtype": "float32",
+  "transformers_version": "4.16.2",
+  "type_vocab_size": 2,
+  "use_cache": true,
+  "vocab_size": 105879
+}
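This config.json is the standard Hugging Face configuration for the bert-base-multilingual-uncased backbone. If you want to sanity-check the dimensions it declares, a small sketch using only the transformers library (loading the public checkpoint name rather than this repository) could look like this:

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("bert-base-multilingual-uncased")
print(config.model_type)        # "bert"
print(config.hidden_size)       # 768, matches word_embedding_dimension in 1_Pooling
print(config.num_hidden_layers, config.num_attention_heads)  # 12, 12
print(config.vocab_size)        # 105879
```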
config_sentence_transformers.json
ADDED
@@ -0,0 +1,7 @@
+{
+  "__version__": {
+    "sentence_transformers": "2.2.0",
+    "transformers": "4.16.2",
+    "pytorch": "1.10.0+cu111"
+  }
+}
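config_sentence_transformers.json records the library versions the model was exported with. A quick, optional way to compare the local environment against it (a sketch; exact version matching is usually not required):

```python
import sentence_transformers
import torch
import transformers

print(sentence_transformers.__version__)  # exported with 2.2.0
print(transformers.__version__)           # exported with 4.16.2
print(torch.__version__)                  # exported with 1.10.0+cu111
```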
eval/similarity_evaluation_results.csv
ADDED
@@ -0,0 +1,121 @@
+epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
+0,1,0.2723617965841839,0.39798312274382763,0.20072247373635135,0.2733749884662075,0.16921246746819787,0.24413022225819456,0.5021608309158238,0.665000553338728
+0,2,0.27152130979746364,0.4195988195062719,0.19815032015881917,0.2733749884662075,0.16525942747533215,0.24413022225819456,0.49937775308327026,0.665000553338728
+0,3,0.2693370018042312,0.4195988195062719,0.19405765015538418,0.2873616157830832,0.1580449862073486,0.24413022225819456,0.49211693983756316,0.6204976482395779
+0,4,0.27432231886528724,0.4246848658033177,0.19780307161573513,0.2873616157830832,0.1580073963563796,0.24413022225819456,0.48017803360013017,0.6204976482395779
+0,5,0.2826580420402684,0.455201143585592,0.20604369183045168,0.2873616157830832,0.1636027956587941,0.24413022225819456,0.46645094087111794,0.5836238125859964
+0,6,0.29153174818859556,0.44248602784297775,0.21139433199584062,0.2873616157830832,0.16726056665435962,0.2733749884662075,0.45109993392142694,0.5581935811007678
+0,7,0.3117612059669555,0.4653732361796834,0.23056233394606737,0.31533487041683467,0.18685774827768853,0.2873616157830832,0.4279541975525839,0.5302203264670163
+0,8,0.3330742765521085,0.48698893294212775,0.2561574261405012,0.35856626394172325,0.20940383865487175,0.30134824309995895,0.39182261732529716,0.5162336991501406
+0,9,0.4224551370743736,0.5658226505463364,0.4113221232005274,0.49588951396195774,0.3764415604010486,0.4068837037636577,0.3223802125865551,0.43739998154593196
+0,10,0.38119470269748396,0.5454784653581536,0.42274843344598195,0.4513866088628077,0.3876854503322597,0.39798312274382763,0.214450008906603,0.31406335884257325
+0,11,0.3065885700679216,0.4259563773775791,0.3904142541455985,0.4361284699716705,0.35866433312156704,0.37255289125859903,0.09554024992700103,0.16529650465398593
+0,-1,0.3065885700679216,0.4259563773775791,0.3904142541455985,0.4361284699716705,0.35866433312156704,0.37255289125859903,0.09554024992700103,0.16529650465398593
+1,1,0.1943933432472301,0.30134824309995895,0.3174408874703575,0.34839417134763184,0.2827495950162265,0.2949906852286518,-0.2497310364665412,-0.05340348611898007
+1,2,0.12501858619824255,0.24413022225819456,0.27554603382136966,0.2746465000404689,0.23393622700063213,0.2810040579117761,-0.49379158812699825,-0.39035405329825906
+1,3,0.08659426212373905,0.21234243290165886,0.24751770739912926,0.28609010420882175,0.2022220354324426,0.2746465000404689,-0.535305209884874,-0.46791625932820624
+1,4,0.06881140177671863,0.1907267361392145,0.23565201072637978,0.2810040579117761,0.19053226567988926,0.2810040579117761,-0.516364167585159,-0.5047900949817878
+1,5,0.03540940355611812,0.11062150696074444,0.20892080654079875,0.2810040579117761,0.16308880268973236,0.2810040579117761,-0.44954528699284685,-0.4946180023876964
+1,6,0.033765329853119024,0.11062150696074444,0.202206225355386,0.2810040579117761,0.15371825234713002,0.2695604537434232,-0.384228743037684,-0.4132412616349648
+1,7,0.05725381078789925,0.21234243290165886,0.2099428611861031,0.2810040579117761,0.15997596620445914,0.2810040579117761,-0.2862120034186907,-0.2657459190206389
+1,8,0.06078512948380489,0.21234243290165886,0.2117604716645682,0.28863312735734464,0.16361885687884029,0.2810040579117761,-0.2677740531208836,-0.2657459190206389
+1,9,0.06117734996092548,0.22124301392148887,0.21349583635888567,0.31660638199109614,0.16918913188597742,0.2492162685552403,-0.30108171694435426,-0.2670174305949003
+1,10,0.06688583195458593,0.22124301392148887,0.2157748776624316,0.31660638199109614,0.17535780384573338,0.3051627778227432,-0.31384107868905703,-0.32042091671388034
+1,11,0.055123791339285966,0.07883371760420867,0.2074022330455289,0.32550696301092613,0.16801798594578707,0.2695604537434232,-0.3341118181420859,-0.3661953333872919
+1,-1,0.055123791339285966,0.07883371760420867,0.2074022330455289,0.32550696301092613,0.16801798594578707,0.2695604537434232,-0.3341118181420859,-0.3661953333872919
+2,1,0.06473394778444108,0.07883371760420867,0.21126842293324333,0.3433081250505861,0.1737609064154955,0.3000767315256975,-0.3384474621124536,-0.3967116111695662
+2,2,0.07309461888426398,0.07883371760420867,0.2148823450983278,0.3433081250505861,0.1784175002857297,0.29244766208012896,-0.33725149060560655,-0.39925463431808905
+2,3,0.07456009787725182,0.11825057640631301,0.21430979435350409,0.3471226597733704,0.17877582509080797,0.2937191736543904,-0.3317479067140301,-0.3661953333872919
+2,4,0.09175267625249417,0.11825057640631301,0.22576689505024072,0.4081552153379191,0.1911268313358372,0.3700098681100762,-0.3258702845975732,-0.3382220787535404
+2,5,0.10997748122379543,0.17038255095103164,0.23716157461396173,0.4195988195062719,0.20453632995337753,0.34839417134763184,-0.3070558127961322,-0.32550696301092613
+2,6,0.1051060364801319,0.20979940975313596,0.23226726757244756,0.4310424236746248,0.20017918567304907,0.34839417134763184,-0.28058209393225053,-0.27718952318899176
+2,7,0.1109768593830159,0.26320289587211604,0.2334103014401129,0.4310424236746248,0.20319891066246443,0.381453472278429,-0.2387826229894737,-0.15512441205989447
+2,8,0.12643551735726916,0.39035405329825906,0.2369001872862991,0.4780883519222977,0.20605993825934835,0.4323139352488863,-0.14358934887506924,-0.022887208336705742
+2,9,0.13401119120330754,0.43358544682314765,0.23674288136907404,0.5454784653581536,0.2071232861333438,0.481902886645082,-0.06742348026210777,0.14622383104006448
+2,10,0.14028710337460873,0.4450290509915006,0.2358194657776915,0.5480214885066763,0.208821003994841,0.4946180023876964,-0.012546130509426026,0.2822755694860375
+2,11,0.17540318102777186,0.5162336991501406,0.24851285712810522,0.48444590979360486,0.22402699420886985,0.44248602784297775,0.11539730672606956,0.4310424236746248
+2,-1,0.17540318102777186,0.5162336991501406,0.24851285712810522,0.48444590979360486,0.22402699420886985,0.44248602784297775,0.11539730672606956,0.4310424236746248
+3,1,0.1869199916314565,0.4742738171995134,0.2504959591655763,0.43739998154593196,0.2258179736297293,0.3763674259813833,0.16699282119187495,0.4793598634965592
+3,2,0.18601860959652772,0.44884358571428484,0.2503885579769284,0.4399430046944548,0.22517624314255125,0.3763674259813833,0.17236685927608636,0.4068837037636577
+3,3,0.18225615733089767,0.46283021303116056,0.25033992624060447,0.5314918380412778,0.22610616238671424,0.42849940052610197,0.15918632091048912,0.4068837037636577
+3,4,0.1710993725300012,0.5480214885066763,0.24268410643771038,0.5124191644273564,0.2207763756207045,0.4704592824767292,0.12882093842075723,0.3878110301497362
+3,5,0.16367352311748362,0.5124191644273564,0.23886962644780427,0.5251342801699707,0.21975687399357693,0.5124191644273564,0.10042094498774393,0.3661953333872919
+3,6,0.14656935077518435,0.5060616065560491,0.22638530772150908,0.5276773033184935,0.20895712845032913,0.5251342801699707,0.036789927859192724,0.3382220787535404
+3,7,0.1252668642846303,0.4437575394172391,0.2098331355314569,0.4793598634965592,0.19461005125389177,0.43358544682314765,-0.046811310134782756,0.1474953426143259
+3,8,0.10368051425465545,0.3433081250505861,0.19335895677371973,0.464101724605422,0.1803931271509606,0.41451277320922625,-0.1250099049355098,-0.0012715115742614302
+3,9,0.0767337768923914,0.09917790279239155,0.17192919593213857,0.45774416673411483,0.16164978865965418,0.44630056256576195,-0.1889526156134819,-0.13986627316875733
+3,10,0.05404547800859963,0.006357557871307151,0.15288671623196917,0.35220870607041616,0.145261628718692,0.4081552153379191,-0.21603083579501564,-0.1678395278025088
+3,11,0.05476784704804881,0.006357557871307151,0.1502461423216594,0.3661953333872919,0.14363854136866386,0.42849940052610197,-0.2088727512415622,-0.13986627316875733
+3,-1,0.05476784704804881,0.006357557871307151,0.1502461423216594,0.3661953333872919,0.14363854136866386,0.42849940052610197,-0.2088727512415622,-0.13986627316875733
+4,1,0.051639714054161456,0.020344185188182883,0.14524701257156492,0.35220870607041616,0.13938695747663735,0.4094267269121805,-0.20325725583523857,-0.08773429862403868
+4,2,0.05661595674307914,0.03305930093079718,0.1485270736008929,0.4170557963577491,0.14303949223328996,0.46028718988263767,-0.16922144180432253,-0.05976104399028721
+4,3,0.06265899856180754,0.21361394447592028,0.15356599468742552,0.4590156783083763,0.1477389457153986,0.39798312274382763,-0.12026511379855122,0.00762906944556858
+4,4,0.0656597474676127,0.3077058009712661,0.15530342065067765,0.44884358571428484,0.149953467464331,0.4310424236746248,-0.0712707326397536,0.11062150696074444
+4,5,0.07429295878857324,0.35856626394172325,0.1605813832907864,0.4920749792391734,0.15511113920474653,0.5213197454471864,-0.004306991759246613,0.2975337083771747
+4,6,0.07168939673086101,0.36238079866450756,0.15546831672732386,0.4742738171995134,0.15003450237431046,0.49970404868474205,-0.0013334664698200808,0.3064342893970047
+4,7,0.06668562756213939,0.3445796366248476,0.1490722141961013,0.49588951396195774,0.14353904809560517,0.4780883519222977,-0.001200317885120482,0.29244766208012896
+4,8,0.046813003572496424,0.19962731715904453,0.13187363938919322,0.36873835653581477,0.12804802913046306,0.45647265515985347,-0.07051753441756184,0.12715115742614302
+4,9,0.035461941305222275,0.12587964585188158,0.12014545440700042,0.37255289125859903,0.11678583545504596,0.4157842847834876,-0.11359454378017365,-0.048317439821934344
+4,10,0.02703353867025072,0.054674997693241495,0.10946417922751334,0.29244766208012896,0.10675629737300366,0.31406335884257325,-0.1515537269023842,-0.05976104399028721
+4,11,0.014524322323588986,-0.00890058101983001,0.09343738454105212,0.23141510651558028,0.09249027377198321,0.23395812966410312,-0.19116851840909949,-0.07756220602994723
+4,-1,0.014524322323588986,-0.00890058101983001,0.09343738454105212,0.23141510651558028,0.09249027377198321,0.23395812966410312,-0.19116851840909949,-0.07756220602994723
+5,1,0.01000913965241748,-0.0012715115742614302,0.08574829693019523,0.23141510651558028,0.08559416265912986,0.23777266438688743,-0.20137305153811794,-0.10553546066369869
+5,2,0.004741419026070648,-0.00890058101983001,0.08031274038431056,0.21361394447592028,0.08027465599757022,0.23141510651558028,-0.19993779382718171,-0.10553546066369869
+5,3,0.0013558490264173276,0.020344185188182883,0.07732325043780942,0.1894552245649531,0.07769506284836727,0.24158719910967172,-0.1982036517499212,-0.10553546066369869
+5,4,-0.0020598978743988196,0.05340348611898007,0.07380780137544733,0.1894552245649531,0.07430304388351429,0.2352296412383646,-0.20002903506449282,-0.10553546066369869
+5,5,0.00636754354522943,0.020344185188182883,0.08170439583136582,0.2352296412383646,0.08117486864775043,0.26320289587211604,-0.17938060141616086,-0.07756220602994723
+5,6,0.027962971309693954,0.1894552245649531,0.10464659733436256,0.3471226597733704,0.10280066277394842,0.36110928709024614,-0.10766631447289772,-0.048317439821934344
+5,7,0.04781207021787432,0.29244766208012896,0.12396545033874441,0.4195988195062719,0.12059246534734563,0.48444590979360486,-0.021154036060140824,0.11443604168352871
+5,8,0.07285443940667027,0.48444590979360486,0.14682585749597557,0.517505210724402,0.1414406225676396,0.5213197454471864,0.09003085018636132,0.4984325371104806
+5,9,0.08482703876144827,0.5276773033184935,0.15796631286607957,0.5047900949817878,0.1511685169649009,0.5111476528530949,0.14085021937280992,0.5391209074868464
+5,10,0.0866120762548214,0.5416639306353692,0.16169457791118957,0.5403924190611078,0.15415964651905512,0.5543790463779835,0.15812950456884647,0.5086046297045721
+5,11,0.08725157867154813,0.5632796273978136,0.16546893337958138,0.5187767222986636,0.15753949931367497,0.5111476528530949,0.1598557295056483,0.5086046297045721
+5,-1,0.08725157867154813,0.5632796273978136,0.16546893337958138,0.5187767222986636,0.15753949931367497,0.5111476528530949,0.1598557295056483,0.5086046297045721
+6,1,0.08548048869736076,0.5531075348037221,0.166508725377984,0.573451719991905,0.1579798771437692,0.5442069537838921,0.14913813403132498,0.5480214885066763
+6,2,0.08370739774321072,0.573451719991905,0.16709231637948704,0.5759947431404279,0.15798711671382096,0.5581935811007678,0.13400806978513838,0.5047900949817878
+6,3,0.08347285230649279,0.5518360232294607,0.16895930827140337,0.5899813704573036,0.15936111407368345,0.5798092778632121,0.11935873476725338,0.5429354422096307
+6,4,0.0807282891244177,0.4857174213678663,0.166249659135323,0.5505645116551992,0.15639885052862776,0.582352301011735,0.07745214253998337,0.5454784653581536
+6,5,0.07777879211706773,0.44630056256576195,0.16343387605904378,0.5505645116551992,0.15353803423660622,0.5899813704573036,0.03245946665305255,0.2937191736543904
+6,6,0.07272310307470728,0.3064342893970047,0.15901364653927255,0.464101724605422,0.14921580820376526,0.5543790463779835,-0.017475405414934916,0.18055464354512307
+6,7,0.06789651281332085,0.2593883611493318,0.15852612807283106,0.464101724605422,0.14907782739277758,0.5543790463779835,-0.03388152854560199,0.06866162501011723
+6,8,0.07051566748000847,0.3064342893970047,0.1653490355750244,0.464101724605422,0.15594698220077144,0.5759947431404279,-0.017033015949872592,0.18055464354512307
+6,9,0.07261672912809267,0.3064342893970047,0.1705929125032956,0.5035185834075263,0.16108556005608463,0.5759947431404279,-0.007926538947114859,0.18055464354512307
+6,10,0.07535247104904466,0.3344075440307561,0.1750452997477017,0.5403924190611078,0.1658229631272131,0.5759947431404279,0.0005318603590332183,0.18055464354512307
+6,11,0.08081004897832052,0.4501150972885462,0.17845698670027962,0.5620081158235521,0.16936458885038475,0.546749976932415,0.03280579078522653,0.28481859263456033
+6,-1,0.08081004897832052,0.4501150972885462,0.17845698670027962,0.5620081158235521,0.16936458885038475,0.546749976932415,0.03280579078522653,0.28481859263456033
+7,1,0.08654685626680213,0.44630056256576195,0.1804528834128493,0.5759947431404279,0.1714931504683704,0.5607366042492907,0.0589354808185472,0.36238079866450756
+7,2,0.08637264913751913,0.43867149312019343,0.17804441496631546,0.5620081158235521,0.1692760737561756,0.5391209074868464,0.06401675018403982,0.36238079866450756
+7,3,0.08234111031091161,0.43867149312019343,0.174608682599553,0.5480214885066763,0.16588411629535899,0.5391209074868464,0.05176870555117991,0.36238079866450756
+7,4,0.07196711305113837,0.3661953333872919,0.16684411362221777,0.5480214885066763,0.15821491299698462,0.5251342801699707,0.022338805454793623,0.27591801161473034
+7,5,0.056854126299692595,0.2937191736543904,0.15556013814310227,0.5035185834075263,0.1467927388986446,0.5543790463779835,-0.031775042093216854,0.1856406898421688
+7,6,0.04542855716955376,0.18691220141643022,0.14587714996490553,0.4742738171995134,0.13671022194898896,0.5403924190611078,-0.07937524159638158,0.03560232407932004
+7,7,0.03282509180313787,0.11697906483205156,0.13319841134212257,0.42341335422905624,0.12338011461499167,0.4742738171995134,-0.1281190405753703,-0.04450290509915005
+7,8,0.025757602279561833,0.07120464815864008,0.12483561979478405,0.36238079866450756,0.11450033564905135,0.3801819607041676,-0.15231230482192112,-0.04450290509915005
+7,9,0.024168621302116312,0.07120464815864008,0.12170577550398402,0.3293214977337104,0.1108806552274751,0.3661953333872919,-0.15958503382747447,-0.07247615973290152
+7,10,0.02267796262706389,0.07120464815864008,0.1187955864593094,0.3293214977337104,0.10761084934181707,0.3661953333872919,-0.16598576482779168,-0.07247615973290152
+7,11,0.023388826082099228,0.08137674075273153,0.11797300586381289,0.3763674259813833,0.10620253295807569,0.3801819607041676,-0.16536542070882912,-0.07247615973290152
+7,-1,0.023388826082099228,0.08137674075273153,0.11797300586381289,0.3763674259813833,0.10620253295807569,0.3801819607041676,-0.16536542070882912,-0.07247615973290152
+8,1,0.02494409484222196,0.0839197639012544,0.11891626015499254,0.3763674259813833,0.1066560959346107,0.3941685880210433,-0.15927421717955806,-0.04450290509915005
+8,2,0.03027209103737706,0.1296941805746659,0.12461811536918009,0.42341335422905624,0.1120758389712893,0.42722788895184055,-0.1394948157871992,-0.04450290509915005
+8,3,0.03430537913969008,0.13350871529745015,0.12925106545245466,0.43739998154593196,0.11693301836468088,0.4933464908134349,-0.1140385856878783,-0.04450290509915005
+8,4,0.03834274679055679,0.25048778012950174,0.1337550401534354,0.45647265515985347,0.12203471769855208,0.5086046297045721,-0.07866513015845768,0.1296941805746659
+8,5,0.04261715576610289,0.32296393986240324,0.1380598738209478,0.5149621875758792,0.1268718388933669,0.5238627685957092,-0.04297475329712595,0.18055464354512307
+8,6,0.046906657092389954,0.36238079866450756,0.1417308580597271,0.5492930000809378,0.13104649336581708,0.5162336991501406,-0.008488369638671523,0.21615696762444314
+8,7,0.04865731290070672,0.3763674259813833,0.1427782765504746,0.5531075348037221,0.13242010053506065,0.5302203264670163,0.010800153185155946,0.31024882411978894
+8,8,0.049757412987256326,0.4094267269121805,0.14324673128531962,0.5531075348037221,0.13311303875684358,0.5302203264670163,0.02470947584461457,0.31024882411978894
+8,9,0.04891872917046294,0.4094267269121805,0.1421535736433913,0.5531075348037221,0.13234069886746605,0.5302203264670163,0.02848763513689862,0.32804998615944897
+8,10,0.049102259063545525,0.4094267269121805,0.14198535637200904,0.5531075348037221,0.13252479806276432,0.5225912570214478,0.03460523816083461,0.4068837037636577
+8,11,0.04936391820080126,0.46028718988263767,0.1418841103465879,0.5314918380412778,0.13271331352943874,0.5225912570214478,0.039759511050182744,0.48444590979360486
+8,-1,0.04936391820080126,0.46028718988263767,0.1418841103465879,0.5314918380412778,0.13271331352943874,0.5225912570214478,0.039759511050182744,0.48444590979360486
+9,1,0.04843700824903956,0.46028718988263767,0.1407881336411059,0.5314918380412778,0.13187142299429938,0.5225912570214478,0.03989951890979034,0.48444590979360486
+9,2,0.04851210326510303,0.45774416673411483,0.14073390992317397,0.5314918380412778,0.13196281254677575,0.5225912570214478,0.042680923087821836,0.4704592824767292
+9,3,0.04811762989815579,0.45774416673411483,0.14034155999150186,0.5314918380412778,0.13172758182011032,0.5009755602590035,0.04279903717690045,0.4704592824767292
+9,4,0.04627064192172725,0.4348569583974091,0.13891903422544766,0.5314918380412778,0.1303626658950121,0.5009755602590035,0.035326652542178566,0.4195988195062719
+9,5,0.044542058169610224,0.4348569583974091,0.13765878527282618,0.5531075348037221,0.12910071374703508,0.48698893294212775,0.027591144513336618,0.3293214977337104
+9,6,0.0424815061119831,0.3763674259813833,0.13609572681072168,0.5531075348037221,0.12748292559719152,0.48698893294212775,0.018363807203107253,0.2962621968029132
+9,7,0.04035606462388307,0.36238079866450756,0.1343388580764968,0.5531075348037221,0.12564885922391714,0.4946180023876964,0.007921672866498305,0.2962621968029132
+9,8,0.03806967836104378,0.36238079866450756,0.13246769987445375,0.5492930000809378,0.12371031827221535,0.4946180023876964,-0.0031854413566032394,0.2530308032780246
+9,9,0.036363545417557396,0.36238079866450756,0.1310487771860485,0.5124191644273564,0.12225348776360867,0.4946180023876964,-0.011949548282365148,0.18436917826790739
+9,10,0.035275066956678904,0.36238079866450756,0.13012841105938525,0.5124191644273564,0.12130104502591009,0.4882604445163891,-0.017491012666853635,0.16656801622824735
+9,11,0.03480911531666855,0.34839417134763184,0.1297338290341687,0.4793598634965592,0.12088885220939423,0.4882604445163891,-0.019882883795760784,0.16656801622824735
+9,-1,0.03480911531666855,0.34839417134763184,0.1297338290341687,0.4793598634965592,0.12088885220939423,0.4882604445163891,-0.019882883795760784,0.16656801622824735
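The evaluation CSV above records Pearson and Spearman correlations for four similarity measures after every training step (rows with steps = -1 mark the end of an epoch). A short, optional sketch for inspecting it, assuming pandas is installed and the file has been downloaded locally:

```python
import pandas as pd

df = pd.read_csv("eval/similarity_evaluation_results.csv")

# End-of-epoch rows are marked with steps == -1
epoch_end = df[df["steps"] == -1]
print(epoch_end[["epoch", "cosine_spearman", "dot_spearman"]])

# Best cosine Spearman correlation seen at any evaluation step
best = df.loc[df["cosine_spearman"].idxmax()]
print(best[["epoch", "steps", "cosine_spearman"]])
```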
modules.json
ADDED
@@ -0,0 +1,20 @@
+[
+  {
+    "idx": 0,
+    "name": "0",
+    "path": "",
+    "type": "sentence_transformers.models.Transformer"
+  },
+  {
+    "idx": 1,
+    "name": "1",
+    "path": "1_Pooling",
+    "type": "sentence_transformers.models.Pooling"
+  },
+  {
+    "idx": 2,
+    "name": "2",
+    "path": "2_Dense",
+    "type": "sentence_transformers.models.Dense"
+  }
+]
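modules.json lists the three modules that sentence-transformers chains together when the model is loaded. For illustration only, a roughly equivalent model can be assembled by hand with the `models` API; this is a sketch of the construction pattern, not the exact code used to create this checkpoint:

```python
from torch import nn
from sentence_transformers import SentenceTransformer, models

# (0) Transformer backbone, truncating inputs to 256 tokens
word_embedding_model = models.Transformer("bert-base-multilingual-uncased", max_seq_length=256)

# (1) Mean pooling over the 768-d token embeddings
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),
    pooling_mode_mean_tokens=True,
)

# (2) Dense projection 768 -> 256 with Tanh
dense_model = models.Dense(in_features=768, out_features=256, activation_function=nn.Tanh())

model = SentenceTransformer(modules=[word_embedding_model, pooling_model, dense_model])
```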
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2c444eaba015685d9b72f257b54ca4cdf4d25766e3277abc803efcadad6bc183
+size 669506993
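pytorch_model.bin is stored as a Git LFS pointer; the roughly 670 MB of weights are downloaded separately. One way to fetch the whole repository, shown as a sketch with a placeholder repository id, is the huggingface_hub client:

```python
from huggingface_hub import snapshot_download

# Placeholder repo id; substitute the actual namespace/model name
local_dir = snapshot_download("namespace/model-id")
print(local_dir)  # local path containing pytorch_model.bin and the config files
```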
sentence_bert_config.json
ADDED
@@ -0,0 +1,4 @@
+{
+  "max_seq_length": 256,
+  "do_lower_case": false
+}
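sentence_bert_config.json sets the 256-token limit used by the Transformer module; longer inputs are truncated before pooling. The limit is also exposed at runtime, as in this small sketch (the model path is a placeholder):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("path/to/this-model")  # placeholder path
print(model.max_seq_length)  # 256, taken from sentence_bert_config.json

# The limit can be lowered (or raised up to the backbone's 512 positions) if needed
model.max_seq_length = 128
```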
similarity_evaluation_sts-test_results.csv
ADDED
@@ -0,0 +1,2 @@
+epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,manhattan_pearson,manhattan_spearman,dot_pearson,dot_spearman
+-1,-1,0.7745657088673157,0.5141633254097755,0.7920986800309122,0.5250023563215429,0.7932546853140852,0.5457216432968104,0.7558137932347831,0.2677744398262415
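This sts-test CSV has the same columns as the training-time evaluation file and contains a single row written after training (epoch and steps are both -1). A sketch of how such a file is produced with sentence-transformers, using made-up sentence pairs and gold scores in place of the real test set:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("path/to/this-model")  # placeholder path

# Hypothetical test pairs with gold similarity scores in [0, 1]
sentences1 = ["A man is playing a guitar", "A dog runs in the park"]
sentences2 = ["Someone plays an instrument", "A cat sleeps on the sofa"]
scores = [0.9, 0.1]

evaluator = EmbeddingSimilarityEvaluator(sentences1, sentences2, scores, name="sts-test")
# Writes similarity_evaluation_sts-test_results.csv into the given directory
evaluator(model, output_path=".")
```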
special_tokens_map.json
ADDED
@@ -0,0 +1 @@
+{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json
ADDED
The diff for this file is too large to render. See raw diff.
tokenizer_config.json
ADDED
@@ -0,0 +1 @@
+{"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "bert-base-multilingual-uncased", "tokenizer_class": "BertTokenizer"}
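tokenizer_config.json shows that the tokenizer lowercases input (matching the uncased backbone) and caps sequences at 512 positions. It can be loaded directly with transformers, sketched here with a placeholder path for this repository:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("path/to/this-model")  # placeholder path
encoded = tokenizer("This Is An Example Sentence")
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))
# Lowercased WordPiece tokens wrapped in [CLS] ... [SEP]
```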
vocab.txt
ADDED
The diff for this file is too large to render. See raw diff.