zemaia committed · verified
Commit 3ad0475 · 1 Parent(s): 1a62a91

End of training

Files changed (3)
  1. README.md +159 -57
  2. model.safetensors +1 -1
  3. training_args.bin +1 -1
README.md CHANGED
@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [adalbertojunior/distilbert-portuguese-cased](https://huggingface.co/adalbertojunior/distilbert-portuguese-cased) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.3316
+- Loss: 0.7396
 
 ## Model description
 
@@ -45,62 +45,164 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss |
-|:-------------:|:-------:|:----:|:---------------:|
-| 6.8051 | 1.5385 | 100 | 5.4555 |
-| 5.0899 | 3.0769 | 200 | 4.5164 |
-| 4.3939 | 4.6154 | 300 | 3.9739 |
-| 3.948 | 6.1538 | 400 | 3.6053 |
-| 3.6364 | 7.6923 | 500 | 3.3412 |
-| 3.391 | 9.2308 | 600 | 3.1273 |
-| 3.2013 | 10.7692 | 700 | 2.9847 |
-| 3.0531 | 12.3077 | 800 | 2.7862 |
-| 2.9261 | 13.8462 | 900 | 2.7109 |
-| 2.8132 | 15.3846 | 1000 | 2.5933 |
-| 2.698 | 16.9231 | 1100 | 2.5080 |
-| 2.614 | 18.4615 | 1200 | 2.4453 |
-| 2.5423 | 20.0 | 1300 | 2.3267 |
-| 2.4675 | 21.5385 | 1400 | 2.2761 |
-| 2.4105 | 23.0769 | 1500 | 2.2385 |
-| 2.3405 | 24.6154 | 1600 | 2.1717 |
-| 2.2862 | 26.1538 | 1700 | 2.1134 |
-| 2.2324 | 27.6923 | 1800 | 2.0816 |
-| 2.1954 | 29.2308 | 1900 | 2.0319 |
-| 2.145 | 30.7692 | 2000 | 1.9917 |
-| 2.1001 | 32.3077 | 2100 | 1.9738 |
-| 2.0688 | 33.8462 | 2200 | 1.9215 |
-| 2.024 | 35.3846 | 2300 | 1.8910 |
-| 1.99 | 36.9231 | 2400 | 1.8697 |
-| 1.9524 | 38.4615 | 2500 | 1.8273 |
-| 1.9288 | 40.0 | 2600 | 1.7972 |
-| 1.8957 | 41.5385 | 2700 | 1.7696 |
-| 1.8713 | 43.0769 | 2800 | 1.7486 |
-| 1.8427 | 44.6154 | 2900 | 1.7461 |
-| 1.8155 | 46.1538 | 3000 | 1.6993 |
-| 1.7914 | 47.6923 | 3100 | 1.6781 |
-| 1.766 | 49.2308 | 3200 | 1.6483 |
-| 1.738 | 50.7692 | 3300 | 1.6227 |
-| 1.7167 | 52.3077 | 3400 | 1.5883 |
-| 1.6781 | 53.8462 | 3500 | 1.5695 |
-| 1.658 | 55.3846 | 3600 | 1.5469 |
-| 1.6276 | 56.9231 | 3700 | 1.5451 |
-| 1.6131 | 58.4615 | 3800 | 1.5046 |
-| 1.5916 | 60.0 | 3900 | 1.4997 |
-| 1.5746 | 61.5385 | 4000 | 1.4709 |
-| 1.5638 | 63.0769 | 4100 | 1.4615 |
-| 1.5395 | 64.6154 | 4200 | 1.4542 |
-| 1.5291 | 66.1538 | 4300 | 1.4323 |
-| 1.5115 | 67.6923 | 4400 | 1.4036 |
-| 1.4885 | 69.2308 | 4500 | 1.3822 |
-| 1.4706 | 70.7692 | 4600 | 1.3732 |
-| 1.4611 | 72.3077 | 4700 | 1.3625 |
-| 1.4393 | 73.8462 | 4800 | 1.3662 |
-| 1.4321 | 75.3846 | 4900 | 1.3556 |
-| 1.4108 | 76.9231 | 5000 | 1.3566 |
-| 1.4054 | 78.4615 | 5100 | 1.2953 |
-| 1.3863 | 80.0 | 5200 | 1.3333 |
-| 1.3674 | 81.5385 | 5300 | 1.3210 |
-| 1.3578 | 83.0769 | 5400 | 1.3029 |
+| Training Loss | Epoch | Step | Validation Loss |
+|:-------------:|:--------:|:-----:|:---------------:|
+| 6.7563 | 1.6949 | 100 | 5.4137 |
+| 5.0553 | 3.3898 | 200 | 4.4824 |
+| 4.3687 | 5.0847 | 300 | 3.9332 |
+| 3.9319 | 6.7797 | 400 | 3.5644 |
+| 3.6101 | 8.4746 | 500 | 3.2889 |
+| 3.3843 | 10.1695 | 600 | 3.0760 |
+| 3.1869 | 11.8644 | 700 | 2.9195 |
+| 3.0395 | 13.5593 | 800 | 2.7842 |
+| 2.9038 | 15.2542 | 900 | 2.6563 |
+| 2.7768 | 16.9492 | 1000 | 2.5554 |
+| 2.6835 | 18.6441 | 1100 | 2.4614 |
+| 2.5903 | 20.3390 | 1200 | 2.3882 |
+| 2.5214 | 22.0339 | 1300 | 2.3210 |
+| 2.4401 | 23.7288 | 1400 | 2.2352 |
+| 2.373 | 25.4237 | 1500 | 2.2145 |
+| 2.3147 | 27.1186 | 1600 | 2.1609 |
+| 2.2606 | 28.8136 | 1700 | 2.0704 |
+| 2.2064 | 30.5085 | 1800 | 2.0260 |
+| 2.1572 | 32.2034 | 1900 | 2.0259 |
+| 2.1258 | 33.8983 | 2000 | 1.9498 |
+| 2.0683 | 35.5932 | 2100 | 1.9212 |
+| 2.0374 | 37.2881 | 2200 | 1.8884 |
+| 1.9998 | 38.9831 | 2300 | 1.8543 |
+| 1.9582 | 40.6780 | 2400 | 1.8106 |
+| 1.932 | 42.3729 | 2500 | 1.7822 |
+| 1.8862 | 44.0678 | 2600 | 1.7673 |
+| 1.8677 | 45.7627 | 2700 | 1.7280 |
+| 1.8375 | 47.4576 | 2800 | 1.7147 |
+| 1.8128 | 49.1525 | 2900 | 1.6882 |
+| 1.7874 | 50.8475 | 3000 | 1.6357 |
+| 1.7628 | 52.5424 | 3100 | 1.6502 |
+| 1.7391 | 54.2373 | 3200 | 1.6312 |
+| 1.709 | 55.9322 | 3300 | 1.5989 |
+| 1.6878 | 57.6271 | 3400 | 1.5503 |
+| 1.6605 | 59.3220 | 3500 | 1.5602 |
+| 1.6331 | 61.0169 | 3600 | 1.5486 |
+| 1.6206 | 62.7119 | 3700 | 1.5046 |
+| 1.6057 | 64.4068 | 3800 | 1.5098 |
+| 1.5877 | 66.1017 | 3900 | 1.4885 |
+| 1.5576 | 67.7966 | 4000 | 1.4747 |
+| 1.5413 | 69.4915 | 4100 | 1.4500 |
+| 1.5142 | 71.1864 | 4200 | 1.3917 |
+| 1.4847 | 72.8814 | 4300 | 1.3771 |
+| 1.4665 | 74.5763 | 4400 | 1.3737 |
+| 1.4562 | 76.2712 | 4500 | 1.3560 |
+| 1.4422 | 77.9661 | 4600 | 1.3394 |
+| 1.4148 | 79.6610 | 4700 | 1.3453 |
+| 1.4108 | 81.3559 | 4800 | 1.3261 |
+| 1.3992 | 83.0508 | 4900 | 1.3111 |
+| 1.3784 | 84.7458 | 5000 | 1.3083 |
+| 1.3607 | 86.4407 | 5100 | 1.2982 |
+| 1.352 | 88.1356 | 5200 | 1.2758 |
+| 1.3353 | 89.8305 | 5300 | 1.2818 |
+| 1.3173 | 91.5254 | 5400 | 1.2697 |
+| 1.3085 | 93.2203 | 5500 | 1.2440 |
+| 1.2955 | 94.9153 | 5600 | 1.2099 |
+| 1.2933 | 96.6102 | 5700 | 1.2337 |
+| 1.2757 | 98.3051 | 5800 | 1.2056 |
+| 1.262 | 100.0 | 5900 | 1.1993 |
+| 1.2509 | 101.6949 | 6000 | 1.1933 |
+| 1.2418 | 103.3898 | 6100 | 1.1645 |
+| 1.2275 | 105.0847 | 6200 | 1.1820 |
+| 1.2219 | 106.7797 | 6300 | 1.1452 |
+| 1.216 | 108.4746 | 6400 | 1.1709 |
+| 1.1954 | 110.1695 | 6500 | 1.1386 |
+| 1.1858 | 111.8644 | 6600 | 1.1336 |
+| 1.1799 | 113.5593 | 6700 | 1.1217 |
+| 1.1707 | 115.2542 | 6800 | 1.1102 |
+| 1.1653 | 116.9492 | 6900 | 1.1093 |
+| 1.1476 | 118.6441 | 7000 | 1.1032 |
+| 1.1406 | 120.3390 | 7100 | 1.1004 |
+| 1.1364 | 122.0339 | 7200 | 1.0698 |
+| 1.1173 | 123.7288 | 7300 | 1.0817 |
+| 1.1129 | 125.4237 | 7400 | 1.0825 |
+| 1.1077 | 127.1186 | 7500 | 1.0728 |
+| 1.0943 | 128.8136 | 7600 | 1.0496 |
+| 1.0881 | 130.5085 | 7700 | 1.0443 |
+| 1.0774 | 132.2034 | 7800 | 1.0392 |
+| 1.0789 | 133.8983 | 7900 | 1.0470 |
+| 1.0608 | 135.5932 | 8000 | 1.0248 |
+| 1.0516 | 137.2881 | 8100 | 1.0144 |
+| 1.0533 | 138.9831 | 8200 | 1.0246 |
+| 1.0401 | 140.6780 | 8300 | 1.0180 |
+| 1.0347 | 142.3729 | 8400 | 0.9903 |
+| 1.0268 | 144.0678 | 8500 | 0.9809 |
+| 1.016 | 145.7627 | 8600 | 0.9839 |
+| 1.003 | 147.4576 | 8700 | 0.9870 |
+| 1.0066 | 149.1525 | 8800 | 0.9610 |
+| 1.004 | 150.8475 | 8900 | 0.9488 |
+| 0.9918 | 152.5424 | 9000 | 0.9601 |
+| 0.996 | 154.2373 | 9100 | 0.9660 |
+| 0.9835 | 155.9322 | 9200 | 0.9376 |
+| 0.9801 | 157.6271 | 9300 | 0.9504 |
+| 0.9606 | 159.3220 | 9400 | 0.9482 |
+| 0.9646 | 161.0169 | 9500 | 0.9312 |
+| 0.9637 | 162.7119 | 9600 | 0.9304 |
+| 0.9528 | 164.4068 | 9700 | 0.9270 |
+| 0.9432 | 166.1017 | 9800 | 0.9205 |
+| 0.9398 | 167.7966 | 9900 | 0.9202 |
+| 0.9377 | 169.4915 | 10000 | 0.9167 |
+| 0.9282 | 171.1864 | 10100 | 0.9122 |
+| 0.9118 | 172.8814 | 10200 | 0.9034 |
+| 0.907 | 174.5763 | 10300 | 0.8839 |
+| 0.9152 | 176.2712 | 10400 | 0.8879 |
+| 0.9124 | 177.9661 | 10500 | 0.8885 |
+| 0.9005 | 179.6610 | 10600 | 0.8832 |
+| 0.8979 | 181.3559 | 10700 | 0.8767 |
+| 0.8836 | 183.0508 | 10800 | 0.8886 |
+| 0.882 | 184.7458 | 10900 | 0.8601 |
+| 0.8818 | 186.4407 | 11000 | 0.8713 |
+| 0.8724 | 188.1356 | 11100 | 0.8602 |
+| 0.8688 | 189.8305 | 11200 | 0.8510 |
+| 0.8677 | 191.5254 | 11300 | 0.8401 |
+| 0.8643 | 193.2203 | 11400 | 0.8453 |
+| 0.8638 | 194.9153 | 11500 | 0.8351 |
+| 0.8539 | 196.6102 | 11600 | 0.8460 |
+| 0.852 | 198.3051 | 11700 | 0.8474 |
+| 0.8433 | 200.0 | 11800 | 0.8249 |
+| 0.8394 | 201.6949 | 11900 | 0.8326 |
+| 0.8339 | 203.3898 | 12000 | 0.8331 |
+| 0.8284 | 205.0847 | 12100 | 0.8216 |
+| 0.8284 | 206.7797 | 12200 | 0.8148 |
+| 0.8261 | 208.4746 | 12300 | 0.8020 |
+| 0.8158 | 210.1695 | 12400 | 0.8112 |
+| 0.8148 | 211.8644 | 12500 | 0.8154 |
+| 0.8118 | 213.5593 | 12600 | 0.8058 |
+| 0.8067 | 215.2542 | 12700 | 0.8005 |
+| 0.8022 | 216.9492 | 12800 | 0.8021 |
+| 0.793 | 218.6441 | 12900 | 0.8000 |
+| 0.8003 | 220.3390 | 13000 | 0.7924 |
+| 0.7891 | 222.0339 | 13100 | 0.7891 |
+| 0.7802 | 223.7288 | 13200 | 0.7678 |
+| 0.7906 | 225.4237 | 13300 | 0.7902 |
+| 0.7756 | 227.1186 | 13400 | 0.7774 |
+| 0.7788 | 228.8136 | 13500 | 0.7639 |
+| 0.7654 | 230.5085 | 13600 | 0.7767 |
+| 0.7686 | 232.2034 | 13700 | 0.7831 |
+| 0.7691 | 233.8983 | 13800 | 0.7735 |
+| 0.7656 | 235.5932 | 13900 | 0.7632 |
+| 0.7597 | 237.2881 | 14000 | 0.7694 |
+| 0.7562 | 238.9831 | 14100 | 0.7475 |
+| 0.754 | 240.6780 | 14200 | 0.7585 |
+| 0.7461 | 242.3729 | 14300 | 0.7502 |
+| 0.749 | 244.0678 | 14400 | 0.7533 |
+| 0.7482 | 245.7627 | 14500 | 0.7308 |
+| 0.7436 | 247.4576 | 14600 | 0.7581 |
+| 0.7395 | 249.1525 | 14700 | 0.7118 |
+| 0.7339 | 250.8475 | 14800 | 0.7458 |
+| 0.7337 | 252.5424 | 14900 | 0.7232 |
+| 0.7262 | 254.2373 | 15000 | 0.7421 |
+| 0.7313 | 255.9322 | 15100 | 0.7097 |
+| 0.7223 | 257.6271 | 15200 | 0.7235 |
+| 0.7189 | 259.3220 | 15300 | 0.7222 |
+| 0.7228 | 261.0169 | 15400 | 0.7373 |
+| 0.7163 | 262.7119 | 15500 | 0.7247 |
+| 0.7102 | 264.4068 | 15600 | 0.7255 |
 
 
 ### Framework versions
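As a quick consistency check on the training-results tables above: the Epoch column is just Step divided by the number of optimizer steps per epoch, rounded to four decimals. The per-epoch step counts (65 for the old run, 59 for the new run) are inferred from the rows where the epoch lands on a whole number (e.g. step 5900 at epoch 100.0); they are not stated in the log.

```python
# Reconstruct the Epoch column from the Step column.
# steps_per_epoch is inferred from the tables, not stated in the log:
# old run = 65 (step 5200 -> epoch 80.0), new run = 59 (step 5900 -> epoch 100.0).
def epoch_at(step: int, steps_per_epoch: int) -> float:
    """Fractional epoch reached after `step` optimizer steps, rounded to 4 dp."""
    return round(step / steps_per_epoch, 4)

print(epoch_at(100, 65))    # first row of the old table -> 1.5385
print(epoch_at(100, 59))    # first row of the new table -> 1.6949
print(epoch_at(15600, 59))  # last row of the new table  -> 264.4068
```

This also explains why the new table ends near epoch 264: the run logged evaluation every 100 steps over far more epochs than the old 83-epoch run.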
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:3003b9f0ef33ba8fb1f25e25fd39af65c5b57d5e7f1f193fbd1261a968f70479
+oid sha256:2a93097d33bbc11a783151449a953a4f061235938c9f92a41b4c38b395f4f8d8
 size 265721304
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:6bec2265ab9ae8b62c00c6f580115e0773708091c98d996e7c77d2eeba90febb
+oid sha256:b665a1bf51abfeac066e530b1a26e87df0501d84d6b8facff0975c1317d4c68c
 size 5240
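The two binary files above are Git LFS pointer files (spec v1): the repo stores only a `version` line, the SHA-256 `oid` of the real blob, and its byte `size`. A downloaded checkpoint can be checked against its pointer locally; this is a minimal sketch, with the file path left to the caller:

```python
import hashlib

def lfs_pointer(path: str) -> str:
    """Build the git-lfs v1 pointer text for a local file (sha256 oid + size)."""
    h = hashlib.sha256()
    size = 0
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks so multi-GB weight files don't load into memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
            size += len(chunk)
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{h.hexdigest()}\n"
        f"size {size}\n"
    )
```

Running this on the downloaded `model.safetensors` should reproduce the `+oid` and `size` lines of the new pointer above; a mismatch means the download is corrupt or stale.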