Coaster41 committed
Commit ff662ad · verified · 1 Parent(s): 23f71c7

Model save

Files changed (2):
  1. README.md +109 -28
  2. model.safetensors +1 -1
README.md CHANGED
@@ -14,11 +14,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.1559
- - Mse: 274.9792
- - Mae: 0.6475
- - Rmse: 16.5825
- - Smape: 90.3816
 
 ## Model description
 
@@ -50,29 +50,110 @@ The following hyperparameters were used during training:
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Mse | Mae | Rmse | Smape |
- |:-------------:|:------:|:-----:|:---------------:|:--------:|:------:|:-------:|:--------:|
- | 0.1858 | 0.0952 | 1000 | 0.1782 | 518.9319 | 0.7698 | 22.7801 | 100.2200 |
- | 0.1749 | 0.1904 | 2000 | 0.1696 | 427.1324 | 0.7240 | 20.6672 | 428.4419 |
- | 0.1741 | 0.2857 | 3000 | 0.1660 | 502.4791 | 0.7060 | 22.4160 | 93.6057 |
- | 0.1705 | 0.3809 | 4000 | 0.1647 | 423.8481 | 0.6876 | 20.5876 | 92.3590 |
- | 0.1657 | 0.4761 | 5000 | 0.1636 | 366.2942 | 0.6758 | 19.1388 | 86.3451 |
- | 0.1672 | 0.5713 | 6000 | 0.1611 | 365.8203 | 0.6785 | 19.1264 | 126.2944 |
- | 0.1621 | 0.6666 | 7000 | 0.1610 | 336.1272 | 0.6763 | 18.3338 | 118.4768 |
- | 0.1641 | 0.7618 | 8000 | 0.1598 | 392.5134 | 0.6727 | 19.8120 | 102.6643 |
- | 0.1607 | 0.8570 | 9000 | 0.1594 | 292.0475 | 0.6590 | 17.0894 | 114.6111 |
- | 0.1615 | 0.9522 | 10000 | 0.1584 | 298.7156 | 0.6652 | 17.2834 | 79.4739 |
- | 0.1585 | 1.0474 | 11000 | 0.1583 | 273.0431 | 0.6594 | 16.5240 | 92.2265 |
- | 0.16 | 1.1426 | 12000 | 0.1574 | 288.4399 | 0.6531 | 16.9835 | 94.1800 |
- | 0.1582 | 1.2379 | 13000 | 0.1581 | 252.3872 | 0.6558 | 15.8867 | 177.7797 |
- | 0.1563 | 1.3331 | 14000 | 0.1572 | 283.0400 | 0.6637 | 16.8238 | 76.9167 |
- | 0.1583 | 1.4283 | 15000 | 0.1567 | 315.5172 | 0.6533 | 17.7628 | 137.8005 |
- | 0.1591 | 1.5235 | 16000 | 0.1573 | 324.9469 | 0.6659 | 18.0263 | 73.2071 |
- | 0.1574 | 1.6188 | 17000 | 0.1562 | 369.6110 | 0.6552 | 19.2253 | 77.7899 |
- | 0.1592 | 1.7140 | 18000 | 0.1554 | 318.4736 | 0.6430 | 17.8458 | 81.0331 |
- | 0.1554 | 1.8092 | 19000 | 0.1560 | 277.4202 | 0.6529 | 16.6559 | 83.8692 |
- | 0.1588 | 1.9044 | 20000 | 0.1562 | 287.6734 | 0.6511 | 16.9609 | 73.7841 |
- | 0.1543 | 1.9997 | 21000 | 0.1559 | 274.9792 | 0.6475 | 16.5825 | 90.3816 |
 
 
 ### Framework versions
 
 
 This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
 It achieves the following results on the evaluation set:
+ - Loss: 0.2566
+ - Mse: 4643.9019
+ - Mae: 1.1228
+ - Rmse: 68.1462
+ - Smape: nan
 
 ## Model description
 
 
 ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss | Mse | Mae | Rmse | Smape |
+ |:-------------:|:------:|:------:|:---------------:|:---------:|:------:|:-------:|:--------:|
+ | 0.3084 | 0.0980 | 1000 | 0.3024 | 5346.5903 | 1.2307 | 73.1204 | 153.0007 |
+ | 0.2929 | 0.1960 | 2000 | 0.2897 | 5068.2759 | 1.1988 | 71.1918 | nan |
+ | 0.2907 | 0.2940 | 3000 | 0.2848 | 4958.8062 | 1.1856 | 70.4188 | nan |
+ | 0.2839 | 0.3920 | 4000 | 0.2819 | 4832.9458 | 1.1795 | 69.5194 | nan |
+ | 0.2839 | 0.4901 | 5000 | 0.2792 | 4758.4775 | 1.1657 | 68.9817 | 173.3474 |
+ | 0.2799 | 0.5881 | 6000 | 0.2773 | 4732.7681 | 1.1655 | 68.7951 | 154.0682 |
+ | 0.2812 | 0.6861 | 7000 | 0.2756 | 4791.2764 | 1.1631 | 69.2190 | inf |
+ | 0.2762 | 0.7841 | 8000 | 0.2742 | 4711.9121 | 1.1609 | 68.6434 | 150.8088 |
+ | 0.2775 | 0.8821 | 9000 | 0.2745 | 4661.4844 | 1.1561 | 68.2751 | nan |
+ | 0.2739 | 0.9801 | 10000 | 0.2726 | 4697.7852 | 1.1541 | 68.5404 | nan |
+ | 0.2763 | 1.0781 | 11000 | 0.2718 | 4672.8022 | 1.1511 | 68.3579 | nan |
+ | 0.2736 | 1.1761 | 12000 | 0.2712 | 4646.1968 | 1.1528 | 68.1630 | 144.7422 |
+ | 0.2727 | 1.2741 | 13000 | 0.2699 | 4704.0283 | 1.1525 | 68.5859 | nan |
+ | 0.2705 | 1.3721 | 14000 | 0.2694 | 4732.0859 | 1.1485 | 68.7902 | nan |
+ | 0.2718 | 1.4702 | 15000 | 0.2691 | 4672.9360 | 1.1495 | 68.3589 | nan |
+ | 0.2714 | 1.5682 | 16000 | 0.2687 | 4653.2021 | 1.1452 | 68.2144 | nan |
+ | 0.2721 | 1.6662 | 17000 | 0.2677 | 4653.2695 | 1.1435 | 68.2149 | 263.4695 |
+ | 0.2694 | 1.7642 | 18000 | 0.2687 | 4744.7812 | 1.1485 | 68.8824 | nan |
+ | 0.2707 | 1.8622 | 19000 | 0.2680 | 4563.2783 | 1.1418 | 67.5520 | nan |
+ | 0.2701 | 1.9602 | 20000 | 0.2670 | 4582.1411 | 1.1379 | 67.6915 | nan |
+ | 0.2687 | 2.0582 | 21000 | 0.2666 | 4580.3374 | 1.1404 | 67.6782 | 203.8995 |
+ | 0.2698 | 2.1562 | 22000 | 0.2663 | 4663.5762 | 1.1435 | 68.2904 | nan |
+ | 0.2684 | 2.2542 | 23000 | 0.2663 | 4542.9971 | 1.1381 | 67.4018 | 160.5439 |
+ | 0.2692 | 2.3522 | 24000 | 0.2656 | 4628.6294 | 1.1399 | 68.0340 | 143.5725 |
+ | 0.2692 | 2.4503 | 25000 | 0.2654 | 4611.9399 | 1.1401 | 67.9113 | nan |
+ | 0.2674 | 2.5483 | 26000 | 0.2650 | 4631.6226 | 1.1373 | 68.0560 | nan |
+ | 0.2676 | 2.6463 | 27000 | 0.2652 | 4656.6133 | 1.1404 | 68.2394 | 261.2106 |
+ | 0.2701 | 2.7443 | 28000 | 0.2645 | 4583.7627 | 1.1386 | 67.7035 | nan |
+ | 0.2676 | 2.8423 | 29000 | 0.2645 | 4596.8721 | 1.1331 | 67.8002 | 147.3792 |
+ | 0.2671 | 2.9403 | 30000 | 0.2642 | 4660.9785 | 1.1388 | 68.2714 | nan |
+ | 0.2666 | 3.0383 | 31000 | 0.2646 | 4729.0498 | 1.1405 | 68.7681 | nan |
+ | 0.267 | 3.1363 | 32000 | 0.2638 | 4698.4321 | 1.1409 | 68.5451 | nan |
+ | 0.2663 | 3.2343 | 33000 | 0.2641 | 4621.0 | 1.1369 | 67.9779 | nan |
+ | 0.2642 | 3.3324 | 34000 | 0.2632 | 4717.3887 | 1.1363 | 68.6832 | nan |
+ | 0.2686 | 3.4304 | 35000 | 0.2633 | 4576.2842 | 1.1339 | 67.6482 | nan |
+ | 0.2639 | 3.5284 | 36000 | 0.2635 | 4640.9990 | 1.1401 | 68.1249 | 145.8278 |
+ | 0.2668 | 3.6264 | 37000 | 0.2631 | 4652.9268 | 1.1371 | 68.2124 | nan |
+ | 0.2685 | 3.7244 | 38000 | 0.2627 | 4674.4717 | 1.1338 | 68.3701 | nan |
+ | 0.2651 | 3.8224 | 39000 | 0.2626 | 4667.7871 | 1.1355 | 68.3212 | nan |
+ | 0.2637 | 3.9204 | 40000 | 0.2626 | 4603.3242 | 1.1337 | 67.8478 | 128.9741 |
+ | 0.2655 | 4.0184 | 41000 | 0.2623 | 4671.7549 | 1.1358 | 68.3502 | nan |
+ | 0.2657 | 4.1164 | 42000 | 0.2621 | 4639.7461 | 1.1318 | 68.1157 | nan |
+ | 0.2637 | 4.2144 | 43000 | 0.2619 | 4658.6704 | 1.1330 | 68.2545 | nan |
+ | 0.2649 | 4.3125 | 44000 | 0.2621 | 4717.4502 | 1.1337 | 68.6837 | nan |
+ | 0.2651 | 4.4105 | 45000 | 0.2617 | 4616.3667 | 1.1287 | 67.9438 | nan |
+ | 0.2648 | 4.5085 | 46000 | 0.2615 | 4641.1528 | 1.1327 | 68.1260 | 164.7345 |
+ | 0.2646 | 4.6065 | 47000 | 0.2616 | 4634.9507 | 1.1343 | 68.0805 | nan |
+ | 0.2619 | 4.7045 | 48000 | 0.2612 | 4717.5820 | 1.1327 | 68.6847 | nan |
+ | 0.2641 | 4.8025 | 49000 | 0.2612 | 4671.6055 | 1.1355 | 68.3491 | nan |
+ | 0.264 | 4.9005 | 50000 | 0.2613 | 4625.3916 | 1.1307 | 68.0102 | nan |
+ | 0.264 | 4.9985 | 51000 | 0.2607 | 4600.4443 | 1.1309 | 67.8266 | nan |
+ | 0.2646 | 5.0965 | 52000 | 0.2607 | 4653.5298 | 1.1327 | 68.2168 | nan |
+ | 0.2643 | 5.1946 | 53000 | 0.2604 | 4582.6050 | 1.1316 | 67.6949 | nan |
+ | 0.263 | 5.2926 | 54000 | 0.2607 | 4624.2041 | 1.1305 | 68.0015 | nan |
+ | 0.264 | 5.3906 | 55000 | 0.2604 | 4654.0234 | 1.1305 | 68.2204 | nan |
+ | 0.2621 | 5.4886 | 56000 | 0.2601 | 4626.9565 | 1.1290 | 68.0217 | nan |
+ | 0.2641 | 5.5866 | 57000 | 0.2604 | 4636.6865 | 1.1318 | 68.0932 | nan |
+ | 0.2649 | 5.6846 | 58000 | 0.2600 | 4662.0747 | 1.1305 | 68.2794 | nan |
+ | 0.2637 | 5.7826 | 59000 | 0.2599 | 4631.6445 | 1.1295 | 68.0562 | nan |
+ | 0.2632 | 5.8806 | 60000 | 0.2598 | 4632.5400 | 1.1311 | 68.0628 | 146.8735 |
+ | 0.263 | 5.9786 | 61000 | 0.2596 | 4634.8896 | 1.1297 | 68.0800 | nan |
+ | 0.2626 | 6.0766 | 62000 | 0.2598 | 4677.4688 | 1.1307 | 68.3920 | nan |
+ | 0.2623 | 6.1747 | 63000 | 0.2596 | 4674.2075 | 1.1313 | 68.3682 | nan |
+ | 0.2646 | 6.2727 | 64000 | 0.2595 | 4665.5918 | 1.1310 | 68.3051 | nan |
+ | 0.2631 | 6.3707 | 65000 | 0.2593 | 4672.3618 | 1.1285 | 68.3547 | nan |
+ | 0.2623 | 6.4687 | 66000 | 0.2593 | 4666.8711 | 1.1299 | 68.3145 | nan |
+ | 0.2636 | 6.5667 | 67000 | 0.2594 | 4603.8647 | 1.1279 | 67.8518 | nan |
+ | 0.262 | 6.6647 | 68000 | 0.2590 | 4614.9053 | 1.1276 | 67.9331 | nan |
+ | 0.2616 | 6.7627 | 69000 | 0.2591 | 4621.3652 | 1.1286 | 67.9806 | nan |
+ | 0.2623 | 6.8607 | 70000 | 0.2587 | 4653.7485 | 1.1297 | 68.2184 | nan |
+ | 0.2606 | 6.9587 | 71000 | 0.2588 | 4616.5127 | 1.1265 | 67.9449 | nan |
+ | 0.2625 | 7.0567 | 72000 | 0.2588 | 4605.3052 | 1.1267 | 67.8624 | nan |
+ | 0.2616 | 7.1548 | 73000 | 0.2586 | 4632.1304 | 1.1258 | 68.0598 | nan |
+ | 0.261 | 7.2528 | 74000 | 0.2584 | 4635.8457 | 1.1264 | 68.0870 | nan |
+ | 0.2624 | 7.3508 | 75000 | 0.2583 | 4700.9951 | 1.1278 | 68.5638 | nan |
+ | 0.2604 | 7.4488 | 76000 | 0.2583 | 4718.2422 | 1.1279 | 68.6895 | nan |
+ | 0.2611 | 7.5468 | 77000 | 0.2583 | 4681.1753 | 1.1288 | 68.4191 | nan |
+ | 0.2603 | 7.6448 | 78000 | 0.2581 | 4658.5015 | 1.1270 | 68.2532 | nan |
+ | 0.263 | 7.7428 | 79000 | 0.2581 | 4655.0098 | 1.1287 | 68.2276 | nan |
+ | 0.2593 | 7.8408 | 80000 | 0.2579 | 4674.6558 | 1.1277 | 68.3715 | nan |
+ | 0.2616 | 7.9388 | 81000 | 0.2580 | 4662.5923 | 1.1278 | 68.2832 | nan |
+ | 0.2623 | 8.0369 | 82000 | 0.2578 | 4622.4644 | 1.1260 | 67.9887 | nan |
+ | 0.2599 | 8.1349 | 83000 | 0.2578 | 4693.6475 | 1.1265 | 68.5102 | nan |
+ | 0.2606 | 8.2329 | 84000 | 0.2578 | 4664.7837 | 1.1271 | 68.2992 | nan |
+ | 0.2604 | 8.3309 | 85000 | 0.2576 | 4674.7075 | 1.1273 | 68.3718 | nan |
+ | 0.2605 | 8.4289 | 86000 | 0.2576 | 4615.8818 | 1.1244 | 67.9403 | nan |
+ | 0.2598 | 8.5269 | 87000 | 0.2575 | 4681.8462 | 1.1254 | 68.4240 | nan |
+ | 0.2606 | 8.6249 | 88000 | 0.2575 | 4662.5674 | 1.1251 | 68.2830 | nan |
+ | 0.2602 | 8.7229 | 89000 | 0.2573 | 4656.4746 | 1.1238 | 68.2384 | nan |
+ | 0.2588 | 8.8209 | 90000 | 0.2573 | 4647.8911 | 1.1249 | 68.1754 | nan |
+ | 0.2617 | 8.9189 | 91000 | 0.2573 | 4655.3872 | 1.1245 | 68.2304 | nan |
+ | 0.2617 | 9.0170 | 92000 | 0.2573 | 4648.3506 | 1.1258 | 68.1788 | nan |
+ | 0.2601 | 9.1150 | 93000 | 0.2571 | 4630.1865 | 1.1232 | 68.0455 | nan |
+ | 0.2598 | 9.2130 | 94000 | 0.2570 | 4677.0566 | 1.1245 | 68.3890 | nan |
+ | 0.2609 | 9.3110 | 95000 | 0.2570 | 4628.4395 | 1.1244 | 68.0326 | nan |
+ | 0.2596 | 9.4090 | 96000 | 0.2569 | 4638.3027 | 1.1245 | 68.1051 | nan |
+ | 0.2588 | 9.5070 | 97000 | 0.2568 | 4655.1597 | 1.1249 | 68.2287 | nan |
+ | 0.2613 | 9.6050 | 98000 | 0.2567 | 4639.6914 | 1.1242 | 68.1153 | nan |
+ | 0.2593 | 9.7030 | 99000 | 0.2568 | 4654.0952 | 1.1250 | 68.2209 | nan |
+ | 0.2593 | 9.8010 | 100000 | 0.2567 | 4644.1465 | 1.1229 | 68.1480 | 165.2788 |
+ | 0.261 | 9.8990 | 101000 | 0.2567 | 4641.4990 | 1.1234 | 68.1285 | nan |
+ | 0.2587 | 9.9971 | 102000 | 0.2566 | 4643.9019 | 1.1228 | 68.1462 | nan |
 
 
 ### Framework versions
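The metrics in the updated card fit together in a standard way: RMSE is the square root of MSE (sqrt(4643.9019) ~= 68.1462, matching the final row), and the sporadic `nan`/`inf` SMAPE entries are what you get when a prediction/target pair has a (near-)zero denominator. A minimal sketch of how such metrics could be computed (`regression_metrics` is a hypothetical helper, not the Trainer's actual implementation):

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute MSE, MAE, RMSE, and SMAPE as reported in the training table."""
    n = len(y_true)
    errors = [p - t for p, t in zip(y_pred, y_true)]
    mse = sum(e * e for e in errors) / n
    mae = sum(abs(e) for e in errors) / n
    rmse = math.sqrt(mse)  # RMSE = sqrt(MSE), e.g. sqrt(4643.9019) ~= 68.1462
    # SMAPE: mean of 200 * |p - t| / (|p| + |t|). When both p and t are zero
    # the denominator vanishes, which is one way a table ends up with nan/inf.
    smape_terms = []
    for p, t in zip(y_pred, y_true):
        denom = abs(p) + abs(t)
        smape_terms.append(float("nan") if denom == 0 else 200.0 * abs(p - t) / denom)
    smape = sum(smape_terms) / n
    return {"mse": mse, "mae": mae, "rmse": rmse, "smape": smape}
```

If zero targets are expected, a SMAPE variant that skips or clamps zero-denominator pairs avoids the `nan` propagation seen above.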
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:90fd18bbfc8adeab6a12b10d6ecdb01ff8d7a3f39870dbd393df5bb225251122
+ oid sha256:be00b39e178e8dd058fd285bbfe820d1ff13f671bf054486aed2cf663708b429
 size 7966064
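The model.safetensors diff touches only a Git LFS pointer file: the repository stores the weight file's SHA-256 `oid` and `size`, not the 7.9 MB of weights themselves, so a retrained checkpoint of identical size shows up as a one-line oid change. A minimal sketch of reading such a pointer (`parse_lfs_pointer` is a hypothetical helper; the pointer format itself is the real LFS spec):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file (one 'key value' pair per line) into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new pointer from this commit.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:be00b39e178e8dd058fd285bbfe820d1ff13f671bf054486aed2cf663708b429
size 7966064
"""
info = parse_lfs_pointer(pointer)
```

The `oid` is the SHA-256 of the actual file content, which is what LFS uses to fetch the real weights from the storage backend.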