End of training
Browse files
- README.md +103 -103
- pytorch_model.bin +1 -1
- training_args.bin +2 -2
README.md
CHANGED
@@ -17,14 +17,14 @@ should probably proofread and complete it, then remove this comment. -->

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the JCAI2000/100By100BranchPNG dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1491
- Mean Iou: 0.4520
- Mean Accuracy: 0.9041
- Overall Accuracy: 0.9041
- Accuracy Background: nan
- Accuracy Branch: 0.9041
- Iou Background: 0.0
- Iou Branch: 0.9041

## Model description
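Since [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) is a SegFormer backbone, the fine-tuned checkpoint can be used with the standard `transformers` semantic-segmentation classes. The sketch below is a minimal, hypothetical inference example: the repository id `JCAI2000/segformer-b0-branch` is a placeholder (this page does not state the actual repo name), and older `transformers` releases expose the processor as `SegformerFeatureExtractor` rather than `SegformerImageProcessor`.

```python
# Minimal inference sketch for a SegFormer checkpoint fine-tuned from nvidia/mit-b0.
# NOTE: "JCAI2000/segformer-b0-branch" is a placeholder repository id; substitute the
# repository this model card actually belongs to.
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

repo_id = "JCAI2000/segformer-b0-branch"  # placeholder, not confirmed by this page
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("branch_sample.png").convert("RGB")  # any RGB input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```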
@@ -55,101 +55,101 @@ The following hyperparameters were used during training:

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Branch | Iou Background | Iou Branch |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:---------------:|:--------------:|:----------:|
| 0.5951 | 1.05 | 20 | 0.6067 | 0.4897 | 0.9793 | 0.9793 | nan | 0.9793 | 0.0 | 0.9793 |
| 0.4531 | 2.11 | 40 | 0.4117 | 0.4717 | 0.9435 | 0.9435 | nan | 0.9435 | 0.0 | 0.9435 |
| 0.4396 | 3.16 | 60 | 0.3235 | 0.4750 | 0.9499 | 0.9499 | nan | 0.9499 | 0.0 | 0.9499 |
| 0.2327 | 4.21 | 80 | 0.2603 | 0.4703 | 0.9405 | 0.9405 | nan | 0.9405 | 0.0 | 0.9405 |
| 0.1971 | 5.26 | 100 | 0.2336 | 0.4786 | 0.9572 | 0.9572 | nan | 0.9572 | 0.0 | 0.9572 |
| 0.1768 | 6.32 | 120 | 0.2441 | 0.4776 | 0.9552 | 0.9552 | nan | 0.9552 | 0.0 | 0.9552 |
| 0.2005 | 7.37 | 140 | 0.1811 | 0.4597 | 0.9194 | 0.9194 | nan | 0.9194 | 0.0 | 0.9194 |
| 0.1818 | 8.42 | 160 | 0.2437 | 0.4805 | 0.9610 | 0.9610 | nan | 0.9610 | 0.0 | 0.9610 |
| 0.2163 | 9.47 | 180 | 0.1803 | 0.4774 | 0.9548 | 0.9548 | nan | 0.9548 | 0.0 | 0.9548 |
| 0.1436 | 10.53 | 200 | 0.1897 | 0.4807 | 0.9615 | 0.9615 | nan | 0.9615 | 0.0 | 0.9615 |
| 0.0957 | 11.58 | 220 | 0.1682 | 0.4578 | 0.9157 | 0.9157 | nan | 0.9157 | 0.0 | 0.9157 |
| 0.1614 | 12.63 | 240 | 0.1603 | 0.4422 | 0.8844 | 0.8844 | nan | 0.8844 | 0.0 | 0.8844 |
| 0.0952 | 13.68 | 260 | 0.1732 | 0.4738 | 0.9477 | 0.9477 | nan | 0.9477 | 0.0 | 0.9477 |
| 0.1243 | 14.74 | 280 | 0.1432 | 0.4569 | 0.9139 | 0.9139 | nan | 0.9139 | 0.0 | 0.9139 |
| 0.0943 | 15.79 | 300 | 0.1539 | 0.4725 | 0.9451 | 0.9451 | nan | 0.9451 | 0.0 | 0.9451 |
| 0.0833 | 16.84 | 320 | 0.1176 | 0.4570 | 0.9140 | 0.9140 | nan | 0.9140 | 0.0 | 0.9140 |
| 0.1817 | 17.89 | 340 | 0.1270 | 0.4623 | 0.9246 | 0.9246 | nan | 0.9246 | 0.0 | 0.9246 |
| 0.0939 | 18.95 | 360 | 0.1561 | 0.4715 | 0.9431 | 0.9431 | nan | 0.9431 | 0.0 | 0.9431 |
| 0.0849 | 20.0 | 380 | 0.1496 | 0.4682 | 0.9363 | 0.9363 | nan | 0.9363 | 0.0 | 0.9363 |
| 0.1155 | 21.05 | 400 | 0.1204 | 0.4547 | 0.9094 | 0.9094 | nan | 0.9094 | 0.0 | 0.9094 |
| 0.0507 | 22.11 | 420 | 0.1323 | 0.4667 | 0.9335 | 0.9335 | nan | 0.9335 | 0.0 | 0.9335 |
| 0.0631 | 23.16 | 440 | 0.1219 | 0.4593 | 0.9185 | 0.9185 | nan | 0.9185 | 0.0 | 0.9185 |
| 0.0509 | 24.21 | 460 | 0.1178 | 0.4673 | 0.9346 | 0.9346 | nan | 0.9346 | 0.0 | 0.9346 |
| 0.0594 | 25.26 | 480 | 0.1193 | 0.4568 | 0.9136 | 0.9136 | nan | 0.9136 | 0.0 | 0.9136 |
| 0.0633 | 26.32 | 500 | 0.1321 | 0.4717 | 0.9434 | 0.9434 | nan | 0.9434 | 0.0 | 0.9434 |
| 0.0739 | 27.37 | 520 | 0.1361 | 0.4587 | 0.9174 | 0.9174 | nan | 0.9174 | 0.0 | 0.9174 |
| 0.0894 | 28.42 | 540 | 0.1286 | 0.4658 | 0.9317 | 0.9317 | nan | 0.9317 | 0.0 | 0.9317 |
| 0.0528 | 29.47 | 560 | 0.1296 | 0.4644 | 0.9288 | 0.9288 | nan | 0.9288 | 0.0 | 0.9288 |
| 0.0683 | 30.53 | 580 | 0.1434 | 0.4705 | 0.9410 | 0.9410 | nan | 0.9410 | 0.0 | 0.9410 |
| 0.0343 | 31.58 | 600 | 0.1154 | 0.4598 | 0.9196 | 0.9196 | nan | 0.9196 | 0.0 | 0.9196 |
| 0.0436 | 32.63 | 620 | 0.1417 | 0.4527 | 0.9053 | 0.9053 | nan | 0.9053 | 0.0 | 0.9053 |
| 0.0369 | 33.68 | 640 | 0.1185 | 0.4365 | 0.8730 | 0.8730 | nan | 0.8730 | 0.0 | 0.8730 |
| 0.0537 | 34.74 | 660 | 0.1369 | 0.4660 | 0.9319 | 0.9319 | nan | 0.9319 | 0.0 | 0.9319 |
| 0.0642 | 35.79 | 680 | 0.1351 | 0.4514 | 0.9027 | 0.9027 | nan | 0.9027 | 0.0 | 0.9027 |
| 0.0597 | 36.84 | 700 | 0.1441 | 0.4590 | 0.9180 | 0.9180 | nan | 0.9180 | 0.0 | 0.9180 |
| 0.0382 | 37.89 | 720 | 0.1413 | 0.4568 | 0.9136 | 0.9136 | nan | 0.9136 | 0.0 | 0.9136 |
| 0.0488 | 38.95 | 740 | 0.1369 | 0.4626 | 0.9252 | 0.9252 | nan | 0.9252 | 0.0 | 0.9252 |
| 0.0652 | 40.0 | 760 | 0.1477 | 0.4653 | 0.9306 | 0.9306 | nan | 0.9306 | 0.0 | 0.9306 |
| 0.0376 | 41.05 | 780 | 0.1320 | 0.4579 | 0.9158 | 0.9158 | nan | 0.9158 | 0.0 | 0.9158 |
| 0.0387 | 42.11 | 800 | 0.1298 | 0.4536 | 0.9071 | 0.9071 | nan | 0.9071 | 0.0 | 0.9071 |
| 0.0791 | 43.16 | 820 | 0.1431 | 0.4498 | 0.8997 | 0.8997 | nan | 0.8997 | 0.0 | 0.8997 |
| 0.0304 | 44.21 | 840 | 0.1368 | 0.4426 | 0.8852 | 0.8852 | nan | 0.8852 | 0.0 | 0.8852 |
| 0.0301 | 45.26 | 860 | 0.1523 | 0.4681 | 0.9363 | 0.9363 | nan | 0.9363 | 0.0 | 0.9363 |
| 0.0743 | 46.32 | 880 | 0.1396 | 0.4505 | 0.9009 | 0.9009 | nan | 0.9009 | 0.0 | 0.9009 |
| 0.1028 | 47.37 | 900 | 0.1354 | 0.4463 | 0.8926 | 0.8926 | nan | 0.8926 | 0.0 | 0.8926 |
| 0.0399 | 48.42 | 920 | 0.1497 | 0.4568 | 0.9136 | 0.9136 | nan | 0.9136 | 0.0 | 0.9136 |
| 0.0282 | 49.47 | 940 | 0.1489 | 0.4672 | 0.9343 | 0.9343 | nan | 0.9343 | 0.0 | 0.9343 |
| 0.0266 | 50.53 | 960 | 0.1574 | 0.4564 | 0.9128 | 0.9128 | nan | 0.9128 | 0.0 | 0.9128 |
| 0.0328 | 51.58 | 980 | 0.1540 | 0.4536 | 0.9072 | 0.9072 | nan | 0.9072 | 0.0 | 0.9072 |
| 0.0273 | 52.63 | 1000 | 0.1624 | 0.4572 | 0.9144 | 0.9144 | nan | 0.9144 | 0.0 | 0.9144 |
| 0.0311 | 53.68 | 1020 | 0.1459 | 0.4386 | 0.8771 | 0.8771 | nan | 0.8771 | 0.0 | 0.8771 |
| 0.0481 | 54.74 | 1040 | 0.1607 | 0.4597 | 0.9194 | 0.9194 | nan | 0.9194 | 0.0 | 0.9194 |
| 0.0384 | 55.79 | 1060 | 0.1718 | 0.4596 | 0.9192 | 0.9192 | nan | 0.9192 | 0.0 | 0.9192 |
| 0.0299 | 56.84 | 1080 | 0.1708 | 0.4589 | 0.9178 | 0.9178 | nan | 0.9178 | 0.0 | 0.9178 |
| 0.0315 | 57.89 | 1100 | 0.1458 | 0.4539 | 0.9078 | 0.9078 | nan | 0.9078 | 0.0 | 0.9078 |
| 0.2086 | 58.95 | 1120 | 0.1428 | 0.4590 | 0.9181 | 0.9181 | nan | 0.9181 | 0.0 | 0.9181 |
| 0.0355 | 60.0 | 1140 | 0.1575 | 0.4478 | 0.8957 | 0.8957 | nan | 0.8957 | 0.0 | 0.8957 |
| 0.0236 | 61.05 | 1160 | 0.1610 | 0.4471 | 0.8941 | 0.8941 | nan | 0.8941 | 0.0 | 0.8941 |
| 0.0775 | 62.11 | 1180 | 0.1688 | 0.4478 | 0.8955 | 0.8955 | nan | 0.8955 | 0.0 | 0.8955 |
| 0.026 | 63.16 | 1200 | 0.1513 | 0.4558 | 0.9117 | 0.9117 | nan | 0.9117 | 0.0 | 0.9117 |
| 0.03 | 64.21 | 1220 | 0.1583 | 0.4630 | 0.9260 | 0.9260 | nan | 0.9260 | 0.0 | 0.9260 |
| 0.0255 | 65.26 | 1240 | 0.1595 | 0.4565 | 0.9131 | 0.9131 | nan | 0.9131 | 0.0 | 0.9131 |
| 0.079 | 66.32 | 1260 | 0.1485 | 0.4503 | 0.9005 | 0.9005 | nan | 0.9005 | 0.0 | 0.9005 |
| 0.0366 | 67.37 | 1280 | 0.1658 | 0.4561 | 0.9123 | 0.9123 | nan | 0.9123 | 0.0 | 0.9123 |
| 0.0286 | 68.42 | 1300 | 0.1890 | 0.4667 | 0.9334 | 0.9334 | nan | 0.9334 | 0.0 | 0.9334 |
| 0.0303 | 69.47 | 1320 | 0.1469 | 0.4526 | 0.9052 | 0.9052 | nan | 0.9052 | 0.0 | 0.9052 |
| 0.0215 | 70.53 | 1340 | 0.1559 | 0.4548 | 0.9095 | 0.9095 | nan | 0.9095 | 0.0 | 0.9095 |
| 0.028 | 71.58 | 1360 | 0.1616 | 0.4598 | 0.9195 | 0.9195 | nan | 0.9195 | 0.0 | 0.9195 |
| 0.0228 | 72.63 | 1380 | 0.1445 | 0.4521 | 0.9041 | 0.9041 | nan | 0.9041 | 0.0 | 0.9041 |
| 0.0216 | 73.68 | 1400 | 0.1526 | 0.4542 | 0.9085 | 0.9085 | nan | 0.9085 | 0.0 | 0.9085 |
| 0.0202 | 74.74 | 1420 | 0.1525 | 0.4643 | 0.9285 | 0.9285 | nan | 0.9285 | 0.0 | 0.9285 |
| 0.0297 | 75.79 | 1440 | 0.1471 | 0.4590 | 0.9180 | 0.9180 | nan | 0.9180 | 0.0 | 0.9180 |
| 0.0237 | 76.84 | 1460 | 0.1603 | 0.4604 | 0.9208 | 0.9208 | nan | 0.9208 | 0.0 | 0.9208 |
| 0.0601 | 77.89 | 1480 | 0.1526 | 0.4581 | 0.9161 | 0.9161 | nan | 0.9161 | 0.0 | 0.9161 |
| 0.0299 | 78.95 | 1500 | 0.1625 | 0.4579 | 0.9159 | 0.9159 | nan | 0.9159 | 0.0 | 0.9159 |
| 0.0316 | 80.0 | 1520 | 0.1702 | 0.4593 | 0.9185 | 0.9185 | nan | 0.9185 | 0.0 | 0.9185 |
| 0.0274 | 81.05 | 1540 | 0.1741 | 0.4607 | 0.9214 | 0.9214 | nan | 0.9214 | 0.0 | 0.9214 |
| 0.0274 | 82.11 | 1560 | 0.1609 | 0.4594 | 0.9188 | 0.9188 | nan | 0.9188 | 0.0 | 0.9188 |
| 0.0345 | 83.16 | 1580 | 0.1652 | 0.4581 | 0.9163 | 0.9163 | nan | 0.9163 | 0.0 | 0.9163 |
| 0.018 | 84.21 | 1600 | 0.1645 | 0.4588 | 0.9176 | 0.9176 | nan | 0.9176 | 0.0 | 0.9176 |
| 0.0352 | 85.26 | 1620 | 0.1579 | 0.4588 | 0.9176 | 0.9176 | nan | 0.9176 | 0.0 | 0.9176 |
| 0.0202 | 86.32 | 1640 | 0.1741 | 0.4620 | 0.9239 | 0.9239 | nan | 0.9239 | 0.0 | 0.9239 |
| 0.0315 | 87.37 | 1660 | 0.1587 | 0.4543 | 0.9086 | 0.9086 | nan | 0.9086 | 0.0 | 0.9086 |
| 0.0208 | 88.42 | 1680 | 0.1610 | 0.4579 | 0.9158 | 0.9158 | nan | 0.9158 | 0.0 | 0.9158 |
| 0.0174 | 89.47 | 1700 | 0.1685 | 0.4596 | 0.9193 | 0.9193 | nan | 0.9193 | 0.0 | 0.9193 |
| 0.0278 | 90.53 | 1720 | 0.1698 | 0.4586 | 0.9173 | 0.9173 | nan | 0.9173 | 0.0 | 0.9173 |
| 0.0259 | 91.58 | 1740 | 0.1674 | 0.4593 | 0.9186 | 0.9186 | nan | 0.9186 | 0.0 | 0.9186 |
| 0.0642 | 92.63 | 1760 | 0.1586 | 0.4572 | 0.9144 | 0.9144 | nan | 0.9144 | 0.0 | 0.9144 |
| 0.0543 | 93.68 | 1780 | 0.1636 | 0.4591 | 0.9182 | 0.9182 | nan | 0.9182 | 0.0 | 0.9182 |
| 0.0264 | 94.74 | 1800 | 0.1572 | 0.4586 | 0.9172 | 0.9172 | nan | 0.9172 | 0.0 | 0.9172 |
| 0.0239 | 95.79 | 1820 | 0.1687 | 0.4596 | 0.9191 | 0.9191 | nan | 0.9191 | 0.0 | 0.9191 |
| 0.0238 | 96.84 | 1840 | 0.1595 | 0.4569 | 0.9137 | 0.9137 | nan | 0.9137 | 0.0 | 0.9137 |
| 0.0181 | 97.89 | 1860 | 0.1552 | 0.4552 | 0.9103 | 0.9103 | nan | 0.9103 | 0.0 | 0.9103 |
| 0.0354 | 98.95 | 1880 | 0.1645 | 0.4573 | 0.9146 | 0.9146 | nan | 0.9146 | 0.0 | 0.9146 |
| 0.0897 | 100.0 | 1900 | 0.1491 | 0.4520 | 0.9041 | 0.9041 | nan | 0.9041 | 0.0 | 0.9041 |
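The per-class columns above follow the output format of the `evaluate` library's `mean_iou` metric, which the Hugging Face SegFormer fine-tuning examples typically use. An `Accuracy Background` of `nan` together with an `Iou Background` of `0.0` is what that metric usually reports when the background class never occurs in the reference masks, and it is also why `Mean Iou` is roughly half of `Iou Branch`. Below is a minimal sketch assuming a two-label setup (0 = background, 1 = branch); the training script for this run is not shown on this page, so the arguments are assumptions.

```python
# Hedged sketch of how metrics like those in the table are typically computed with
# the `evaluate` library's "mean_iou" metric. The toy masks reproduce the pattern
# seen above: background is absent from the references, so its accuracy is nan,
# its IoU is 0.0, and the mean IoU is about half the branch IoU.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

references = [np.array([[1, 1, 1, 1]] * 4)]   # ground truth: only "branch" pixels
predictions = [np.array([[0, 1, 1, 1]] * 4)]  # a few pixels predicted as background

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=2,        # 0 = background, 1 = branch (assumed)
    ignore_index=255,    # label value excluded from scoring
    reduce_labels=False,
)

# Keys match the table columns; expect a NumPy warning for the 0/0 accuracy.
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"], results["per_category_accuracy"])
```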
### Framework versions
pytorch_model.bin
CHANGED
@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:69179b0af81b60a1808894456bc4626949fd27c9569a8341f89b577233002386
size 14931789
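Both binary files are tracked with Git LFS, so the diff above shows pointer files rather than the weights themselves: `oid` is the SHA-256 of the real file and `size` is its length in bytes. The following sketch, assuming `pytorch_model.bin` has already been downloaded to the working directory, checks a local copy against the pointer using only the standard library.

```python
# Verify a downloaded pytorch_model.bin against the Git LFS pointer shown above.
import hashlib
import os

EXPECTED_OID = "69179b0af81b60a1808894456bc4626949fd27c9569a8341f89b577233002386"
EXPECTED_SIZE = 14931789  # bytes, from the pointer file

path = "pytorch_model.bin"  # assumed to be present locally

digest = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        digest.update(chunk)

assert os.path.getsize(path) == EXPECTED_SIZE, "size mismatch"
assert digest.hexdigest() == EXPECTED_OID, "sha256 mismatch"
print("pytorch_model.bin matches its LFS pointer")
```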
training_args.bin
CHANGED
@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:2ca17834cf4c990ee19377b9e898983eb29a7eb06a7818179481cdd1a8e9b305
+size 4219
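`training_args.bin` written by the `transformers` `Trainer` is a pickled `TrainingArguments` object rather than a tensor checkpoint, so it can be inspected directly, as sketched below. Recent PyTorch versions default to `weights_only=True` in `torch.load`, which rejects this kind of pickle, so it must be disabled explicitly; only do that for files you trust.

```python
# Inspect the pickled TrainingArguments stored in training_args.bin (4219 bytes above).
# Requires transformers to be installed; weights_only=False is needed on recent PyTorch.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(type(args).__name__)  # expected: TrainingArguments
# A few commonly inspected fields; their values for this run are not shown on this page.
print(args.num_train_epochs, args.learning_rate, args.per_device_train_batch_size)
```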