End of training
README.md CHANGED
This model is a fine-tuned version of [PekingU/rtdetr_v2_r50vd](https://huggingface.co/PekingU/rtdetr_v2_r50vd) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 11.5095
- Map: 0.3725
- Map 50: 0.7476
- Map 75: 0.2714
- Map Small: 0.3321
- Map Medium: 0.6483
- Map Large: 0.6592
- Mar 1: 0.2659
- Mar 10: 0.481
- Mar 100: 0.6106
- Mar Small: 0.5823
- Mar Medium: 0.7862
- Mar Large: 0.82
- Map Football: 0.3851
- Mar 100 Football: 0.6267
- Map Player: 0.3599
- Mar 100 Player: 0.5945

## Model description
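The Map/Mar figures above follow the COCO protocol: Map 50 and Map 75 are average precision at IoU thresholds of 0.5 and 0.75, while Map averages over thresholds 0.50 to 0.95. A minimal IoU helper (illustrative only, not part of the training code) shows the box-matching criterion those thresholds apply to:

```python
def box_iou(a, b):
    """IoU of two boxes in (x_min, y_min, x_max, y_max) format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# At the "Map 50" threshold, a prediction counts as a true positive
# only if its IoU with a ground-truth box is at least 0.5.
print(box_iou((0, 0, 10, 10), (5, 0, 15, 10)))  # ≈ 0.33, so not a match at 0.5
```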
### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: ADAMW_TORCH (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 50
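The `linear` scheduler with 300 warmup steps ramps the learning rate from 0 to 0.0001, then decays it linearly to 0 over the remaining steps. A sketch of that schedule, assuming 3100 total optimizer steps (50 epochs at the 62 steps/epoch visible in the training-results table):

```python
def linear_warmup_lr(step, base_lr=1e-4, warmup_steps=300, total_steps=3100):
    """Learning rate under a linear schedule with warmup (as produced by
    transformers' get_linear_schedule_with_warmup). total_steps=3100 is
    inferred from the table: 62 steps/epoch x 50 epochs."""
    if step < warmup_steps:
        # Linear ramp from 0 to base_lr during warmup.
        return base_lr * step / warmup_steps
    # Linear decay from base_lr at the end of warmup down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

With these settings the peak rate of 1e-4 is reached at step 300, i.e. midway through epoch 5.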
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Football | Mar 100 Football | Map Player | Mar 100 Player |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:----------:|:--------------:|
| No log | 1.0 | 62 | 25.6132 | 0.0546 | 0.1302 | 0.0333 | 0.0485 | 0.084 | 0.1754 | 0.0255 | 0.1135 | 0.234 | 0.2005 | 0.425 | 0.4976 | 0.0009 | 0.1438 | 0.1083 | 0.3243 |
| No log | 2.0 | 124 | 13.9469 | 0.2201 | 0.4933 | 0.1457 | 0.1959 | 0.3985 | 0.505 | 0.1262 | 0.2926 | 0.3963 | 0.3546 | 0.6417 | 0.7659 | 0.119 | 0.2485 | 0.3212 | 0.5442 |
| No log | 3.0 | 186 | 10.2589 | 0.2696 | 0.6196 | 0.1867 | 0.2462 | 0.5103 | 0.5763 | 0.1841 | 0.3916 | 0.491 | 0.4515 | 0.7095 | 0.8 | 0.1708 | 0.4083 | 0.3684 | 0.5737 |
| No log | 4.0 | 248 | 10.3786 | 0.3012 | 0.6683 | 0.2103 | 0.2759 | 0.5272 | 0.5992 | 0.1997 | 0.4191 | 0.5298 | 0.4949 | 0.7284 | 0.7927 | 0.2053 | 0.468 | 0.3971 | 0.5916 |
| No log | 5.0 | 310 | 10.6640 | 0.3155 | 0.7009 | 0.2387 | 0.2878 | 0.5683 | 0.5722 | 0.2247 | 0.4334 | 0.5462 | 0.513 | 0.7302 | 0.8073 | 0.2306 | 0.4941 | 0.4004 | 0.5983 |
| No log | 6.0 | 372 | 11.0625 | 0.3232 | 0.7144 | 0.2294 | 0.2924 | 0.5412 | 0.5627 | 0.2209 | 0.4349 | 0.551 | 0.5211 | 0.7164 | 0.7902 | 0.2607 | 0.5053 | 0.3857 | 0.5967 |
| No log | 7.0 | 434 | 11.3822 | 0.3232 | 0.7126 | 0.2345 | 0.29 | 0.5672 | 0.569 | 0.2265 | 0.4264 | 0.554 | 0.5254 | 0.7103 | 0.7976 | 0.2758 | 0.5036 | 0.3705 | 0.6045 |
| No log | 8.0 | 496 | 11.2556 | 0.3431 | 0.7365 | 0.2561 | 0.2998 | 0.587 | 0.5728 | 0.2356 | 0.4421 | 0.5637 | 0.5329 | 0.7277 | 0.8024 | 0.3078 | 0.5331 | 0.3784 | 0.5942 |
| 21.6364 | 9.0 | 558 | 11.5124 | 0.3223 | 0.7078 | 0.2487 | 0.2756 | 0.6108 | 0.5958 | 0.2191 | 0.4119 | 0.527 | 0.486 | 0.7515 | 0.7927 | 0.3017 | 0.4876 | 0.3429 | 0.5664 |
| 21.6364 | 10.0 | 620 | 11.5765 | 0.3243 | 0.7218 | 0.2531 | 0.2848 | 0.5856 | 0.5537 | 0.2096 | 0.4239 | 0.5538 | 0.52 | 0.7405 | 0.8024 | 0.2894 | 0.5083 | 0.3592 | 0.5993 |
| 21.6364 | 11.0 | 682 | 11.6208 | 0.3161 | 0.6892 | 0.2438 | 0.2783 | 0.5376 | 0.568 | 0.2107 | 0.4317 | 0.5507 | 0.5183 | 0.7182 | 0.7756 | 0.2791 | 0.5308 | 0.3531 | 0.5706 |
| 21.6364 | 12.0 | 744 | 11.6912 | 0.3331 | 0.7165 | 0.2691 | 0.2928 | 0.5605 | 0.5375 | 0.227 | 0.4203 | 0.5458 | 0.5098 | 0.7412 | 0.8049 | 0.3268 | 0.5136 | 0.3395 | 0.578 |
| 21.6364 | 13.0 | 806 | 11.6316 | 0.3435 | 0.7097 | 0.2818 | 0.2977 | 0.5764 | 0.5981 | 0.2363 | 0.4349 | 0.5687 | 0.5343 | 0.7552 | 0.7707 | 0.3373 | 0.5462 | 0.3497 | 0.5913 |
| 21.6364 | 14.0 | 868 | 11.6562 | 0.3275 | 0.674 | 0.2703 | 0.2791 | 0.5807 | 0.5877 | 0.2199 | 0.4398 | 0.563 | 0.5289 | 0.7474 | 0.7902 | 0.2999 | 0.5343 | 0.3552 | 0.5917 |
| 21.6364 | 15.0 | 930 | 11.6738 | 0.3358 | 0.697 | 0.2684 | 0.2872 | 0.576 | 0.5778 | 0.2211 | 0.4417 | 0.5635 | 0.5291 | 0.7486 | 0.7829 | 0.3218 | 0.5456 | 0.3497 | 0.5814 |
| 21.6364 | 16.0 | 992 | 11.8403 | 0.2999 | 0.6345 | 0.2358 | 0.2609 | 0.5112 | 0.5219 | 0.2097 | 0.4335 | 0.5602 | 0.5281 | 0.7315 | 0.7805 | 0.2772 | 0.5408 | 0.3226 | 0.5795 |
| 11.368 | 17.0 | 1054 | 11.7290 | 0.3296 | 0.6726 | 0.2699 | 0.2777 | 0.5895 | 0.6073 | 0.227 | 0.4414 | 0.5627 | 0.5298 | 0.7418 | 0.7927 | 0.2914 | 0.5278 | 0.3678 | 0.5976 |
| 11.368 | 18.0 | 1116 | 11.8573 | 0.3028 | 0.6623 | 0.2248 | 0.2541 | 0.5623 | 0.5767 | 0.2073 | 0.4103 | 0.5304 | 0.4965 | 0.7124 | 0.7756 | 0.2836 | 0.5024 | 0.3221 | 0.5585 |
| 11.368 | 19.0 | 1178 | 11.9871 | 0.3219 | 0.7084 | 0.2622 | 0.269 | 0.5608 | 0.5603 | 0.2207 | 0.4094 | 0.5285 | 0.4913 | 0.7253 | 0.7659 | 0.3354 | 0.5101 | 0.3085 | 0.547 |
| 11.368 | 20.0 | 1240 | 12.0159 | 0.3183 | 0.6629 | 0.2521 | 0.2709 | 0.5668 | 0.5531 | 0.2178 | 0.4353 | 0.5638 | 0.534 | 0.7192 | 0.7805 | 0.285 | 0.5391 | 0.3515 | 0.5886 |
| 11.368 | 21.0 | 1302 | 11.9282 | 0.3171 | 0.6604 | 0.2588 | 0.2726 | 0.5511 | 0.5578 | 0.2207 | 0.4357 | 0.5551 | 0.5202 | 0.7374 | 0.7976 | 0.2945 | 0.5432 | 0.3397 | 0.5669 |
| 11.368 | 22.0 | 1364 | 12.0976 | 0.3067 | 0.6361 | 0.2538 | 0.2645 | 0.5511 | 0.506 | 0.2238 | 0.4239 | 0.5548 | 0.5229 | 0.727 | 0.7756 | 0.3059 | 0.5296 | 0.3075 | 0.58 |
| 11.368 | 23.0 | 1426 | 12.0878 | 0.295 | 0.6238 | 0.2333 | 0.2549 | 0.5278 | 0.5621 | 0.2235 | 0.4199 | 0.5487 | 0.5167 | 0.7139 | 0.7902 | 0.2787 | 0.526 | 0.3114 | 0.5713 |
| 11.368 | 24.0 | 1488 | 12.0822 | 0.2902 | 0.5951 | 0.2418 | 0.2462 | 0.5097 | 0.5734 | 0.2195 | 0.4262 | 0.5529 | 0.5196 | 0.7248 | 0.8098 | 0.2626 | 0.5379 | 0.3177 | 0.5678 |
| 10.7747 | 25.0 | 1550 | 11.9736 | 0.2901 | 0.6026 | 0.2415 | 0.2425 | 0.5532 | 0.535 | 0.2082 | 0.4219 | 0.5471 | 0.5151 | 0.714 | 0.7951 | 0.2476 | 0.5225 | 0.3325 | 0.5717 |
| 10.7747 | 26.0 | 1612 | 12.0572 | 0.3161 | 0.6534 | 0.2586 | 0.2715 | 0.5352 | 0.5768 | 0.2357 | 0.4308 | 0.5546 | 0.5232 | 0.7222 | 0.7805 | 0.2975 | 0.5343 | 0.3348 | 0.5749 |
| 10.7747 | 27.0 | 1674 | 12.3326 | 0.3123 | 0.6555 | 0.2599 | 0.2673 | 0.5759 | 0.5227 | 0.2378 | 0.4056 | 0.5396 | 0.5028 | 0.7365 | 0.7927 | 0.3364 | 0.5201 | 0.2881 | 0.5592 |
| 10.7747 | 28.0 | 1736 | 12.2543 | 0.2969 | 0.6186 | 0.2319 | 0.2651 | 0.4847 | 0.5503 | 0.2342 | 0.425 | 0.548 | 0.5125 | 0.7349 | 0.7902 | 0.277 | 0.5302 | 0.3168 | 0.5657 |
| 10.7747 | 29.0 | 1798 | 12.2803 | 0.277 | 0.593 | 0.2105 | 0.2472 | 0.4377 | 0.5943 | 0.2222 | 0.4072 | 0.5317 | 0.4964 | 0.7187 | 0.7902 | 0.2515 | 0.5071 | 0.3026 | 0.5564 |
| 10.7747 | 30.0 | 1860 | 12.1818 | 0.2775 | 0.5813 | 0.2328 | 0.2379 | 0.473 | 0.5608 | 0.2116 | 0.4026 | 0.537 | 0.5032 | 0.708 | 0.7854 | 0.2745 | 0.5225 | 0.2805 | 0.5515 |
| 10.7747 | 31.0 | 1922 | 12.2820 | 0.2968 | 0.6143 | 0.2584 | 0.2605 | 0.4982 | 0.5771 | 0.2278 | 0.4028 | 0.5432 | 0.5079 | 0.7294 | 0.7927 | 0.2969 | 0.516 | 0.2968 | 0.5703 |
| 10.7747 | 32.0 | 1984 | 12.2088 | 0.3081 | 0.6381 | 0.2531 | 0.265 | 0.5455 | 0.5901 | 0.2288 | 0.4148 | 0.5522 | 0.5185 | 0.7315 | 0.8 | 0.3116 | 0.5284 | 0.3046 | 0.576 |
| 10.3164 | 33.0 | 2046 | 12.2333 | 0.32 | 0.6579 | 0.2732 | 0.2755 | 0.5699 | 0.611 | 0.2241 | 0.4213 | 0.5495 | 0.5152 | 0.7339 | 0.7951 | 0.3068 | 0.516 | 0.3332 | 0.5831 |
| 10.3164 | 34.0 | 2108 | 12.2837 | 0.3101 | 0.6443 | 0.2416 | 0.2701 | 0.5166 | 0.608 | 0.2273 | 0.4086 | 0.5418 | 0.5096 | 0.7066 | 0.7878 | 0.3157 | 0.516 | 0.3046 | 0.5676 |
| 10.3164 | 35.0 | 2170 | 12.2337 | 0.3134 | 0.6521 | 0.2608 | 0.2682 | 0.5608 | 0.5789 | 0.2282 | 0.4106 | 0.5432 | 0.5073 | 0.7375 | 0.7829 | 0.3105 | 0.5112 | 0.3163 | 0.5753 |
| 10.3164 | 36.0 | 2232 | 12.3075 | 0.3253 | 0.6673 | 0.2636 | 0.2784 | 0.565 | 0.6006 | 0.219 | 0.426 | 0.5509 | 0.5194 | 0.7148 | 0.7927 | 0.3282 | 0.5272 | 0.3223 | 0.5745 |
| 10.3164 | 37.0 | 2294 | 12.2818 | 0.3148 | 0.657 | 0.2662 | 0.2774 | 0.4955 | 0.6003 | 0.2284 | 0.414 | 0.5386 | 0.5043 | 0.7227 | 0.7927 | 0.3008 | 0.5047 | 0.3287 | 0.5725 |
| 10.3164 | 38.0 | 2356 | 12.2240 | 0.3242 | 0.6712 | 0.263 | 0.2859 | 0.5314 | 0.5672 | 0.2343 | 0.424 | 0.5537 | 0.5208 | 0.7291 | 0.7829 | 0.3268 | 0.526 | 0.3217 | 0.5813 |
| 10.3164 | 39.0 | 2418 | 12.2762 | 0.3339 | 0.6909 | 0.2658 | 0.2898 | 0.5623 | 0.594 | 0.2351 | 0.4128 | 0.5439 | 0.5085 | 0.7336 | 0.7927 | 0.3428 | 0.5059 | 0.3249 | 0.5818 |
| 10.3164 | 40.0 | 2480 | 12.3438 | 0.3291 | 0.6852 | 0.273 | 0.2839 | 0.555 | 0.5931 | 0.2367 | 0.4082 | 0.5353 | 0.4994 | 0.7302 | 0.7927 | 0.339 | 0.5006 | 0.3193 | 0.5701 |
| 9.8729 | 41.0 | 2542 | 12.3400 | 0.3176 | 0.6576 | 0.2564 | 0.2738 | 0.5406 | 0.5945 | 0.2288 | 0.4075 | 0.5376 | 0.5042 | 0.7166 | 0.7927 | 0.3168 | 0.4964 | 0.3184 | 0.5787 |
| 9.8729 | 42.0 | 2604 | 12.4052 | 0.3321 | 0.6917 | 0.278 | 0.2884 | 0.5573 | 0.5958 | 0.2355 | 0.4168 | 0.5496 | 0.5154 | 0.7341 | 0.7854 | 0.3445 | 0.5183 | 0.3198 | 0.5808 |
| 9.8729 | 43.0 | 2666 | 12.4183 | 0.3379 | 0.6898 | 0.2921 | 0.2957 | 0.5567 | 0.5721 | 0.2392 | 0.4215 | 0.551 | 0.5172 | 0.7342 | 0.7902 | 0.347 | 0.5172 | 0.3287 | 0.5848 |
| 9.8729 | 44.0 | 2728 | 12.4820 | 0.3327 | 0.6809 | 0.2794 | 0.2938 | 0.5411 | 0.5767 | 0.2324 | 0.4187 | 0.5528 | 0.5185 | 0.7387 | 0.7829 | 0.3358 | 0.5183 | 0.3296 | 0.5873 |
| 9.8729 | 45.0 | 2790 | 12.4870 | 0.3299 | 0.673 | 0.2888 | 0.2881 | 0.5515 | 0.592 | 0.2339 | 0.41 | 0.5464 | 0.5141 | 0.7214 | 0.7732 | 0.3421 | 0.5047 | 0.3177 | 0.5882 |
| 9.8729 | 46.0 | 2852 | 12.4976 | 0.3206 | 0.6515 | 0.2649 | 0.2808 | 0.5288 | 0.5938 | 0.235 | 0.4153 | 0.5486 | 0.5154 | 0.7266 | 0.7951 | 0.3217 | 0.5136 | 0.3196 | 0.5835 |
| 9.8729 | 47.0 | 2914 | 12.5671 | 0.3294 | 0.6769 | 0.2767 | 0.2915 | 0.544 | 0.5808 | 0.2366 | 0.4141 | 0.5513 | 0.5182 | 0.7287 | 0.7976 | 0.3451 | 0.5189 | 0.3137 | 0.5836 |
| 9.8729 | 48.0 | 2976 | 12.5045 | 0.3271 | 0.6722 | 0.2687 | 0.2846 | 0.5515 | 0.579 | 0.2301 | 0.4127 | 0.5457 | 0.5134 | 0.7189 | 0.7707 | 0.3373 | 0.5101 | 0.3169 | 0.5814 |
| 9.4381 | 49.0 | 3038 | 12.5120 | 0.3298 | 0.6843 | 0.2663 | 0.2881 | 0.5545 | 0.5797 | 0.229 | 0.4086 | 0.5444 | 0.5125 | 0.7164 | 0.7732 | 0.3455 | 0.5077 | 0.3141 | 0.5811 |
| 9.4381 | 50.0 | 3100 | 12.5120 | 0.3325 | 0.6831 | 0.2654 | 0.2897 | 0.5561 | 0.5867 | 0.2306 | 0.4114 | 0.5449 | 0.5139 | 0.7109 | 0.761 | 0.3486 | 0.5101 | 0.3165 | 0.5798 |
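Validation Map peaks mid-training (0.3435 at epoch 13) rather than at the final epoch, so selecting a checkpoint by validation Map can beat simply taking the last one. A sketch using a few rows copied from the table above:

```python
# Validation Map for a few epochs, copied from the training-results table.
val_map = {8: 0.3431, 13: 0.3435, 43: 0.3379, 50: 0.3325}

# Pick the epoch whose checkpoint scored highest on validation Map.
best_epoch = max(val_map, key=val_map.get)
print(best_epoch, val_map[best_epoch])  # → 13 0.3435
```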
### Framework versions

model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:7e7e65f5838ba8e0c765917849ae8594baeff983cb7d4cf7aca94d8bff406170
 size 171535680
runs/May18_12-27-22_55a9bb191013/events.out.tfevents.1747571250.55a9bb191013.1616.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9ff8988215435760d681f7a3c32eaef23484bd322a53cd90e7b69b734c310107
+size 64383
runs/May18_12-27-22_55a9bb191013/events.out.tfevents.1747575787.55a9bb191013.1616.1 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:387e3bc1f5be6bade89b40dbd966d16e7bb49731f87581a3392ed1a9059eca53
+size 1204
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:dd80f6cf933c5eec2fac3cfe21b8ef56c5ea6996603c57df01e0bc6a6464685a
 size 5777