Model save

- README.md +159 -0
- model.safetensors +1 -1

README.md
ADDED
@@ -0,0 +1,159 @@
---
library_name: transformers
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-large-finetuned-kinetics
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: 5c_2
  results: []
---

# 5c_2

This model is a fine-tuned version of [MCG-NJU/videomae-large-finetuned-kinetics](https://huggingface.co/MCG-NJU/videomae-large-finetuned-kinetics) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.8264
- Accuracy: 0.44

## Model description

More information needed

## Intended uses & limitations

More information needed

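Pending a filled-in usage section, here is a minimal sketch of the input format this family of checkpoints expects. The `transformers` class names in the comments are the standard VideoMAE API; the dummy clip and the placeholder checkpoint path are illustrative assumptions, not part of this card.

```python
import numpy as np

# VideoMAE checkpoints in this family consume a clip of 16 RGB frames;
# the image processor resizes and normalizes each frame to 224x224,
# producing pixel_values of shape (1, 16, 3, 224, 224).
num_frames, height, width = 16, 224, 224
video = [
    np.random.randint(0, 256, (height, width, 3), dtype=np.uint8)
    for _ in range(num_frames)
]

# With transformers installed, inference would look roughly like this
# (not run here, since it downloads ~1.2 GB of weights):
#   from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification
#   processor = VideoMAEImageProcessor.from_pretrained(
#       "MCG-NJU/videomae-large-finetuned-kinetics")
#   model = VideoMAEForVideoClassification.from_pretrained("<this checkpoint>")
#   inputs = processor(video, return_tensors="pt")
#   predicted_class = model(**inputs).logits.argmax(-1).item()

print(len(video), video[0].shape)
```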
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 5
- eval_batch_size: 5
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 4600

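The warmup ratio and step count above imply 460 warmup steps (0.1 × 4600). A small sketch of the learning-rate schedule these settings produce; the helper name and closed-form decay are illustrative, matching the usual behavior of a linear scheduler with warmup, not code from this training run:

```python
def lr_at_step(step, base_lr=1e-5, total_steps=4600, warmup_ratio=0.1):
    """Linear warmup from 0 to base_lr, then linear decay back to 0."""
    warmup_steps = int(total_steps * warmup_ratio)  # 0.1 * 4600 = 460
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

# LR at start, at the end of warmup (peak), and at the final step
print(lr_at_step(0), lr_at_step(460), lr_at_step(4600))
```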
### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.3192        | 0.0102  | 47   | 1.3455          | 0.4      |
| 0.8974        | 1.0102  | 94   | 1.4305          | 0.4      |
| 0.9209        | 2.0102  | 141  | 1.5682          | 0.4      |
| 0.7915        | 3.0102  | 188  | 1.6475          | 0.4      |
| 0.9571        | 4.0102  | 235  | 1.4177          | 0.4      |
| 0.9085        | 5.0102  | 282  | 1.4976          | 0.4      |
| 0.8815        | 6.0102  | 329  | 1.6249          | 0.4      |
| 0.9245        | 7.0102  | 376  | 1.7363          | 0.4      |
| 0.9179        | 8.0102  | 423  | 1.4075          | 0.4      |
| 1.0738        | 9.0102  | 470  | 1.9536          | 0.4      |
| 0.5865        | 10.0102 | 517  | 1.6996          | 0.4      |
| 0.4752        | 11.0102 | 564  | 2.1711          | 0.4      |
| 0.7409        | 12.0102 | 611  | 1.7252          | 0.36     |
| 0.7534        | 13.0102 | 658  | 1.9988          | 0.4      |
| 0.6109        | 14.0102 | 705  | 1.7449          | 0.36     |
| 0.4217        | 15.0102 | 752  | 2.9984          | 0.4      |
| 0.8409        | 16.0102 | 799  | 1.6709          | 0.36     |
| 0.6114        | 17.0102 | 846  | 1.9014          | 0.36     |
| 0.6806        | 18.0102 | 893  | 1.8763          | 0.36     |
| 0.5359        | 19.0102 | 940  | 2.1816          | 0.36     |
| 0.5989        | 20.0102 | 987  | 1.7239          | 0.32     |
| 0.4145        | 21.0102 | 1034 | 2.2460          | 0.32     |
| 0.3841        | 22.0102 | 1081 | 2.4509          | 0.36     |
| 0.4356        | 23.0102 | 1128 | 2.1125          | 0.36     |
| 0.2509        | 24.0102 | 1175 | 2.6513          | 0.32     |
| 0.4963        | 25.0102 | 1222 | 2.8019          | 0.4      |
| 0.1915        | 26.0102 | 1269 | 2.4637          | 0.32     |
| 0.1269        | 27.0102 | 1316 | 2.8957          | 0.36     |
| 0.3599        | 28.0102 | 1363 | 2.5853          | 0.36     |
| 0.399         | 29.0102 | 1410 | 3.3633          | 0.4      |
| 0.205         | 30.0102 | 1457 | 3.0276          | 0.32     |
| 0.0945        | 31.0102 | 1504 | 3.3960          | 0.4      |
| 0.3376        | 32.0102 | 1551 | 3.0445          | 0.32     |
| 0.2407        | 33.0102 | 1598 | 2.8461          | 0.32     |
| 0.1653        | 34.0102 | 1645 | 3.1737          | 0.36     |
| 0.187         | 35.0102 | 1692 | 3.5642          | 0.32     |
| 0.2339        | 36.0102 | 1739 | 3.6020          | 0.4      |
| 0.1097        | 37.0102 | 1786 | 3.5631          | 0.4      |
| 0.2859        | 38.0102 | 1833 | 3.6048          | 0.36     |
| 0.0123        | 39.0102 | 1880 | 4.2022          | 0.4      |
| 0.0062        | 40.0102 | 1927 | 4.2564          | 0.36     |
| 0.031         | 41.0102 | 1974 | 4.0465          | 0.4      |
| 0.1045        | 42.0102 | 2021 | 3.5379          | 0.36     |
| 0.0025        | 43.0102 | 2068 | 4.1880          | 0.4      |
| 0.2103        | 44.0102 | 2115 | 4.4486          | 0.32     |
| 0.0035        | 45.0102 | 2162 | 3.7883          | 0.32     |
| 0.0117        | 46.0102 | 2209 | 3.8264          | 0.44     |
| 0.0027        | 47.0102 | 2256 | 4.2371          | 0.32     |
| 0.0174        | 48.0102 | 2303 | 4.0451          | 0.4      |
| 0.0199        | 49.0102 | 2350 | 4.0996          | 0.4      |
| 0.0082        | 50.0102 | 2397 | 4.5682          | 0.36     |
| 0.0186        | 51.0102 | 2444 | 4.0036          | 0.36     |
| 0.1483        | 52.0102 | 2491 | 3.8019          | 0.36     |
| 0.1276        | 53.0102 | 2538 | 3.9253          | 0.4      |
| 0.0601        | 54.0102 | 2585 | 4.5047          | 0.4      |
| 0.0027        | 55.0102 | 2632 | 4.5747          | 0.36     |
| 0.0055        | 56.0102 | 2679 | 4.2363          | 0.32     |
| 0.0338        | 57.0102 | 2726 | 4.3328          | 0.36     |
| 0.0005        | 58.0102 | 2773 | 4.5897          | 0.36     |
| 0.0489        | 59.0102 | 2820 | 4.7412          | 0.32     |
| 0.11          | 60.0102 | 2867 | 4.7991          | 0.36     |
| 0.0006        | 61.0102 | 2914 | 4.8250          | 0.32     |
| 0.0008        | 62.0102 | 2961 | 4.7567          | 0.32     |
| 0.0004        | 63.0102 | 3008 | 4.4867          | 0.36     |
| 0.0877        | 64.0102 | 3055 | 4.8180          | 0.36     |
| 0.0009        | 65.0102 | 3102 | 4.3209          | 0.4      |
| 0.0004        | 66.0102 | 3149 | 4.3730          | 0.36     |
| 0.0005        | 67.0102 | 3196 | 4.0573          | 0.44     |
| 0.0288        | 68.0102 | 3243 | 3.7278          | 0.44     |
| 0.0014        | 69.0102 | 3290 | 4.9681          | 0.36     |
| 0.0002        | 70.0102 | 3337 | 4.8522          | 0.4      |
| 0.0009        | 71.0102 | 3384 | 4.9470          | 0.4      |
| 0.0004        | 72.0102 | 3431 | 4.8706          | 0.36     |
| 0.0016        | 73.0102 | 3478 | 4.8785          | 0.32     |
| 0.0003        | 74.0102 | 3525 | 4.9980          | 0.36     |
| 0.0003        | 75.0102 | 3572 | 4.7280          | 0.36     |
| 0.0003        | 76.0102 | 3619 | 5.0809          | 0.36     |
| 0.0005        | 77.0102 | 3666 | 4.8118          | 0.4      |
| 0.0003        | 78.0102 | 3713 | 4.7439          | 0.4      |
| 0.0003        | 79.0102 | 3760 | 4.9703          | 0.4      |
| 0.0004        | 80.0102 | 3807 | 4.5657          | 0.4      |
| 0.0004        | 81.0102 | 3854 | 4.5084          | 0.44     |
| 0.1261        | 82.0102 | 3901 | 4.8884          | 0.44     |
| 0.0002        | 83.0102 | 3948 | 4.8646          | 0.4      |
| 0.0002        | 84.0102 | 3995 | 4.8225          | 0.4      |
| 0.0003        | 85.0102 | 4042 | 4.7205          | 0.4      |
| 0.0008        | 86.0102 | 4089 | 4.7888          | 0.44     |
| 0.0004        | 87.0102 | 4136 | 4.8506          | 0.44     |
| 0.0004        | 88.0102 | 4183 | 4.8165          | 0.44     |
| 0.0006        | 89.0102 | 4230 | 4.6865          | 0.44     |
| 0.0002        | 90.0102 | 4277 | 4.6192          | 0.4      |
| 0.0002        | 91.0102 | 4324 | 4.6489          | 0.4      |
| 0.0005        | 92.0102 | 4371 | 4.7074          | 0.4      |
| 0.0002        | 93.0102 | 4418 | 4.6926          | 0.4      |
| 0.0012        | 94.0102 | 4465 | 4.7289          | 0.36     |
| 0.0002        | 95.0102 | 4512 | 4.7505          | 0.36     |
| 0.0002        | 96.0102 | 4559 | 4.7498          | 0.36     |
| 0.0002        | 97.0089 | 4600 | 4.7523          | 0.36     |

### Framework versions

- Transformers 4.46.2
- Pytorch 2.0.1+cu117
- Datasets 3.0.1
- Tokenizers 0.20.0
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:98247d2a769b525bb7853fbd35d646a66421d9394c33b17d97e9c54fbb0bbc00
 size 1215504408