Melo1512 committed
Commit 4db2ff6 · verified · 1 Parent(s): c0fa7ee

Model save

Files changed (2):
  1. README.md +107 -19
  2. model.safetensors +1 -1
README.md CHANGED
@@ -1,7 +1,7 @@
 ---
 library_name: transformers
 license: apache-2.0
-base_model: facebook/vit-msn-small
+base_model: Melo1512/vit-msn-small-wbc-classifier-0316-cleandataset-10
 tags:
 - generated_from_trainer
 datasets:
@@ -23,7 +23,7 @@ model-index:
     metrics:
     - name: Accuracy
       type: accuracy
-      value: 0.8561096307575181
+      value: 0.8591549295774648
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -31,10 +31,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 # vit-msn-small-wbc-classifier-0316-cleandataset-10
 
-This model is a fine-tuned version of [facebook/vit-msn-small](https://huggingface.co/facebook/vit-msn-small) on the imagefolder dataset.
+This model is a fine-tuned version of [Melo1512/vit-msn-small-wbc-classifier-0316-cleandataset-10](https://huggingface.co/Melo1512/vit-msn-small-wbc-classifier-0316-cleandataset-10) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.3982
-- Accuracy: 0.8561
+- Loss: 0.3934
+- Accuracy: 0.8592
 
 ## Model description
 
@@ -53,7 +53,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 1e-05
+- learning_rate: 1e-07
 - train_batch_size: 64
 - eval_batch_size: 64
 - seed: 42
@@ -62,22 +62,110 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.1
-- num_epochs: 10
+- num_epochs: 100
 
 ### Training results
 
-| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
-|:-------------:|:------:|:----:|:---------------:|:--------:|
-| 1.3509 | 0.9730 | 18 | 0.7926 | 0.7876 |
-| 0.6001 | 2.0 | 37 | 0.5589 | 0.8085 |
-| 0.4944 | 2.9730 | 55 | 0.5092 | 0.8226 |
-| 0.462 | 4.0 | 74 | 0.4758 | 0.8260 |
-| 0.4367 | 4.9730 | 92 | 0.4534 | 0.8394 |
-| 0.4175 | 6.0 | 111 | 0.4492 | 0.8462 |
-| 0.4129 | 6.9730 | 129 | 0.4376 | 0.8451 |
-| 0.3956 | 8.0 | 148 | 0.4100 | 0.8515 |
-| 0.3668 | 8.9730 | 166 | 0.4213 | 0.8496 |
-| 0.3752 | 9.7297 | 180 | 0.3982 | 0.8561 |
+| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
+|:-------------:|:-------:|:----:|:---------------:|:--------:|
+| 0.3785 | 0.9730 | 18 | 0.3985 | 0.8569 |
+| 0.3432 | 2.0 | 37 | 0.3996 | 0.8557 |
+| 0.3454 | 2.9730 | 55 | 0.4011 | 0.8553 |
+| 0.3639 | 4.0 | 74 | 0.4034 | 0.8538 |
+| 0.3544 | 4.9730 | 92 | 0.4049 | 0.8546 |
+| 0.3607 | 6.0 | 111 | 0.4057 | 0.8538 |
+| 0.3652 | 6.9730 | 129 | 0.4046 | 0.8561 |
+| 0.3639 | 8.0 | 148 | 0.4046 | 0.8553 |
+| 0.3472 | 8.9730 | 166 | 0.4048 | 0.8561 |
+| 0.3704 | 10.0 | 185 | 0.4033 | 0.8546 |
+| 0.3954 | 10.9730 | 203 | 0.4009 | 0.8565 |
+| 0.372 | 12.0 | 222 | 0.4022 | 0.8546 |
+| 0.3599 | 12.9730 | 240 | 0.4005 | 0.8561 |
+| 0.3689 | 14.0 | 259 | 0.4018 | 0.8550 |
+| 0.3687 | 14.9730 | 277 | 0.4016 | 0.8553 |
+| 0.3521 | 16.0 | 296 | 0.4000 | 0.8561 |
+| 0.3817 | 16.9730 | 314 | 0.4001 | 0.8553 |
+| 0.3768 | 18.0 | 333 | 0.3994 | 0.8550 |
+| 0.3835 | 18.9730 | 351 | 0.4041 | 0.8546 |
+| 0.3833 | 20.0 | 370 | 0.4042 | 0.8553 |
+| 0.36 | 20.9730 | 388 | 0.4012 | 0.8561 |
+| 0.3729 | 22.0 | 407 | 0.4023 | 0.8565 |
+| 0.3647 | 22.9730 | 425 | 0.4029 | 0.8546 |
+| 0.3811 | 24.0 | 444 | 0.4011 | 0.8561 |
+| 0.38 | 24.9730 | 462 | 0.3999 | 0.8569 |
+| 0.3588 | 26.0 | 481 | 0.3994 | 0.8557 |
+| 0.3554 | 26.9730 | 499 | 0.3991 | 0.8561 |
+| 0.354 | 28.0 | 518 | 0.3995 | 0.8561 |
+| 0.3577 | 28.9730 | 536 | 0.3986 | 0.8557 |
+| 0.3723 | 30.0 | 555 | 0.3998 | 0.8561 |
+| 0.3763 | 30.9730 | 573 | 0.3994 | 0.8561 |
+| 0.3701 | 32.0 | 592 | 0.3994 | 0.8569 |
+| 0.3728 | 32.9730 | 610 | 0.3980 | 0.8553 |
+| 0.3649 | 34.0 | 629 | 0.3964 | 0.8565 |
+| 0.3551 | 34.9730 | 647 | 0.3982 | 0.8569 |
+| 0.3832 | 36.0 | 666 | 0.3977 | 0.8576 |
+| 0.3459 | 36.9730 | 684 | 0.3968 | 0.8561 |
+| 0.3613 | 38.0 | 703 | 0.3966 | 0.8561 |
+| 0.3588 | 38.9730 | 721 | 0.3968 | 0.8565 |
+| 0.3483 | 40.0 | 740 | 0.3958 | 0.8573 |
+| 0.3693 | 40.9730 | 758 | 0.3967 | 0.8576 |
+| 0.3544 | 42.0 | 777 | 0.3988 | 0.8576 |
+| 0.3701 | 42.9730 | 795 | 0.3976 | 0.8573 |
+| 0.3649 | 44.0 | 814 | 0.3984 | 0.8565 |
+| 0.3621 | 44.9730 | 832 | 0.3966 | 0.8573 |
+| 0.3494 | 46.0 | 851 | 0.3989 | 0.8573 |
+| 0.373 | 46.9730 | 869 | 0.3993 | 0.8573 |
+| 0.3911 | 48.0 | 888 | 0.3978 | 0.8576 |
+| 0.3716 | 48.9730 | 906 | 0.3967 | 0.8576 |
+| 0.3685 | 50.0 | 925 | 0.3968 | 0.8576 |
+| 0.3879 | 50.9730 | 943 | 0.3950 | 0.8573 |
+| 0.3774 | 52.0 | 962 | 0.3951 | 0.8580 |
+| 0.3588 | 52.9730 | 980 | 0.3950 | 0.8584 |
+| 0.3746 | 54.0 | 999 | 0.3959 | 0.8584 |
+| 0.3677 | 54.9730 | 1017 | 0.3960 | 0.8584 |
+| 0.3608 | 56.0 | 1036 | 0.3965 | 0.8588 |
+| 0.3518 | 56.9730 | 1054 | 0.3963 | 0.8580 |
+| 0.3554 | 58.0 | 1073 | 0.3957 | 0.8588 |
+| 0.3584 | 58.9730 | 1091 | 0.3957 | 0.8584 |
+| 0.3776 | 60.0 | 1110 | 0.3948 | 0.8592 |
+| 0.364 | 60.9730 | 1128 | 0.3942 | 0.8588 |
+| 0.3647 | 62.0 | 1147 | 0.3942 | 0.8584 |
+| 0.3613 | 62.9730 | 1165 | 0.3949 | 0.8588 |
+| 0.3509 | 64.0 | 1184 | 0.3961 | 0.8584 |
+| 0.3816 | 64.9730 | 1202 | 0.3967 | 0.8584 |
+| 0.3552 | 66.0 | 1221 | 0.3957 | 0.8588 |
+| 0.3461 | 66.9730 | 1239 | 0.3946 | 0.8588 |
+| 0.364 | 68.0 | 1258 | 0.3940 | 0.8588 |
+| 0.372 | 68.9730 | 1276 | 0.3943 | 0.8599 |
+| 0.347 | 70.0 | 1295 | 0.3939 | 0.8592 |
+| 0.3537 | 70.9730 | 1313 | 0.3943 | 0.8599 |
+| 0.3537 | 72.0 | 1332 | 0.3950 | 0.8595 |
+| 0.3823 | 72.9730 | 1350 | 0.3951 | 0.8592 |
+| 0.3454 | 74.0 | 1369 | 0.3947 | 0.8592 |
+| 0.3667 | 74.9730 | 1387 | 0.3949 | 0.8592 |
+| 0.3585 | 76.0 | 1406 | 0.3945 | 0.8592 |
+| 0.356 | 76.9730 | 1424 | 0.3947 | 0.8592 |
+| 0.337 | 78.0 | 1443 | 0.3949 | 0.8592 |
+| 0.3588 | 78.9730 | 1461 | 0.3944 | 0.8592 |
+| 0.3591 | 80.0 | 1480 | 0.3941 | 0.8592 |
+| 0.3638 | 80.9730 | 1498 | 0.3943 | 0.8592 |
+| 0.367 | 82.0 | 1517 | 0.3941 | 0.8592 |
+| 0.3694 | 82.9730 | 1535 | 0.3943 | 0.8592 |
+| 0.3779 | 84.0 | 1554 | 0.3941 | 0.8592 |
+| 0.344 | 84.9730 | 1572 | 0.3939 | 0.8595 |
+| 0.3619 | 86.0 | 1591 | 0.3935 | 0.8592 |
+| 0.342 | 86.9730 | 1609 | 0.3934 | 0.8595 |
+| 0.3686 | 88.0 | 1628 | 0.3931 | 0.8595 |
+| 0.3407 | 88.9730 | 1646 | 0.3931 | 0.8595 |
+| 0.3553 | 90.0 | 1665 | 0.3933 | 0.8599 |
+| 0.367 | 90.9730 | 1683 | 0.3934 | 0.8595 |
+| 0.3665 | 92.0 | 1702 | 0.3932 | 0.8599 |
+| 0.3684 | 92.9730 | 1720 | 0.3932 | 0.8599 |
+| 0.3685 | 94.0 | 1739 | 0.3934 | 0.8595 |
+| 0.375 | 94.9730 | 1757 | 0.3934 | 0.8592 |
+| 0.3564 | 96.0 | 1776 | 0.3934 | 0.8592 |
+| 0.362 | 96.9730 | 1794 | 0.3934 | 0.8592 |
+| 0.3688 | 97.2973 | 1800 | 0.3934 | 0.8592 |
 
 
 ### Framework versions
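The per-evaluation rows in the Training results table can be scanned programmatically to find the strongest checkpoint (in this run, validation loss bottoms out around step 1628 rather than at the final step). A minimal sketch — the helper name `best_checkpoint` is ours, and `rows` is a small excerpt of (epoch, step, validation loss, accuracy) values copied from the table above:

```python
# A few (epoch, step, validation_loss, accuracy) rows excerpted from the
# "Training results" table of this commit.
rows = [
    (0.9730, 18, 0.3985, 0.8569),
    (60.0, 1110, 0.3948, 0.8592),
    (88.0, 1628, 0.3931, 0.8595),
    (97.2973, 1800, 0.3934, 0.8592),
]

def best_checkpoint(rows):
    """Return the row with the lowest validation loss; break ties by
    preferring the higher accuracy."""
    return min(rows, key=lambda r: (r[2], -r[3]))

best = best_checkpoint(rows)
print(best)  # the step-1628 row: lowest validation loss in this excerpt
```

With `load_best_model_at_end` unset (the card does not list it), the saved weights correspond to the last step, which is why the reported Loss matches the final table row (0.3934) rather than the minimum.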
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:84e85e6885abc567c7510e90e81673285b28de0b75650cc341b738633af13629
+oid sha256:af1831e2c55f936332d434822947323429166ba39db7be9aba7d6db6ad6185af
 size 86691704
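The model.safetensors change above only touches the git-LFS pointer: the `oid` is the SHA-256 of the actual weight file, and `size` is its byte length. A small sketch (the helper name `lfs_pointer` is ours) that rebuilds such a pointer from a local file, so a downloaded `model.safetensors` can be checked against the `oid sha256:af1831e2...` / `size 86691704` values in this commit:

```python
import hashlib
import os
import tempfile

def lfs_pointer(path, chunk_size=1 << 20):
    """Build a git-lfs v1 pointer (version, oid, size) for a local file,
    hashing in chunks so large weight files don't need to fit in memory."""
    h = hashlib.sha256()
    size = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
            size += len(chunk)
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{h.hexdigest()}\n"
        f"size {size}\n"
    )

# Demo on a throwaway file; for a real check, point this at the downloaded
# model.safetensors and compare against the pointer text in the diff above.
fd, path = tempfile.mkstemp()
os.write(fd, b"hello")
os.close(fd)
print(lfs_pointer(path))
os.remove(path)
```

Since both the old and new pointers report `size 86691704`, this commit replaced the weight values without changing the file's layout.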