desarrolloasesoreslocales committed
Commit 283544f · verified · 1 parent: b220ef9

Model save

Files changed (2):
  1. README.md +101 -101
  2. model.safetensors +1 -1
README.md CHANGED
@@ -33,7 +33,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [google/efficientnet-b0](https://huggingface.co/google/efficientnet-b0) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.4519
+ - Loss: 0.3330
  - Accuracy: 0.8367
 
 ## Model description
@@ -68,106 +68,106 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|
- | No log | 1.0 | 1 | 0.7069 | 0.4898 |
- | No log | 2.0 | 2 | 0.7113 | 0.4898 |
- | No log | 3.0 | 3 | 0.7034 | 0.5102 |
- | No log | 4.0 | 4 | 0.6950 | 0.5408 |
- | No log | 5.0 | 5 | 0.6942 | 0.5612 |
- | No log | 6.0 | 6 | 0.6847 | 0.5306 |
- | No log | 7.0 | 7 | 0.6866 | 0.5204 |
- | No log | 8.0 | 8 | 0.6743 | 0.5816 |
- | No log | 9.0 | 9 | 0.6717 | 0.5714 |
- | 0.6848 | 10.0 | 10 | 0.6548 | 0.6429 |
- | 0.6848 | 11.0 | 11 | 0.6551 | 0.6020 |
- | 0.6848 | 12.0 | 12 | 0.6482 | 0.5918 |
- | 0.6848 | 13.0 | 13 | 0.6407 | 0.6633 |
- | 0.6848 | 14.0 | 14 | 0.6282 | 0.7041 |
- | 0.6848 | 15.0 | 15 | 0.6241 | 0.6837 |
- | 0.6848 | 16.0 | 16 | 0.6189 | 0.7143 |
- | 0.6848 | 17.0 | 17 | 0.5998 | 0.6837 |
- | 0.6848 | 18.0 | 18 | 0.6091 | 0.6633 |
- | 0.6848 | 19.0 | 19 | 0.5962 | 0.7143 |
- | 0.6177 | 20.0 | 20 | 0.5860 | 0.6939 |
- | 0.6177 | 21.0 | 21 | 0.6066 | 0.6837 |
- | 0.6177 | 22.0 | 22 | 0.5802 | 0.7449 |
- | 0.6177 | 23.0 | 23 | 0.5803 | 0.7449 |
- | 0.6177 | 24.0 | 24 | 0.5718 | 0.7551 |
- | 0.6177 | 25.0 | 25 | 0.5688 | 0.7041 |
- | 0.6177 | 26.0 | 26 | 0.5589 | 0.7245 |
- | 0.6177 | 27.0 | 27 | 0.5468 | 0.7245 |
- | 0.6177 | 28.0 | 28 | 0.5389 | 0.6939 |
- | 0.6177 | 29.0 | 29 | 0.5337 | 0.7551 |
- | 0.5545 | 30.0 | 30 | 0.5289 | 0.7347 |
- | 0.5545 | 31.0 | 31 | 0.5381 | 0.7449 |
- | 0.5545 | 32.0 | 32 | 0.5214 | 0.7857 |
- | 0.5545 | 33.0 | 33 | 0.5152 | 0.7347 |
- | 0.5545 | 34.0 | 34 | 0.5096 | 0.7041 |
- | 0.5545 | 35.0 | 35 | 0.5266 | 0.6837 |
- | 0.5545 | 36.0 | 36 | 0.5111 | 0.7959 |
- | 0.5545 | 37.0 | 37 | 0.4954 | 0.7551 |
- | 0.5545 | 38.0 | 38 | 0.5109 | 0.7551 |
- | 0.5545 | 39.0 | 39 | 0.4929 | 0.7653 |
- | 0.5215 | 40.0 | 40 | 0.4866 | 0.7857 |
- | 0.5215 | 41.0 | 41 | 0.4810 | 0.7245 |
- | 0.5215 | 42.0 | 42 | 0.4812 | 0.7755 |
- | 0.5215 | 43.0 | 43 | 0.4918 | 0.7857 |
- | 0.5215 | 44.0 | 44 | 0.4653 | 0.7857 |
- | 0.5215 | 45.0 | 45 | 0.4883 | 0.7347 |
- | 0.5215 | 46.0 | 46 | 0.4795 | 0.7959 |
- | 0.5215 | 47.0 | 47 | 0.5048 | 0.7551 |
- | 0.5215 | 48.0 | 48 | 0.4865 | 0.7755 |
- | 0.5215 | 49.0 | 49 | 0.4519 | 0.8367 |
- | 0.4859 | 50.0 | 50 | 0.4708 | 0.7551 |
- | 0.4859 | 51.0 | 51 | 0.4428 | 0.8367 |
- | 0.4859 | 52.0 | 52 | 0.4502 | 0.8163 |
- | 0.4859 | 53.0 | 53 | 0.4585 | 0.7857 |
- | 0.4859 | 54.0 | 54 | 0.4413 | 0.8163 |
- | 0.4859 | 55.0 | 55 | 0.4473 | 0.7857 |
- | 0.4859 | 56.0 | 56 | 0.4423 | 0.7959 |
- | 0.4859 | 57.0 | 57 | 0.4411 | 0.8061 |
- | 0.4859 | 58.0 | 58 | 0.4363 | 0.8265 |
- | 0.4859 | 59.0 | 59 | 0.4597 | 0.7857 |
- | 0.4398 | 60.0 | 60 | 0.4169 | 0.8061 |
- | 0.4398 | 61.0 | 61 | 0.4281 | 0.8265 |
- | 0.4398 | 62.0 | 62 | 0.4187 | 0.8061 |
- | 0.4398 | 63.0 | 63 | 0.4358 | 0.8061 |
- | 0.4398 | 64.0 | 64 | 0.4351 | 0.7653 |
- | 0.4398 | 65.0 | 65 | 0.4330 | 0.7959 |
- | 0.4398 | 66.0 | 66 | 0.4066 | 0.8265 |
- | 0.4398 | 67.0 | 67 | 0.4285 | 0.8163 |
- | 0.4398 | 68.0 | 68 | 0.4496 | 0.7653 |
- | 0.4398 | 69.0 | 69 | 0.3974 | 0.8265 |
- | 0.4268 | 70.0 | 70 | 0.3984 | 0.8265 |
- | 0.4268 | 71.0 | 71 | 0.4166 | 0.8061 |
- | 0.4268 | 72.0 | 72 | 0.4205 | 0.8163 |
- | 0.4268 | 73.0 | 73 | 0.4390 | 0.7959 |
- | 0.4268 | 74.0 | 74 | 0.4198 | 0.8265 |
- | 0.4268 | 75.0 | 75 | 0.3999 | 0.8367 |
- | 0.4268 | 76.0 | 76 | 0.4235 | 0.7857 |
- | 0.4268 | 77.0 | 77 | 0.4314 | 0.7755 |
- | 0.4268 | 78.0 | 78 | 0.4021 | 0.8367 |
- | 0.4268 | 79.0 | 79 | 0.4189 | 0.8163 |
- | 0.4185 | 80.0 | 80 | 0.4143 | 0.8163 |
- | 0.4185 | 81.0 | 81 | 0.4329 | 0.7959 |
- | 0.4185 | 82.0 | 82 | 0.4223 | 0.8061 |
- | 0.4185 | 83.0 | 83 | 0.4025 | 0.8265 |
- | 0.4185 | 84.0 | 84 | 0.4288 | 0.7857 |
- | 0.4185 | 85.0 | 85 | 0.4216 | 0.8163 |
- | 0.4185 | 86.0 | 86 | 0.4140 | 0.8265 |
- | 0.4185 | 87.0 | 87 | 0.3982 | 0.8367 |
- | 0.4185 | 88.0 | 88 | 0.4085 | 0.8163 |
- | 0.4185 | 89.0 | 89 | 0.4293 | 0.8163 |
- | 0.4034 | 90.0 | 90 | 0.3912 | 0.8265 |
- | 0.4034 | 91.0 | 91 | 0.4017 | 0.8163 |
- | 0.4034 | 92.0 | 92 | 0.4331 | 0.8061 |
- | 0.4034 | 93.0 | 93 | 0.4054 | 0.7959 |
- | 0.4034 | 94.0 | 94 | 0.3894 | 0.8367 |
- | 0.4034 | 95.0 | 95 | 0.4080 | 0.8265 |
- | 0.4034 | 96.0 | 96 | 0.4017 | 0.8265 |
- | 0.4034 | 97.0 | 97 | 0.4095 | 0.8367 |
- | 0.4034 | 98.0 | 98 | 0.4275 | 0.8163 |
- | 0.4034 | 99.0 | 99 | 0.4001 | 0.8265 |
- | 0.3955 | 100.0 | 100 | 0.4125 | 0.8061 |
+ | No log | 1.0 | 1 | 0.4780 | 0.7959 |
+ | No log | 2.0 | 2 | 0.4610 | 0.7959 |
+ | No log | 3.0 | 3 | 0.4518 | 0.7755 |
+ | No log | 4.0 | 4 | 0.4450 | 0.8163 |
+ | No log | 5.0 | 5 | 0.4424 | 0.8265 |
+ | No log | 6.0 | 6 | 0.4483 | 0.7959 |
+ | No log | 7.0 | 7 | 0.4533 | 0.7959 |
+ | No log | 8.0 | 8 | 0.4557 | 0.7959 |
+ | No log | 9.0 | 9 | 0.4556 | 0.8265 |
+ | 0.4528 | 10.0 | 10 | 0.4453 | 0.8163 |
+ | 0.4528 | 11.0 | 11 | 0.4558 | 0.7755 |
+ | 0.4528 | 12.0 | 12 | 0.4390 | 0.8265 |
+ | 0.4528 | 13.0 | 13 | 0.4322 | 0.7959 |
+ | 0.4528 | 14.0 | 14 | 0.4323 | 0.8163 |
+ | 0.4528 | 15.0 | 15 | 0.4127 | 0.8061 |
+ | 0.4528 | 16.0 | 16 | 0.4341 | 0.8061 |
+ | 0.4528 | 17.0 | 17 | 0.4144 | 0.8265 |
+ | 0.4528 | 18.0 | 18 | 0.4275 | 0.8265 |
+ | 0.4528 | 19.0 | 19 | 0.3988 | 0.8673 |
+ | 0.4233 | 20.0 | 20 | 0.4210 | 0.7959 |
+ | 0.4233 | 21.0 | 21 | 0.4223 | 0.7755 |
+ | 0.4233 | 22.0 | 22 | 0.4288 | 0.8265 |
+ | 0.4233 | 23.0 | 23 | 0.3851 | 0.8571 |
+ | 0.4233 | 24.0 | 24 | 0.3956 | 0.8061 |
+ | 0.4233 | 25.0 | 25 | 0.4159 | 0.8367 |
+ | 0.4233 | 26.0 | 26 | 0.4055 | 0.8163 |
+ | 0.4233 | 27.0 | 27 | 0.3861 | 0.8163 |
+ | 0.4233 | 28.0 | 28 | 0.3751 | 0.8469 |
+ | 0.4233 | 29.0 | 29 | 0.3915 | 0.8367 |
+ | 0.3846 | 30.0 | 30 | 0.3705 | 0.8571 |
+ | 0.3846 | 31.0 | 31 | 0.3868 | 0.8367 |
+ | 0.3846 | 32.0 | 32 | 0.3710 | 0.8469 |
+ | 0.3846 | 33.0 | 33 | 0.3770 | 0.8469 |
+ | 0.3846 | 34.0 | 34 | 0.3903 | 0.8265 |
+ | 0.3846 | 35.0 | 35 | 0.3864 | 0.8469 |
+ | 0.3846 | 36.0 | 36 | 0.3728 | 0.8265 |
+ | 0.3846 | 37.0 | 37 | 0.3772 | 0.8367 |
+ | 0.3846 | 38.0 | 38 | 0.3633 | 0.8163 |
+ | 0.3846 | 39.0 | 39 | 0.3824 | 0.8469 |
+ | 0.3714 | 40.0 | 40 | 0.3520 | 0.8571 |
+ | 0.3714 | 41.0 | 41 | 0.3844 | 0.8469 |
+ | 0.3714 | 42.0 | 42 | 0.3564 | 0.8469 |
+ | 0.3714 | 43.0 | 43 | 0.3747 | 0.8673 |
+ | 0.3714 | 44.0 | 44 | 0.3395 | 0.8571 |
+ | 0.3714 | 45.0 | 45 | 0.3871 | 0.8163 |
+ | 0.3714 | 46.0 | 46 | 0.3487 | 0.8367 |
+ | 0.3714 | 47.0 | 47 | 0.3798 | 0.8163 |
+ | 0.3714 | 48.0 | 48 | 0.3848 | 0.8367 |
+ | 0.3714 | 49.0 | 49 | 0.3978 | 0.8265 |
+ | 0.3618 | 50.0 | 50 | 0.3384 | 0.8571 |
+ | 0.3618 | 51.0 | 51 | 0.3647 | 0.8265 |
+ | 0.3618 | 52.0 | 52 | 0.3544 | 0.8571 |
+ | 0.3618 | 53.0 | 53 | 0.4289 | 0.8163 |
+ | 0.3618 | 54.0 | 54 | 0.3568 | 0.8673 |
+ | 0.3618 | 55.0 | 55 | 0.3727 | 0.8673 |
+ | 0.3618 | 56.0 | 56 | 0.3796 | 0.8265 |
+ | 0.3618 | 57.0 | 57 | 0.3678 | 0.8571 |
+ | 0.3618 | 58.0 | 58 | 0.3719 | 0.8469 |
+ | 0.3618 | 59.0 | 59 | 0.3808 | 0.8878 |
+ | 0.327 | 60.0 | 60 | 0.3783 | 0.8163 |
+ | 0.327 | 61.0 | 61 | 0.3637 | 0.8367 |
+ | 0.327 | 62.0 | 62 | 0.3743 | 0.8367 |
+ | 0.327 | 63.0 | 63 | 0.3554 | 0.8571 |
+ | 0.327 | 64.0 | 64 | 0.3544 | 0.8265 |
+ | 0.327 | 65.0 | 65 | 0.3615 | 0.8469 |
+ | 0.327 | 66.0 | 66 | 0.3503 | 0.8673 |
+ | 0.327 | 67.0 | 67 | 0.3914 | 0.7959 |
+ | 0.327 | 68.0 | 68 | 0.3687 | 0.8367 |
+ | 0.327 | 69.0 | 69 | 0.3296 | 0.8878 |
+ | 0.3136 | 70.0 | 70 | 0.3548 | 0.8571 |
+ | 0.3136 | 71.0 | 71 | 0.3810 | 0.8265 |
+ | 0.3136 | 72.0 | 72 | 0.3522 | 0.8469 |
+ | 0.3136 | 73.0 | 73 | 0.3852 | 0.8367 |
+ | 0.3136 | 74.0 | 74 | 0.3434 | 0.8571 |
+ | 0.3136 | 75.0 | 75 | 0.3596 | 0.8571 |
+ | 0.3136 | 76.0 | 76 | 0.3551 | 0.8367 |
+ | 0.3136 | 77.0 | 77 | 0.4257 | 0.8163 |
+ | 0.3136 | 78.0 | 78 | 0.3554 | 0.8367 |
+ | 0.3136 | 79.0 | 79 | 0.3352 | 0.8265 |
+ | 0.316 | 80.0 | 80 | 0.3773 | 0.8367 |
+ | 0.316 | 81.0 | 81 | 0.3305 | 0.8469 |
+ | 0.316 | 82.0 | 82 | 0.3614 | 0.8571 |
+ | 0.316 | 83.0 | 83 | 0.3491 | 0.8265 |
+ | 0.316 | 84.0 | 84 | 0.3479 | 0.8571 |
+ | 0.316 | 85.0 | 85 | 0.3684 | 0.8367 |
+ | 0.316 | 86.0 | 86 | 0.3511 | 0.8571 |
+ | 0.316 | 87.0 | 87 | 0.3658 | 0.8265 |
+ | 0.316 | 88.0 | 88 | 0.3333 | 0.8367 |
+ | 0.316 | 89.0 | 89 | 0.3584 | 0.8776 |
+ | 0.3089 | 90.0 | 90 | 0.3277 | 0.8571 |
+ | 0.3089 | 91.0 | 91 | 0.3875 | 0.8367 |
+ | 0.3089 | 92.0 | 92 | 0.3757 | 0.8367 |
+ | 0.3089 | 93.0 | 93 | 0.3488 | 0.8367 |
+ | 0.3089 | 94.0 | 94 | 0.3282 | 0.8571 |
+ | 0.3089 | 95.0 | 95 | 0.3613 | 0.8571 |
+ | 0.3089 | 96.0 | 96 | 0.3753 | 0.8469 |
+ | 0.3089 | 97.0 | 97 | 0.3625 | 0.8469 |
+ | 0.3089 | 98.0 | 98 | 0.3930 | 0.8265 |
+ | 0.3089 | 99.0 | 99 | 0.3338 | 0.8469 |
+ | 0.3131 | 100.0 | 100 | 0.3330 | 0.8367 |
 
 
 ### Framework versions
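Note that the updated card reports the *final* epoch's metrics (loss 0.3330 at epoch 100), while the lowest validation loss in the new table is 0.3277 at epoch 90. A minimal sketch of selecting the best checkpoint from such a log, using a few (epoch, validation loss, accuracy) rows copied from the table above:

```python
# (epoch, validation_loss, accuracy) rows sampled from the updated table
log = [
    (50, 0.3384, 0.8571),
    (69, 0.3296, 0.8878),
    (90, 0.3277, 0.8571),
    (100, 0.3330, 0.8367),
]

# Pick the checkpoint with the lowest validation loss rather than the last one.
best_epoch, best_loss, best_acc = min(log, key=lambda row: row[1])
print(best_epoch, best_loss)  # 90 0.3277
```

With `load_best_model_at_end=True` and `metric_for_best_model` set in the `Trainer` arguments, this selection happens automatically during training; the sketch just makes the criterion explicit.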
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:2d5b0602347c11ed0948db34ece84aab848cddb2f06ebc5592edadfc6baa1573
+ oid sha256:6d70e8f3b7a2a106e31c55f34851a33cfcbcf904df417754854a8ba6f047e2c5
 size 16255128
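The model.safetensors entry above is a Git LFS pointer file: the repository stores only the blob's sha256 (`oid`) and byte size, while the weights themselves live in LFS storage. A minimal sketch (file name and contents hypothetical, not the real model blob) of checking a downloaded file against such a pointer:

```python
import hashlib
import os

def verify_lfs_object(path: str, expected_oid: str, expected_size: int) -> bool:
    """Check a downloaded file against the oid/size recorded in its LFS pointer."""
    if os.path.getsize(path) != expected_size:
        return False
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks so large weight files don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected_oid

# Throwaway example file (hypothetical values):
with open("blob.bin", "wb") as f:
    f.write(b"hello")
ok = verify_lfs_object("blob.bin", hashlib.sha256(b"hello").hexdigest(), 5)
print(ok)  # True
```

The same check against the real pointer would use the `oid sha256:...` and `size` values shown in the diff.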