pradanaadn committed (verified)
Commit ca794d4
1 Parent(s): ac2bcd5

End of training
README.md CHANGED
@@ -1,10 +1,11 @@
 ---
 license: apache-2.0
-base_model: distilbert/distilbert-base-uncased-finetuned-sst-2-english
+base_model: pradanaadn/sucidal-text-classification-distillbert
 tags:
 - generated_from_trainer
 metrics:
 - accuracy
+- f1
 model-index:
 - name: sucidal-text-classification-distillbert
   results: []
@@ -15,11 +16,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 # sucidal-text-classification-distillbert
 
-This model is a fine-tuned version of [distilbert/distilbert-base-uncased-finetuned-sst-2-english](https://huggingface.co/distilbert/distilbert-base-uncased-finetuned-sst-2-english) on an unknown dataset.
+This model is a fine-tuned version of [pradanaadn/sucidal-text-classification-distillbert](https://huggingface.co/pradanaadn/sucidal-text-classification-distillbert) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.4590
-- Accuracy: 0.8198
-- F1 Score: 0.8198
+- Loss: 0.5138
+- Accuracy: 0.8106
+- F1: 0.7801
 
 ## Model description
 
@@ -39,8 +40,8 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 5e-05
-- train_batch_size: 16
-- eval_batch_size: 16
+- train_batch_size: 64
+- eval_batch_size: 64
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
@@ -49,28 +50,15 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch  | Step | Validation Loss | Accuracy | F1 Score |
-|:-------------:|:------:|:----:|:---------------:|:--------:|:--------:|
-| 0.3602        | 0.1885 | 500  | 0.7867          | 0.7670   | 0.7670   |
-| 0.6285        | 0.3769 | 1000 | 0.5574          | 0.7795   | 0.7795   |
-| 0.5624        | 0.5654 | 1500 | 0.5011          | 0.7988   | 0.7988   |
-| 0.5413        | 0.7539 | 2000 | 0.4968          | 0.8017   | 0.8017   |
-| 0.5084        | 0.9423 | 2500 | 0.4712          | 0.8085   | 0.8085   |
-| 0.4253        | 1.1308 | 3000 | 0.4938          | 0.8053   | 0.8053   |
-| 0.3915        | 1.3193 | 3500 | 0.4781          | 0.8136   | 0.8136   |
-| 0.3739        | 1.5077 | 4000 | 0.5195          | 0.8043   | 0.8043   |
-| 0.3638        | 1.6962 | 4500 | 0.4790          | 0.8201   | 0.8201   |
-| 0.3667        | 1.8847 | 5000 | 0.4590          | 0.8198   | 0.8198   |
-| 0.3182        | 2.0731 | 5500 | 0.5129          | 0.8218   | 0.8218   |
-| 0.2325        | 2.2616 | 6000 | 0.5279          | 0.8198   | 0.8198   |
-| 0.2318        | 2.4501 | 6500 | 0.5368          | 0.8197   | 0.8197   |
-| 0.2219        | 2.6385 | 7000 | 0.5606          | 0.8221   | 0.8221   |
-| 0.2261        | 2.8270 | 7500 | 0.5406          | 0.8229   | 0.8229   |
+| Training Loss | Epoch  | Step | Validation Loss | Accuracy | F1     |
+|:-------------:|:------:|:----:|:---------------:|:--------:|:------:|
+| 0.2549        | 0.7530 | 500  | 0.5138          | 0.8106   | 0.7801 |
+| 0.2054        | 1.5060 | 1000 | 0.6016          | 0.8142   | 0.7877 |
+| 0.1666        | 2.2590 | 1500 | 0.6464          | 0.8164   | 0.7896 |
 
 
 ### Framework versions
 
 - Transformers 4.41.2
 - Pytorch 2.3.0+cu121
-- Datasets 2.20.0
 - Tokenizers 0.19.1
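Note on the updated card: the checkpoint is a DistilBERT sequence-classification model, so it can be loaded with the standard `transformers` text-classification pipeline. A minimal sketch, assuming the `pradanaadn/sucidal-text-classification-distillbert` repository is publicly available on the Hub; the label names returned come from the model's own config and are not documented in this commit.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub (assumes the repo is public
# and compatible with Transformers 4.41.x, as listed in the card).
classifier = pipeline(
    "text-classification",
    model="pradanaadn/sucidal-text-classification-distillbert",
)

# Label names come from the model config (id2label) and are not shown in
# this commit, so treat the printed label as illustrative only.
print(classifier("I don't see any reason to keep going."))
```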
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:d8ea6590f8b5fb67940187d375fcc3fc1c4bd0126e2e6a752c59d515ccf5d6bc
+oid sha256:0541f2032351b41e025e90120b2ef5dbfa5ff3be35e9c3260a984ca333d655b9
 size 267847948
runs/Jul16_15-16-06_5cd76cd4e441/events.out.tfevents.1721143016.5cd76cd4e441.206.1 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:d48238be273eab3d5fa6a7cd23f13b676bc8977cae74d6081adbf89429022fe5
-size 7285
+oid sha256:ba4e4c4558047910e6eb3ec227873f03152426b2ebdd022240df70a02c1549ba
+size 7639
runs/Jul16_15-16-06_5cd76cd4e441/events.out.tfevents.1721144563.5cd76cd4e441.206.2 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f4151274781d2ce40799aa8294f4190bdf0ef09017f90680fa821ec268a2ee06
+size 457
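For reference, the hyperparameters changed in this commit (batch size 16 → 64, same learning rate, seed, optimizer, and linear scheduler) map directly onto `transformers.TrainingArguments`. A hypothetical sketch follows; the actual training script is not part of this commit, and everything not listed in the card (output directory, epoch count, dataset handling) is a placeholder.

```python
from transformers import TrainingArguments

# Sketch only: values below are taken from the updated model card; the
# Adam betas/epsilon in the card match the Trainer's default optimizer.
training_args = TrainingArguments(
    output_dir="sucidal-text-classification-distillbert",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    evaluation_strategy="steps",  # step-based eval every 500 steps matches the results table
    eval_steps=500,
)
```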