Kudod committed on
Commit
b715811
·
verified ·
1 Parent(s): ba774a1

End of training

Files changed (2)
  1. README.md +77 -0
  2. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,77 @@
+ ---
+ license: mit
+ base_model: Kudod/roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: roberta-large-ner-ghtk-cs-add-label-new-data-3090-15Aug-3
+ results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # roberta-large-ner-ghtk-cs-add-label-new-data-3090-15Aug-3
+
+ This model is a fine-tuned version of [Kudod/roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2](https://huggingface.co/Kudod/roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.2534
+ - Tk: {'precision': 0.9444444444444444, 'recall': 0.5862068965517241, 'f1': 0.7234042553191489, 'number': 116}
+ - Gày: {'precision': 0.627906976744186, 'recall': 0.8181818181818182, 'f1': 0.7105263157894738, 'number': 33}
+ - Gày trừu tượng: {'precision': 0.9060402684563759, 'recall': 0.867237687366167, 'f1': 0.886214442013129, 'number': 467}
+ - Iờ: {'precision': 0.4044943820224719, 'recall': 0.9473684210526315, 'f1': 0.5669291338582677, 'number': 38}
+ - Ã đơn: {'precision': 0.8076923076923077, 'recall': 0.7386934673366834, 'f1': 0.7716535433070866, 'number': 199}
+ - Đt: {'precision': 0.9152719665271967, 'recall': 0.9965831435079726, 'f1': 0.9541984732824427, 'number': 878}
+ - Đt trừu tượng: {'precision': 0.8480392156862745, 'recall': 0.8084112149532711, 'f1': 0.8277511961722488, 'number': 214}
+ - Overall Precision: 0.8685
+ - Overall Recall: 0.8900
+ - Overall F1: 0.8791
+ - Overall Accuracy: 0.9507
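
The overall F1 above is simply the harmonic mean of the overall precision and recall, as reported by token-classification evaluation libraries such as seqeval. A quick sanity check:

```python
# Overall F1 is the harmonic mean of overall precision and recall.
precision, recall = 0.8685, 0.8900
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.8791
```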
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2.5e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 10
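
The hyperparameters above map directly onto `transformers.TrainingArguments`. A minimal sketch, assuming the `output_dir` name (illustrative only) and noting that the Adam betas and epsilon listed are the library defaults:

```python
from transformers import TrainingArguments

# Sketch of the training configuration reported in this card.
# output_dir is a hypothetical name; Adam betas/epsilon are transformers defaults.
args = TrainingArguments(
    output_dir="roberta-large-ner-ghtk-cs-add-label-new-data-3090-15Aug-3",
    learning_rate=2.5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```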
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Tk | Gày | Gày trừu tượng | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | No log | 1.0 | 11 | 0.4222 | {'precision': 1.0, 'recall': 0.02586206896551724, 'f1': 0.050420168067226885, 'number': 116} | {'precision': 0.5, 'recall': 0.7575757575757576, 'f1': 0.6024096385542169, 'number': 33} | {'precision': 0.9420654911838791, 'recall': 0.8008565310492506, 'f1': 0.8657407407407407, 'number': 467} | {'precision': 0.08083832335329341, 'recall': 0.7105263157894737, 'f1': 0.14516129032258066, 'number': 38} | {'precision': 0.32, 'recall': 0.04020100502512563, 'f1': 0.07142857142857142, 'number': 199} | {'precision': 0.6460244648318043, 'recall': 0.9624145785876993, 'f1': 0.7731015553522415, 'number': 878} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 214} | 0.5873 | 0.6591 | 0.6211 | 0.8951 |
+ | No log | 2.0 | 22 | 0.2526 | {'precision': 1.0, 'recall': 0.1810344827586207, 'f1': 0.3065693430656934, 'number': 116} | {'precision': 0.5869565217391305, 'recall': 0.8181818181818182, 'f1': 0.6835443037974683, 'number': 33} | {'precision': 0.9214780600461894, 'recall': 0.854389721627409, 'f1': 0.8866666666666667, 'number': 467} | {'precision': 0.20710059171597633, 'recall': 0.9210526315789473, 'f1': 0.33816425120772947, 'number': 38} | {'precision': 0.780952380952381, 'recall': 0.8241206030150754, 'f1': 0.801955990220049, 'number': 199} | {'precision': 0.9180851063829787, 'recall': 0.9829157175398633, 'f1': 0.9493949394939494, 'number': 878} | {'precision': 1.0, 'recall': 0.04672897196261682, 'f1': 0.08928571428571429, 'number': 214} | 0.8305 | 0.7810 | 0.8050 | 0.9417 |
+ | No log | 3.0 | 33 | 0.2080 | {'precision': 0.9538461538461539, 'recall': 0.5344827586206896, 'f1': 0.6850828729281768, 'number': 116} | {'precision': 0.6944444444444444, 'recall': 0.7575757575757576, 'f1': 0.7246376811594203, 'number': 33} | {'precision': 0.910913140311804, 'recall': 0.8758029978586723, 'f1': 0.8930131004366811, 'number': 467} | {'precision': 0.25757575757575757, 'recall': 0.8947368421052632, 'f1': 0.4, 'number': 38} | {'precision': 0.8133971291866029, 'recall': 0.8542713567839196, 'f1': 0.8333333333333334, 'number': 199} | {'precision': 0.9142857142857143, 'recall': 0.9840546697038725, 'f1': 0.947888096544158, 'number': 878} | {'precision': 0.9649122807017544, 'recall': 0.514018691588785, 'f1': 0.6707317073170732, 'number': 214} | 0.8585 | 0.8607 | 0.8596 | 0.9534 |
+ | No log | 4.0 | 44 | 0.2246 | {'precision': 0.9444444444444444, 'recall': 0.5862068965517241, 'f1': 0.7234042553191489, 'number': 116} | {'precision': 0.5869565217391305, 'recall': 0.8181818181818182, 'f1': 0.6835443037974683, 'number': 33} | {'precision': 0.9151376146788991, 'recall': 0.854389721627409, 'f1': 0.883720930232558, 'number': 467} | {'precision': 0.32710280373831774, 'recall': 0.9210526315789473, 'f1': 0.4827586206896552, 'number': 38} | {'precision': 0.7637130801687764, 'recall': 0.9095477386934674, 'f1': 0.8302752293577983, 'number': 199} | {'precision': 0.9135932560590094, 'recall': 0.9874715261958997, 'f1': 0.9490968801313628, 'number': 878} | {'precision': 0.9354838709677419, 'recall': 0.677570093457944, 'f1': 0.7859078590785908, 'number': 214} | 0.8601 | 0.8853 | 0.8726 | 0.9528 |
+ | No log | 5.0 | 55 | 0.2362 | {'precision': 0.9577464788732394, 'recall': 0.5862068965517241, 'f1': 0.7272727272727272, 'number': 116} | {'precision': 0.675, 'recall': 0.8181818181818182, 'f1': 0.7397260273972603, 'number': 33} | {'precision': 0.940149625935162, 'recall': 0.8072805139186295, 'f1': 0.868663594470046, 'number': 467} | {'precision': 0.3894736842105263, 'recall': 0.9736842105263158, 'f1': 0.556390977443609, 'number': 38} | {'precision': 0.8042328042328042, 'recall': 0.7638190954773869, 'f1': 0.7835051546391751, 'number': 199} | {'precision': 0.9150052465897167, 'recall': 0.9931662870159453, 'f1': 0.9524849808847623, 'number': 878} | {'precision': 0.9367088607594937, 'recall': 0.6915887850467289, 'f1': 0.7956989247311828, 'number': 214} | 0.8815 | 0.8643 | 0.8728 | 0.9510 |
+ | No log | 6.0 | 66 | 0.2403 | {'precision': 0.9444444444444444, 'recall': 0.5862068965517241, 'f1': 0.7234042553191489, 'number': 116} | {'precision': 0.6428571428571429, 'recall': 0.8181818181818182, 'f1': 0.7200000000000001, 'number': 33} | {'precision': 0.8920704845814978, 'recall': 0.867237687366167, 'f1': 0.8794788273615636, 'number': 467} | {'precision': 0.3958333333333333, 'recall': 1.0, 'f1': 0.5671641791044776, 'number': 38} | {'precision': 0.8509316770186336, 'recall': 0.6884422110552764, 'f1': 0.7611111111111111, 'number': 199} | {'precision': 0.9152719665271967, 'recall': 0.9965831435079726, 'f1': 0.9541984732824427, 'number': 878} | {'precision': 0.8677248677248677, 'recall': 0.7663551401869159, 'f1': 0.8138957816377169, 'number': 214} | 0.8701 | 0.8812 | 0.8756 | 0.9496 |
+ | No log | 7.0 | 77 | 0.2435 | {'precision': 0.9444444444444444, 'recall': 0.5862068965517241, 'f1': 0.7234042553191489, 'number': 116} | {'precision': 0.627906976744186, 'recall': 0.8181818181818182, 'f1': 0.7105263157894738, 'number': 33} | {'precision': 0.8903508771929824, 'recall': 0.8693790149892934, 'f1': 0.8797399783315276, 'number': 467} | {'precision': 0.4444444444444444, 'recall': 0.9473684210526315, 'f1': 0.6050420168067226, 'number': 38} | {'precision': 0.8171428571428572, 'recall': 0.7185929648241206, 'f1': 0.764705882352941, 'number': 199} | {'precision': 0.9152719665271967, 'recall': 0.9965831435079726, 'f1': 0.9541984732824427, 'number': 878} | {'precision': 0.845771144278607, 'recall': 0.794392523364486, 'f1': 0.8192771084337348, 'number': 214} | 0.8695 | 0.8869 | 0.8781 | 0.9504 |
+ | No log | 8.0 | 88 | 0.2515 | {'precision': 0.9444444444444444, 'recall': 0.5862068965517241, 'f1': 0.7234042553191489, 'number': 116} | {'precision': 0.6136363636363636, 'recall': 0.8181818181818182, 'f1': 0.7012987012987013, 'number': 33} | {'precision': 0.9113636363636364, 'recall': 0.8586723768736617, 'f1': 0.884233737596472, 'number': 467} | {'precision': 0.4186046511627907, 'recall': 0.9473684210526315, 'f1': 0.5806451612903226, 'number': 38} | {'precision': 0.7923497267759563, 'recall': 0.7286432160804021, 'f1': 0.7591623036649214, 'number': 199} | {'precision': 0.9152719665271967, 'recall': 0.9965831435079726, 'f1': 0.9541984732824427, 'number': 878} | {'precision': 0.845, 'recall': 0.7897196261682243, 'f1': 0.8164251207729468, 'number': 214} | 0.8688 | 0.8848 | 0.8767 | 0.9501 |
+ | No log | 9.0 | 99 | 0.2548 | {'precision': 0.9444444444444444, 'recall': 0.5862068965517241, 'f1': 0.7234042553191489, 'number': 116} | {'precision': 0.627906976744186, 'recall': 0.8181818181818182, 'f1': 0.7105263157894738, 'number': 33} | {'precision': 0.9126436781609195, 'recall': 0.8501070663811563, 'f1': 0.8802660753880266, 'number': 467} | {'precision': 0.41379310344827586, 'recall': 0.9473684210526315, 'f1': 0.576, 'number': 38} | {'precision': 0.8021978021978022, 'recall': 0.7336683417085427, 'f1': 0.7664041994750656, 'number': 199} | {'precision': 0.9152719665271967, 'recall': 0.9965831435079726, 'f1': 0.9541984732824427, 'number': 878} | {'precision': 0.8520408163265306, 'recall': 0.780373831775701, 'f1': 0.8146341463414634, 'number': 214} | 0.8706 | 0.8823 | 0.8764 | 0.9502 |
+ | No log | 10.0 | 110 | 0.2534 | {'precision': 0.9444444444444444, 'recall': 0.5862068965517241, 'f1': 0.7234042553191489, 'number': 116} | {'precision': 0.627906976744186, 'recall': 0.8181818181818182, 'f1': 0.7105263157894738, 'number': 33} | {'precision': 0.9060402684563759, 'recall': 0.867237687366167, 'f1': 0.886214442013129, 'number': 467} | {'precision': 0.4044943820224719, 'recall': 0.9473684210526315, 'f1': 0.5669291338582677, 'number': 38} | {'precision': 0.8076923076923077, 'recall': 0.7386934673366834, 'f1': 0.7716535433070866, 'number': 199} | {'precision': 0.9152719665271967, 'recall': 0.9965831435079726, 'f1': 0.9541984732824427, 'number': 878} | {'precision': 0.8480392156862745, 'recall': 0.8084112149532711, 'f1': 0.8277511961722488, 'number': 214} | 0.8685 | 0.8900 | 0.8791 | 0.9507 |
+
+
+ ### Framework versions
+
+ - Transformers 4.44.0
+ - Pytorch 2.3.1+cu121
+ - Datasets 2.19.1
+ - Tokenizers 0.19.1
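
A hedged usage sketch for this checkpoint with the `transformers` pipeline API (the input sentence is illustrative; entity labels come from the model's own config):

```python
from transformers import pipeline

# Load the fine-tuned NER checkpoint from the Hub and run token classification.
# "simple" aggregation merges subword tokens into whole-entity spans.
ner = pipeline(
    "token-classification",
    model="Kudod/roberta-large-ner-ghtk-cs-add-label-new-data-3090-15Aug-3",
    aggregation_strategy="simple",
)
print(ner("Ship the order to 0912345678 before Friday."))
```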
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:3fe132d3147a6a13b07639639c4c1bf83e91e9d2796b09cae7a99bd28e967894
+ oid sha256:856d9bc2346f10add2afe4a5e2bb66beebe5b0017f3d732d3099369e984e0d21
  size 2235444656
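
The `model.safetensors` entry above is a Git LFS pointer file, not the weights themselves: only the `oid` (the SHA-256 of the object) changed in this commit, while the size stayed identical. A minimal sketch of parsing such a pointer, using the new oid from this diff:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# Pointer contents as committed in this diff.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:856d9bc2346f10add2afe4a5e2bb66beebe5b0017f3d732d3099369e984e0d21
size 2235444656"""

fields = parse_lfs_pointer(pointer)
algo, _, digest = fields["oid"].partition(":")
print(algo, fields["size"])  # sha256 2235444656
```

Hashing the downloaded weights with `hashlib.sha256` and comparing against `digest` would confirm the file matches this commit.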