# roberta-large-ner-ghtk-cs-add-label-new-data-3090-15Aug-3
This model is a fine-tuned version of [Kudod/roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2](https://huggingface.co/Kudod/roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.2534
- Overall Precision: 0.8685
- Overall Recall: 0.8900
- Overall F1: 0.8791
- Overall Accuracy: 0.9507

Per-label results:

| Label | Precision | Recall | F1 | Number |
|---|---|---|---|---|
| Tk | 0.9444 | 0.5862 | 0.7234 | 116 |
| Gày | 0.6279 | 0.8182 | 0.7105 | 33 |
| Gày trừu tượng | 0.9060 | 0.8672 | 0.8862 | 467 |
| Iờ | 0.4045 | 0.9474 | 0.5669 | 38 |
| Ã đơn | 0.8077 | 0.7387 | 0.7717 | 199 |
| Đt | 0.9153 | 0.9966 | 0.9542 | 878 |
| Đt trừu tượng | 0.8480 | 0.8084 | 0.8278 | 214 |
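The overall F1 is the harmonic mean of the overall precision and recall, which can be verified directly from the reported numbers:

```python
# Harmonic mean of the reported overall precision and recall.
precision = 0.8685
recall = 0.8900

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.8791, matching the reported Overall F1
```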
## Model description
More information needed
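While the card does not describe the model, the evaluation metrics imply seven entity types. A sketch of the resulting label space, under the assumption of a standard BIO tagging scheme (the actual `id2label` mapping in the checkpoint's `config.json` may differ):

```python
# Hypothetical reconstruction of the label space from the seven entity
# types reported in the evaluation metrics, assuming a BIO scheme.
entity_types = [
    "Tk", "Gày", "Gày trừu tượng", "Iờ", "Ã đơn", "Đt", "Đt trừu tượng",
]

labels = ["O"] + [f"{prefix}-{t}" for t in entity_types for prefix in ("B", "I")]
id2label = dict(enumerate(labels))

print(len(id2label))  # 7 types x (B, I) + "O" = 15 token classes
```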
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
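With `lr_scheduler_type: linear` and no warmup listed, the learning rate presumably decays linearly from 2.5e-5 to 0 over the run (110 optimizer steps, per the results table below). A minimal sketch of that schedule:

```python
# Linear decay schedule, assuming zero warmup steps: the learning rate
# falls from its initial value to 0 over the full training run.
def linear_lr(step: int, initial_lr: float = 2.5e-5, total_steps: int = 110) -> float:
    return initial_lr * max(0.0, (total_steps - step) / total_steps)

print(linear_lr(0))    # 2.5e-05 at the start
print(linear_lr(55))   # 1.25e-05 halfway through
print(linear_lr(110))  # 0.0 at the final step
```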
### Training results
Per-label cells show precision / recall / F1; support (n) is fixed per label and shown in the header.

| Training Loss | Epoch | Step | Validation Loss | Tk (n=116) | Gày (n=33) | Gày trừu tượng (n=467) | Iờ (n=38) | Ã đơn (n=199) | Đt (n=878) | Đt trừu tượng (n=214) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 11 | 0.4222 | 1.000 / 0.026 / 0.050 | 0.500 / 0.758 / 0.602 | 0.942 / 0.801 / 0.866 | 0.081 / 0.711 / 0.145 | 0.320 / 0.040 / 0.071 | 0.646 / 0.962 / 0.773 | 0.000 / 0.000 / 0.000 | 0.5873 | 0.6591 | 0.6211 | 0.8951 |
| No log | 2.0 | 22 | 0.2526 | 1.000 / 0.181 / 0.307 | 0.587 / 0.818 / 0.684 | 0.921 / 0.854 / 0.887 | 0.207 / 0.921 / 0.338 | 0.781 / 0.824 / 0.802 | 0.918 / 0.983 / 0.949 | 1.000 / 0.047 / 0.089 | 0.8305 | 0.7810 | 0.8050 | 0.9417 |
| No log | 3.0 | 33 | 0.2080 | 0.954 / 0.534 / 0.685 | 0.694 / 0.758 / 0.725 | 0.911 / 0.876 / 0.893 | 0.258 / 0.895 / 0.400 | 0.813 / 0.854 / 0.833 | 0.914 / 0.984 / 0.948 | 0.965 / 0.514 / 0.671 | 0.8585 | 0.8607 | 0.8596 | 0.9534 |
| No log | 4.0 | 44 | 0.2246 | 0.944 / 0.586 / 0.723 | 0.587 / 0.818 / 0.684 | 0.915 / 0.854 / 0.884 | 0.327 / 0.921 / 0.483 | 0.764 / 0.910 / 0.830 | 0.914 / 0.987 / 0.949 | 0.935 / 0.678 / 0.786 | 0.8601 | 0.8853 | 0.8726 | 0.9528 |
| No log | 5.0 | 55 | 0.2362 | 0.958 / 0.586 / 0.727 | 0.675 / 0.818 / 0.740 | 0.940 / 0.807 / 0.869 | 0.389 / 0.974 / 0.556 | 0.804 / 0.764 / 0.784 | 0.915 / 0.993 / 0.952 | 0.937 / 0.692 / 0.796 | 0.8815 | 0.8643 | 0.8728 | 0.9510 |
| No log | 6.0 | 66 | 0.2403 | 0.944 / 0.586 / 0.723 | 0.643 / 0.818 / 0.720 | 0.892 / 0.867 / 0.879 | 0.396 / 1.000 / 0.567 | 0.851 / 0.688 / 0.761 | 0.915 / 0.997 / 0.954 | 0.868 / 0.766 / 0.814 | 0.8701 | 0.8812 | 0.8756 | 0.9496 |
| No log | 7.0 | 77 | 0.2435 | 0.944 / 0.586 / 0.723 | 0.628 / 0.818 / 0.711 | 0.890 / 0.869 / 0.880 | 0.444 / 0.947 / 0.605 | 0.817 / 0.719 / 0.765 | 0.915 / 0.997 / 0.954 | 0.846 / 0.794 / 0.819 | 0.8695 | 0.8869 | 0.8781 | 0.9504 |
| No log | 8.0 | 88 | 0.2515 | 0.944 / 0.586 / 0.723 | 0.614 / 0.818 / 0.701 | 0.911 / 0.859 / 0.884 | 0.419 / 0.947 / 0.581 | 0.792 / 0.729 / 0.759 | 0.915 / 0.997 / 0.954 | 0.845 / 0.790 / 0.816 | 0.8688 | 0.8848 | 0.8767 | 0.9501 |
| No log | 9.0 | 99 | 0.2548 | 0.944 / 0.586 / 0.723 | 0.628 / 0.818 / 0.711 | 0.913 / 0.850 / 0.880 | 0.414 / 0.947 / 0.576 | 0.802 / 0.734 / 0.766 | 0.915 / 0.997 / 0.954 | 0.852 / 0.780 / 0.815 | 0.8706 | 0.8823 | 0.8764 | 0.9502 |
| No log | 10.0 | 110 | 0.2534 | 0.944 / 0.586 / 0.723 | 0.628 / 0.818 / 0.711 | 0.906 / 0.867 / 0.886 | 0.404 / 0.947 / 0.567 | 0.808 / 0.739 / 0.772 | 0.915 / 0.997 / 0.954 | 0.848 / 0.808 / 0.828 | 0.8685 | 0.8900 | 0.8791 | 0.9507 |
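As a consistency check, the overall precision and recall for the final epoch can be recovered from the per-label metrics by micro-averaging: reconstruct the true-positive and predicted-entity counts per label, then pool them across labels.

```python
# Micro-averaging sketch: recover per-label TP and predicted counts from
# the reported precision / recall / support of the final epoch, then pool.
final_epoch = {
    "Tk":             {"precision": 0.9444, "recall": 0.5862, "number": 116},
    "Gày":            {"precision": 0.6279, "recall": 0.8182, "number": 33},
    "Gày trừu tượng": {"precision": 0.9060, "recall": 0.8672, "number": 467},
    "Iờ":             {"precision": 0.4045, "recall": 0.9474, "number": 38},
    "Ã đơn":          {"precision": 0.8077, "recall": 0.7387, "number": 199},
    "Đt":             {"precision": 0.9153, "recall": 0.9966, "number": 878},
    "Đt trừu tượng":  {"precision": 0.8480, "recall": 0.8084, "number": 214},
}

# recall = TP / support, so TP = recall * support (rounded to a count);
# precision = TP / predicted, so predicted = TP / precision.
tp = sum(round(m["recall"] * m["number"]) for m in final_epoch.values())
predicted = sum(
    round(round(m["recall"] * m["number"]) / m["precision"])
    for m in final_epoch.values()
)
gold = sum(m["number"] for m in final_epoch.values())

print(f"precision={tp / predicted:.4f} recall={tp / gold:.4f}")
# precision=0.8685 recall=0.8900, matching the reported overall numbers
```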
### Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
## Model tree for Kudod/roberta-large-ner-ghtk-cs-add-label-new-data-3090-15Aug-3
- Base model: [FacebookAI/xlm-roberta-large](https://huggingface.co/FacebookAI/xlm-roberta-large)