# roberta-large-ner-ghtk-cs-add-label-new-data-3090-21Aug-2

This model is a fine-tuned version of Kudod/roberta-large-ner-ghtk-cs-6-label-old-data-3090-15Aug-2 on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.2545
| Label | Precision | Recall | F1 | Support |
|---|---|---|---|---|
| Tk | 0.9897 | 0.8276 | 0.9014 | 116 |
| Gày | 0.5625 | 0.8182 | 0.6667 | 33 |
| Gày trừu tượng | 0.8963 | 0.8887 | 0.8925 | 467 |
| Iờ | 0.5606 | 0.9737 | 0.7115 | 38 |
| Ã đơn | 0.8009 | 0.8894 | 0.8429 | 199 |
| Đt | 0.9435 | 0.9897 | 0.9661 | 878 |
| Đt trừu tượng | 0.7871 | 0.9159 | 0.8467 | 214 |
- Overall Precision: 0.8799
- Overall Recall: 0.9342
- Overall F1: 0.9062
- Overall Accuracy: 0.9619
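As a sanity check, each reported F1 is the harmonic mean of the corresponding precision and recall; a minimal sketch using the values above:

```python
def f1_score(precision: float, recall: float) -> float:
    """F1 is the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Overall: precision 0.8799, recall 0.9342 -> F1 ~ 0.9062 as reported.
print(round(f1_score(0.8799, 0.9342), 4))  # 0.9062

# Per-label example (Tk): precision 96/97, recall 96/116 -> F1 ~ 0.9014.
print(round(f1_score(0.9896907216494846, 0.8275862068965517), 4))  # 0.9014
```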
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
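The linear scheduler above decays the learning rate from 2.5e-05 toward zero over the run. A minimal sketch of that decay, assuming zero warmup steps (none are listed) and the 630 total optimizer steps shown in the results table (63 steps/epoch × 10 epochs):

```python
BASE_LR = 2.5e-5
TOTAL_STEPS = 630  # 63 optimizer steps per epoch * 10 epochs

def linear_lr(step: int, base_lr: float = BASE_LR, total: int = TOTAL_STEPS) -> float:
    """Linearly decay the learning rate from base_lr at step 0 down to 0 at `total`."""
    return base_lr * max(0.0, 1.0 - step / total)

print(linear_lr(0))    # 2.5e-05 at the first step
print(linear_lr(315))  # 1.25e-05 at the midpoint
print(linear_lr(630))  # 0.0 at the end of training
```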
### Training results
| Training Loss | Epoch | Step | Validation Loss | Tk | Gày | Gày trừu tượng | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 63 | 0.1429 | {'precision': 0.9368421052631579, 'recall': 0.7672413793103449, 'f1': 0.8436018957345971, 'number': 116} | {'precision': 0.5625, 'recall': 0.8181818181818182, 'f1': 0.6666666666666666, 'number': 33} | {'precision': 0.9023255813953488, 'recall': 0.8308351177730193, 'f1': 0.8651059085841695, 'number': 467} | {'precision': 0.5423728813559322, 'recall': 0.8421052631578947, 'f1': 0.6597938144329897, 'number': 38} | {'precision': 0.8246445497630331, 'recall': 0.8743718592964824, 'f1': 0.848780487804878, 'number': 199} | {'precision': 0.9419496166484118, 'recall': 0.979498861047836, 'f1': 0.9603573422668901, 'number': 878} | {'precision': 0.885, 'recall': 0.8271028037383178, 'f1': 0.855072463768116, 'number': 214} | 0.8931 | 0.8982 | 0.8957 | 0.9594 |
No log | 2.0 | 126 | 0.1873 | {'precision': 0.8559322033898306, 'recall': 0.8706896551724138, 'f1': 0.8632478632478633, 'number': 116} | {'precision': 0.6136363636363636, 'recall': 0.8181818181818182, 'f1': 0.7012987012987013, 'number': 33} | {'precision': 0.9308641975308642, 'recall': 0.8072805139186295, 'f1': 0.8646788990825689, 'number': 467} | {'precision': 0.5079365079365079, 'recall': 0.8421052631578947, 'f1': 0.6336633663366337, 'number': 38} | {'precision': 0.8, 'recall': 0.8844221105527639, 'f1': 0.8400954653937948, 'number': 199} | {'precision': 0.9375, 'recall': 0.9908883826879271, 'f1': 0.9634551495016611, 'number': 878} | {'precision': 0.8101851851851852, 'recall': 0.8177570093457944, 'f1': 0.813953488372093, 'number': 214} | 0.8816 | 0.9039 | 0.8926 | 0.9577 |
No log | 3.0 | 189 | 0.2133 | {'precision': 0.8244274809160306, 'recall': 0.9310344827586207, 'f1': 0.8744939271255061, 'number': 116} | {'precision': 0.45901639344262296, 'recall': 0.8484848484848485, 'f1': 0.5957446808510639, 'number': 33} | {'precision': 0.8928571428571429, 'recall': 0.8565310492505354, 'f1': 0.8743169398907104, 'number': 467} | {'precision': 0.4864864864864865, 'recall': 0.9473684210526315, 'f1': 0.6428571428571428, 'number': 38} | {'precision': 0.8046511627906977, 'recall': 0.8693467336683417, 'f1': 0.8357487922705314, 'number': 199} | {'precision': 0.9704545454545455, 'recall': 0.9726651480637813, 'f1': 0.9715585893060297, 'number': 878} | {'precision': 0.641566265060241, 'recall': 0.9953271028037384, 'f1': 0.7802197802197801, 'number': 214} | 0.8463 | 0.9316 | 0.8869 | 0.9577 |
No log | 4.0 | 252 | 0.2183 | {'precision': 0.8512396694214877, 'recall': 0.8879310344827587, 'f1': 0.869198312236287, 'number': 116} | {'precision': 0.5185185185185185, 'recall': 0.8484848484848485, 'f1': 0.6436781609195402, 'number': 33} | {'precision': 0.9021739130434783, 'recall': 0.8886509635974305, 'f1': 0.8953613807982741, 'number': 467} | {'precision': 0.5362318840579711, 'recall': 0.9736842105263158, 'f1': 0.6915887850467289, 'number': 38} | {'precision': 0.8341463414634146, 'recall': 0.8592964824120602, 'f1': 0.8465346534653465, 'number': 199} | {'precision': 0.9728813559322034, 'recall': 0.9806378132118451, 'f1': 0.9767441860465115, 'number': 878} | {'precision': 0.7741935483870968, 'recall': 0.897196261682243, 'f1': 0.8311688311688312, 'number': 214} | 0.8849 | 0.9290 | 0.9064 | 0.9621 |
No log | 5.0 | 315 | 0.2603 | {'precision': 0.6530612244897959, 'recall': 0.8275862068965517, 'f1': 0.7300380228136881, 'number': 116} | {'precision': 0.5686274509803921, 'recall': 0.8787878787878788, 'f1': 0.6904761904761905, 'number': 33} | {'precision': 0.8930817610062893, 'recall': 0.9122055674518201, 'f1': 0.902542372881356, 'number': 467} | {'precision': 0.5441176470588235, 'recall': 0.9736842105263158, 'f1': 0.6981132075471699, 'number': 38} | {'precision': 0.7953488372093023, 'recall': 0.8592964824120602, 'f1': 0.8260869565217389, 'number': 199} | {'precision': 0.9248677248677248, 'recall': 0.9954441913439636, 'f1': 0.9588590235874931, 'number': 878} | {'precision': 0.8135593220338984, 'recall': 0.897196261682243, 'f1': 0.8533333333333333, 'number': 214} | 0.8532 | 0.9383 | 0.8937 | 0.9579 |
No log | 6.0 | 378 | 0.2367 | {'precision': 0.8421052631578947, 'recall': 0.8275862068965517, 'f1': 0.8347826086956522, 'number': 116} | {'precision': 0.5714285714285714, 'recall': 0.8484848484848485, 'f1': 0.6829268292682927, 'number': 33} | {'precision': 0.9004329004329005, 'recall': 0.8907922912205567, 'f1': 0.8955866523143164, 'number': 467} | {'precision': 0.5538461538461539, 'recall': 0.9473684210526315, 'f1': 0.6990291262135921, 'number': 38} | {'precision': 0.8349514563106796, 'recall': 0.864321608040201, 'f1': 0.8493827160493828, 'number': 199} | {'precision': 0.9495614035087719, 'recall': 0.9863325740318907, 'f1': 0.9675977653631285, 'number': 878} | {'precision': 0.7677165354330708, 'recall': 0.9112149532710281, 'f1': 0.8333333333333334, 'number': 214} | 0.8773 | 0.9301 | 0.9029 | 0.9621 |
No log | 7.0 | 441 | 0.2529 | {'precision': 0.9142857142857143, 'recall': 0.8275862068965517, 'f1': 0.8687782805429863, 'number': 116} | {'precision': 0.6, 'recall': 0.8181818181818182, 'f1': 0.6923076923076923, 'number': 33} | {'precision': 0.9, 'recall': 0.8865096359743041, 'f1': 0.8932038834951457, 'number': 467} | {'precision': 0.5068493150684932, 'recall': 0.9736842105263158, 'f1': 0.6666666666666667, 'number': 38} | {'precision': 0.7853881278538812, 'recall': 0.864321608040201, 'f1': 0.8229665071770335, 'number': 199} | {'precision': 0.9546460176991151, 'recall': 0.9829157175398633, 'f1': 0.9685746352413019, 'number': 878} | {'precision': 0.7626459143968871, 'recall': 0.9158878504672897, 'f1': 0.832271762208068, 'number': 214} | 0.8749 | 0.9280 | 0.9007 | 0.9610 |
0.0487 | 8.0 | 504 | 0.2506 | {'precision': 0.9795918367346939, 'recall': 0.8275862068965517, 'f1': 0.897196261682243, 'number': 116} | {'precision': 0.574468085106383, 'recall': 0.8181818181818182, 'f1': 0.675, 'number': 33} | {'precision': 0.8991228070175439, 'recall': 0.8779443254817987, 'f1': 0.8884073672806067, 'number': 467} | {'precision': 0.5294117647058824, 'recall': 0.9473684210526315, 'f1': 0.679245283018868, 'number': 38} | {'precision': 0.7981651376146789, 'recall': 0.8743718592964824, 'f1': 0.8345323741007195, 'number': 199} | {'precision': 0.9547960308710033, 'recall': 0.9863325740318907, 'f1': 0.9703081232492997, 'number': 878} | {'precision': 0.7871485943775101, 'recall': 0.9158878504672897, 'f1': 0.8466522678185745, 'number': 214} | 0.8835 | 0.9280 | 0.9052 | 0.9624 |
0.0487 | 9.0 | 567 | 0.2546 | {'precision': 0.9896907216494846, 'recall': 0.8275862068965517, 'f1': 0.9014084507042254, 'number': 116} | {'precision': 0.5625, 'recall': 0.8181818181818182, 'f1': 0.6666666666666666, 'number': 33} | {'precision': 0.9, 'recall': 0.8865096359743041, 'f1': 0.8932038834951457, 'number': 467} | {'precision': 0.5692307692307692, 'recall': 0.9736842105263158, 'f1': 0.7184466019417477, 'number': 38} | {'precision': 0.7729257641921398, 'recall': 0.8894472361809045, 'f1': 0.8271028037383178, 'number': 199} | {'precision': 0.9435396308360477, 'recall': 0.989749430523918, 'f1': 0.9660922734852695, 'number': 878} | {'precision': 0.7862903225806451, 'recall': 0.9112149532710281, 'f1': 0.8441558441558442, 'number': 214} | 0.8777 | 0.9332 | 0.9046 | 0.9613 |
0.0487 | 10.0 | 630 | 0.2545 | {'precision': 0.9896907216494846, 'recall': 0.8275862068965517, 'f1': 0.9014084507042254, 'number': 116} | {'precision': 0.5625, 'recall': 0.8181818181818182, 'f1': 0.6666666666666666, 'number': 33} | {'precision': 0.896328293736501, 'recall': 0.8886509635974305, 'f1': 0.8924731182795699, 'number': 467} | {'precision': 0.5606060606060606, 'recall': 0.9736842105263158, 'f1': 0.7115384615384615, 'number': 38} | {'precision': 0.8009049773755657, 'recall': 0.8894472361809045, 'f1': 0.8428571428571429, 'number': 199} | {'precision': 0.9435396308360477, 'recall': 0.989749430523918, 'f1': 0.9660922734852695, 'number': 878} | {'precision': 0.7871485943775101, 'recall': 0.9158878504672897, 'f1': 0.8466522678185745, 'number': 214} | 0.8799 | 0.9342 | 0.9062 | 0.9619 |
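Note that the reported final metrics come from epoch 10, but by overall F1 the epoch-4 checkpoint (0.9064) slightly edges out the final one (0.9062); the card does not state which checkpoint-selection strategy was used. An illustrative selection over the (epoch, overall F1) pairs transcribed from the table:

```python
# (epoch, overall_f1) pairs from the validation log above.
results = [
    (1, 0.8957), (2, 0.8926), (3, 0.8869), (4, 0.9064), (5, 0.8937),
    (6, 0.9029), (7, 0.9007), (8, 0.9052), (9, 0.9046), (10, 0.9062),
]

# Pick the epoch whose checkpoint maximizes overall F1.
best_epoch, best_f1 = max(results, key=lambda pair: pair[1])
print(best_epoch, best_f1)  # 4 0.9064
```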
### Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
## Model tree for Kudod/roberta-large-ner-ghtk-cs-add-label-new-data-3090-21Aug-2

Base model: FacebookAI/xlm-roberta-large