---
language:
- en
datasets:
- argilla/twitter-coronavirus
metrics:
- f1
tags:
- generated_from_trainer
---
### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-5
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
- warmup_ratio: 0.1
- weight_decay: 0.01
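The hyperparameters above map directly onto the Hugging Face `TrainingArguments` API. A minimal sketch, assuming the standard `transformers` `Trainer` workflow (the `output_dir` path is a placeholder, and the Adam betas/epsilon listed above are the optimizer defaults):

```python
from transformers import TrainingArguments

# Sketch only: reconstructs the run configuration listed above.
training_args = TrainingArguments(
    output_dir="./results",           # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    fp16=True,                        # Native AMP mixed-precision training
    warmup_ratio=0.1,
    weight_decay=0.01,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults
)
```

Passing this object to a `Trainer` together with the model, datasets, and a metric function reproduces the training setup described here.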
### Training results
| Training Loss | Epoch | Validation Loss | F1 | F1 Macro |
|---|---|---|---|---|
| 1.3957 | 1.0 | 1.0134 | 0.242860 | 0.124580 |
| 0.8715 | 2.0 | 0.6892 | 0.243673 | 0.113322 |
| 0.6085 | 3.0 | 0.4943 | 0.319262 | 0.191744 |
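The table reports both F1 and macro F1: macro F1 averages the per-class F1 scores with equal weight per class, so it drops well below the plain F1 when minority classes are predicted poorly, as seen above. A minimal sketch of the computation in pure Python (the label values are illustrative, not from the dataset):

```python
def f1_per_class(y_true, y_pred, label):
    """F1 score for a single class, treated as a one-vs-rest problem."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores."""
    labels = sorted(set(y_true) | set(y_pred))
    return sum(f1_per_class(y_true, y_pred, l) for l in labels) / len(labels)

# Toy example: class 2 is recalled poorly, dragging the macro average down.
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 0, 1, 0, 2, 1]
print(round(macro_f1(y_true, y_pred), 4))
```

This matches `sklearn.metrics.f1_score(..., average="macro")`; the plain "F1" column in the table is presumably a differently averaged variant (e.g. weighted), which the training script would specify.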