---
library_name: transformers
license: mit
base_model: FacebookAI/roberta-base
tags:
  - generated_from_trainer
datasets:
  - sentiment140
metrics:
  - accuracy
  - precision
  - recall
  - f1
model-index:
  - name: roberta-base-sentiment140
    results:
      - task:
          type: text-classification
          name: Text Classification
        dataset:
          name: sentiment140
          type: sentiment140
          config: sentiment140
          split: train
          args: sentiment140
        metrics:
          - type: accuracy
            value: 0.883
            name: Accuracy
          - type: precision
            value: 0.8801652892561983
            name: Precision
          - type: recall
            value: 0.8783505154639175
            name: Recall
          - type: f1
            value: 0.8792569659442725
            name: F1
---

# roberta-base-sentiment140

This model is a fine-tuned version of FacebookAI/roberta-base on the sentiment140 dataset. It achieves the following results on the evaluation set:

- Loss: 0.3988
- Accuracy: 0.883
- Roc Auc: 0.9515
- Precision: 0.8802
- Recall: 0.8784
- F1: 0.8793
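
The reported accuracy, precision, recall, and F1 are all consistent with a single binary confusion matrix. The counts below are hypothetical, inferred by assuming a 2,000-example evaluation set (the card does not state the actual counts); the sketch shows how the four headline metrics relate to each other:

```python
# Hypothetical confusion-matrix counts consistent with the card's reported
# metrics, assuming a 2,000-example evaluation set (not stated in the card).
tp, fp, fn, tn = 852, 116, 118, 914

accuracy = (tp + tn) / (tp + fp + fn + tn)   # fraction of correct predictions
precision = tp / (tp + fp)                   # correct positives / predicted positives
recall = tp / (tp + fn)                      # correct positives / actual positives
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of P and R

print(accuracy, precision, recall, f1)
```

Note that F1 (0.8793) sits between precision (0.8802) and recall (0.8784), as the harmonic mean must.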

## Model description

More information needed
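
A minimal inference sketch with the `transformers` pipeline API. The repo id `ysenarath/roberta-base-sentiment140` is an assumption inferred from the uploader name and card title, and the example text is illustrative:

```python
from transformers import pipeline

# Repo id inferred from the uploader and card title; adjust if the model
# lives under a different namespace.
classifier = pipeline(
    "text-classification",
    model="ysenarath/roberta-base-sentiment140",
)

print(classifier("I love this!"))
```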

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
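
The combination of `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.1` means the learning rate ramps from 0 up to 2e-05 over the first 10% of optimizer steps, then decays linearly back to 0. A plain-Python sketch of this schedule (step counts are illustrative; this mirrors what the Trainer's linear scheduler computes, not its exact code):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_ratio=0.1):
    """Learning rate at a given optimizer step for a linear schedule:
    warm up from 0 to base_lr over the first warmup_ratio of training,
    then decay linearly back to 0 by the final step."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Illustrative 100-step run: the peak lr is hit right as warmup ends.
schedule = [linear_lr(s, 100) for s in range(101)]
```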

### Training results

| Training Loss | Epoch | Step   | Validation Loss | Accuracy | Roc Auc | Precision | Recall | F1     |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:-------:|:---------:|:------:|:------:|
| 0.2864        | 1.0   | 49969  | 0.3030          | 0.777    | 0.9470  | 0.6921    | 0.9732 | 0.8089 |
| 0.255         | 2.0   | 99938  | 0.2872          | 0.885    | 0.9553  | 0.8585    | 0.9134 | 0.8851 |
| 0.239         | 3.0   | 149907 | 0.2921          | 0.881    | 0.9543  | 0.8690    | 0.8887 | 0.8787 |
| 0.2042        | 4.0   | 199876 | 0.3028          | 0.891    | 0.9549  | 0.8821    | 0.8948 | 0.8884 |
| 0.187         | 5.0   | 249845 | 0.3192          | 0.89     | 0.9536  | 0.8788    | 0.8969 | 0.8878 |
| 0.1606        | 6.0   | 299814 | 0.3670          | 0.885    | 0.9514  | 0.8715    | 0.8948 | 0.8830 |
| 0.1343        | 7.0   | 349783 | 0.3988          | 0.883    | 0.9515  | 0.8802    | 0.8784 | 0.8793 |
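
Although `num_epochs` was set to 10, the table stops at epoch 7, and the card's headline evaluation metrics match the epoch-7 row; note, however, that validation loss was lowest at epoch 2. A small sketch of picking the best checkpoint by validation loss from the table's figures (the selection criterion is an assumption, not something the card states):

```python
# Validation losses per epoch, copied from the training-results table above.
val_loss = {1: 0.3030, 2: 0.2872, 3: 0.2921, 4: 0.3028,
            5: 0.3192, 6: 0.3670, 7: 0.3988}

# Hypothetical selection rule: keep the epoch with the lowest validation loss.
best_epoch = min(val_loss, key=val_loss.get)
print(best_epoch, val_loss[best_epoch])
```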

### Framework versions

- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0