---
library_name: transformers
license: other
base_model: nvidia/mit-b1
tags:
  - image-segmentation
  - vision
  - generated_from_trainer
model-index:
  - name: segformer-finetuned-tt-2k-b1
    results: []
---

# segformer-finetuned-tt-2k-b1

This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the Saumya-Mundra/text255 dataset. It achieves the following results on the evaluation set:

- Loss: 0.0929
- Mean Iou: 0.4897
- Mean Accuracy: 0.9793
- Overall Accuracy: 0.9793
- Accuracy Text: nan
- Accuracy No Text: 0.9793
- Iou Text: 0.0
- Iou No Text: 0.9793
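The zero IoU on the text class is worth flagging: it suggests the model rarely, if ever, predicts the text class on the evaluation set, and it fully explains the Mean IoU of roughly 0.49, which is just the macro average of the two per-class IoUs. A quick sanity check:

```python
# Per-class IoUs reported on the evaluation set above.
iou_text = 0.0
iou_no_text = 0.9793

# Mean IoU is the unweighted (macro) average over classes,
# so a collapsed class drags it toward half the other class's score.
mean_iou = (iou_text + iou_no_text) / 2
print(mean_iou)
```

The same effect shows up in every epoch of the training results table below: Mean Iou stays pinned at half of Iou No Text.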

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 6e-07
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 2000
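With a polynomial scheduler, the learning rate decays from 6e-07 toward zero over the 2000 training steps. A minimal sketch of that decay, assuming the common default power of 1.0 (i.e. linear decay) and ignoring warmup and the small non-zero final learning rate that the `transformers` scheduler also supports:

```python
def polynomial_lr(step, base_lr=6e-7, total_steps=2000, power=1.0):
    """Polynomial decay from base_lr to 0 over total_steps.

    Sketch only: the transformers polynomial scheduler additionally
    supports warmup steps and a non-zero final lr (lr_end), both
    omitted here for clarity.
    """
    progress = min(step, total_steps) / total_steps
    return base_lr * (1.0 - progress) ** power

print(polynomial_lr(0))      # base learning rate at the start
print(polynomial_lr(1000))   # halfway through training
print(polynomial_lr(2000))   # decayed to zero at the final step
```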

### Training results

| Training Loss | Epoch | Step | Accuracy No Text | Accuracy Text | Iou No Text | Iou Text | Validation Loss | Mean Accuracy | Mean Iou | Overall Accuracy |
|:-------------:|:-----:|:----:|:----------------:|:-------------:|:-----------:|:--------:|:---------------:|:-------------:|:--------:|:----------------:|
| 0.3305        | 1.0   | 125  | 0.9586           | nan           | 0.9586      | 0.0      | 0.1846          | 0.9586        | 0.4793   | 0.9586           |
| 0.2037        | 2.0   | 250  | 0.9706           | nan           | 0.9706      | 0.0      | 0.1322          | 0.9706        | 0.4853   | 0.9706           |
| 0.1534        | 3.0   | 375  | 0.9784           | nan           | 0.9784      | 0.0      | 0.1074          | 0.9784        | 0.4892   | 0.9784           |
| 0.1313        | 4.0   | 500  | 0.9839           | nan           | 0.9839      | 0.0      | 0.0976          | 0.9839        | 0.4920   | 0.9839           |
| 0.1156        | 5.0   | 625  | 0.9799           | nan           | 0.9799      | 0.0      | 0.1001          | 0.9799        | 0.4900   | 0.9799           |
| 0.1123        | 6.0   | 750  | 0.9866           | nan           | 0.9866      | 0.0      | 0.0920          | 0.9866        | 0.4933   | 0.9866           |
| 0.108         | 7.0   | 875  | 0.9815           | nan           | 0.9815      | 0.0      | 0.0946          | 0.9815        | 0.4908   | 0.9815           |
| 0.1017        | 8.0   | 1000 | 0.9805           | nan           | 0.9805      | 0.0      | 0.0943          | 0.9805        | 0.4903   | 0.9805           |
| 0.0994        | 9.0   | 1125 | 0.9808           | nan           | 0.9808      | 0.0      | 0.0927          | 0.9808        | 0.4904   | 0.9808           |
| 0.0926        | 10.0  | 1250 | 0.9783           | nan           | 0.9783      | 0.0      | 0.0957          | 0.9783        | 0.4891   | 0.9783           |
| 0.0907        | 11.0  | 1375 | 0.9830           | nan           | 0.9830      | 0.0      | 0.0913          | 0.9830        | 0.4915   | 0.9830           |
| 0.0893        | 12.0  | 1500 | 0.9838           | nan           | 0.9838      | 0.0      | 0.0893          | 0.9838        | 0.4919   | 0.9838           |
| 0.0853        | 13.0  | 1625 | 0.9804           | nan           | 0.9804      | 0.0      | 0.0913          | 0.9804        | 0.4902   | 0.9804           |
| 0.0834        | 14.0  | 1750 | 0.9820           | nan           | 0.9820      | 0.0      | 0.0899          | 0.9820        | 0.4910   | 0.9820           |
| 0.0861        | 15.0  | 1875 | 0.9815           | nan           | 0.9815      | 0.0      | 0.0902          | 0.9815        | 0.4907   | 0.9815           |
| 0.0803        | 16.0  | 2000 | 0.9793           | nan           | 0.9793      | 0.0      | 0.0929          | 0.9793        | 0.4897   | 0.9793           |

### Framework versions

- Transformers 4.49.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0