smids_3x_deit_tiny_sgd_001_fold4

This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3792
  • Accuracy: 0.845

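For reference, below is a minimal inference sketch using the standard transformers image-classification API. The repository id is taken from this page, and the image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repository id as given on this card; replace the image path with your own file.
model_id = "hkivancoral/smids_3x_deit_tiny_sgd_001_fold4"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```
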
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50

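The original training script is not included in this card, but the hyperparameters above map onto Hugging Face TrainingArguments roughly as in the sketch below. The output directory is a placeholder, and evaluation_strategy="epoch" is an assumption inferred from the per-epoch rows in the results table that follows.

```python
from transformers import TrainingArguments

# Hedged sketch mirroring the hyperparameters listed above; not the original script.
training_args = TrainingArguments(
    output_dir="smids_3x_deit_tiny_sgd_001_fold4",  # placeholder output path
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch eval rows below
    logging_strategy="epoch",
)
# The Trainer's default optimizer settings (betas=(0.9, 0.999), epsilon=1e-08)
# match the values listed above.
```
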
Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9374        | 1.0   | 225   | 0.9137          | 0.535    |
| 0.637         | 2.0   | 450   | 0.7038          | 0.6933   |
| 0.528         | 3.0   | 675   | 0.5838          | 0.745    |
| 0.4807        | 4.0   | 900   | 0.5174          | 0.7967   |
| 0.4652        | 5.0   | 1125  | 0.4857          | 0.8033   |
| 0.3253        | 6.0   | 1350  | 0.4581          | 0.8083   |
| 0.3966        | 7.0   | 1575  | 0.4373          | 0.8233   |
| 0.4286        | 8.0   | 1800  | 0.4298          | 0.81     |
| 0.368         | 9.0   | 2025  | 0.4134          | 0.8233   |
| 0.3027        | 10.0  | 2250  | 0.4061          | 0.8417   |
| 0.3611        | 11.0  | 2475  | 0.4002          | 0.8233   |
| 0.3327        | 12.0  | 2700  | 0.3930          | 0.8483   |
| 0.2879        | 13.0  | 2925  | 0.3875          | 0.835    |
| 0.2362        | 14.0  | 3150  | 0.3833          | 0.845    |
| 0.2472        | 15.0  | 3375  | 0.3854          | 0.85     |
| 0.2412        | 16.0  | 3600  | 0.3772          | 0.8483   |
| 0.2474        | 17.0  | 3825  | 0.3765          | 0.8433   |
| 0.2597        | 18.0  | 4050  | 0.3736          | 0.8467   |
| 0.2836        | 19.0  | 4275  | 0.3755          | 0.85     |
| 0.19          | 20.0  | 4500  | 0.3722          | 0.8467   |
| 0.2408        | 21.0  | 4725  | 0.3732          | 0.8483   |
| 0.2395        | 22.0  | 4950  | 0.3728          | 0.8433   |
| 0.2175        | 23.0  | 5175  | 0.3773          | 0.8417   |
| 0.2295        | 24.0  | 5400  | 0.3700          | 0.8517   |
| 0.2109        | 25.0  | 5625  | 0.3716          | 0.8467   |
| 0.1922        | 26.0  | 5850  | 0.3714          | 0.8517   |
| 0.1883        | 27.0  | 6075  | 0.3724          | 0.8517   |
| 0.1939        | 28.0  | 6300  | 0.3739          | 0.85     |
| 0.1633        | 29.0  | 6525  | 0.3736          | 0.8567   |
| 0.1934        | 30.0  | 6750  | 0.3747          | 0.8533   |
| 0.1648        | 31.0  | 6975  | 0.3737          | 0.8467   |
| 0.1378        | 32.0  | 7200  | 0.3775          | 0.845    |
| 0.2214        | 33.0  | 7425  | 0.3749          | 0.85     |
| 0.1912        | 34.0  | 7650  | 0.3749          | 0.845    |
| 0.1512        | 35.0  | 7875  | 0.3732          | 0.8483   |
| 0.2352        | 36.0  | 8100  | 0.3763          | 0.845    |
| 0.2012        | 37.0  | 8325  | 0.3753          | 0.845    |
| 0.1968        | 38.0  | 8550  | 0.3756          | 0.85     |
| 0.1845        | 39.0  | 8775  | 0.3769          | 0.8483   |
| 0.215         | 40.0  | 9000  | 0.3781          | 0.8483   |
| 0.1756        | 41.0  | 9225  | 0.3779          | 0.8483   |
| 0.1667        | 42.0  | 9450  | 0.3792          | 0.8467   |
| 0.1672        | 43.0  | 9675  | 0.3791          | 0.8467   |
| 0.2264        | 44.0  | 9900  | 0.3791          | 0.8467   |
| 0.1992        | 45.0  | 10125 | 0.3783          | 0.8483   |
| 0.1804        | 46.0  | 10350 | 0.3789          | 0.8483   |
| 0.177         | 47.0  | 10575 | 0.3795          | 0.8483   |
| 0.2026        | 48.0  | 10800 | 0.3789          | 0.8467   |
| 0.1973        | 49.0  | 11025 | 0.3793          | 0.8483   |
| 0.1685        | 50.0  | 11250 | 0.3792          | 0.845    |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.1+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2