hushem_5x_deit_tiny_sgd_00001_fold4

This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.6392
  • Accuracy: 0.2857
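
Below is a minimal inference sketch for this checkpoint. It assumes the model is loadable from the Hub under the repository name hkivancoral/hushem_5x_deit_tiny_sgd_00001_fold4 via the standard transformers auto classes; the input image path is a hypothetical placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "hkivancoral/hushem_5x_deit_tiny_sgd_00001_fold4"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

# Hypothetical input image; replace with a real sample from the evaluation data.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```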

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
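
The sketch below shows one way these hyperparameters could be wired into the Hugging Face Trainer API. It is not the original training script: the dataset path, the existence of a named "validation" split, the per-epoch evaluation strategy, and the preprocessing/collation choices are all assumptions, and the number of labels is taken from whatever the local imagefolder layout provides.

```python
import torch
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

base_model = "facebook/deit-tiny-patch16-224"
processor = AutoImageProcessor.from_pretrained(base_model)

# Hypothetical local layout: data_dir/<split>/<class_name>/<image>.jpg
dataset = load_dataset("imagefolder", data_dir="path/to/hushem_fold4")

def preprocess(batch):
    # Convert PIL images to model-ready pixel_values tensors.
    batch["pixel_values"] = [
        processor(img.convert("RGB"), return_tensors="pt")["pixel_values"][0]
        for img in batch["image"]
    ]
    return batch

dataset = dataset.with_transform(preprocess)

num_labels = len(dataset["train"].features["label"].names)
model = AutoModelForImageClassification.from_pretrained(
    base_model,
    num_labels=num_labels,
    ignore_mismatched_sizes=True,  # replace the 1000-class ImageNet head
)

def collate_fn(examples):
    return {
        "pixel_values": torch.stack([ex["pixel_values"] for ex in examples]),
        "labels": torch.tensor([ex["label"] for ex in examples]),
    }

training_args = TrainingArguments(
    output_dir="hushem_5x_deit_tiny_sgd_00001_fold4",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: per-epoch evaluation, matching the results table
    remove_unused_columns=False,  # keep the "image" column available for the on-the-fly transform
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],  # assumption: a validation split exists under this name
    data_collator=collate_fn,
)
trainer.train()
```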

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---------------|-------|------|-----------------|----------|
| 1.5126 | 1.0 | 28 | 1.6960 | 0.2857 |
| 1.4875 | 2.0 | 56 | 1.6934 | 0.2857 |
| 1.484 | 3.0 | 84 | 1.6908 | 0.2857 |
| 1.5224 | 4.0 | 112 | 1.6883 | 0.2857 |
| 1.464 | 5.0 | 140 | 1.6861 | 0.2857 |
| 1.514 | 6.0 | 168 | 1.6836 | 0.2857 |
| 1.4795 | 7.0 | 196 | 1.6813 | 0.2857 |
| 1.4849 | 8.0 | 224 | 1.6790 | 0.2857 |
| 1.4832 | 9.0 | 252 | 1.6769 | 0.2857 |
| 1.5231 | 10.0 | 280 | 1.6747 | 0.2857 |
| 1.5146 | 11.0 | 308 | 1.6728 | 0.2857 |
| 1.4709 | 12.0 | 336 | 1.6708 | 0.2857 |
| 1.5002 | 13.0 | 364 | 1.6689 | 0.2857 |
| 1.4731 | 14.0 | 392 | 1.6671 | 0.2857 |
| 1.4733 | 15.0 | 420 | 1.6654 | 0.2857 |
| 1.4938 | 16.0 | 448 | 1.6637 | 0.2857 |
| 1.517 | 17.0 | 476 | 1.6621 | 0.2857 |
| 1.4904 | 18.0 | 504 | 1.6604 | 0.2857 |
| 1.4813 | 19.0 | 532 | 1.6589 | 0.2857 |
| 1.4788 | 20.0 | 560 | 1.6576 | 0.2857 |
| 1.476 | 21.0 | 588 | 1.6562 | 0.2857 |
| 1.5095 | 22.0 | 616 | 1.6550 | 0.2857 |
| 1.4801 | 23.0 | 644 | 1.6537 | 0.2857 |
| 1.4778 | 24.0 | 672 | 1.6525 | 0.2857 |
| 1.4526 | 25.0 | 700 | 1.6513 | 0.2857 |
| 1.4781 | 26.0 | 728 | 1.6502 | 0.2857 |
| 1.4923 | 27.0 | 756 | 1.6492 | 0.2857 |
| 1.4977 | 28.0 | 784 | 1.6482 | 0.2857 |
| 1.4558 | 29.0 | 812 | 1.6472 | 0.2857 |
| 1.4785 | 30.0 | 840 | 1.6464 | 0.2857 |
| 1.4962 | 31.0 | 868 | 1.6455 | 0.2857 |
| 1.4638 | 32.0 | 896 | 1.6447 | 0.2857 |
| 1.5095 | 33.0 | 924 | 1.6440 | 0.2857 |
| 1.4775 | 34.0 | 952 | 1.6433 | 0.2857 |
| 1.4595 | 35.0 | 980 | 1.6427 | 0.2857 |
| 1.4656 | 36.0 | 1008 | 1.6421 | 0.2857 |
| 1.4403 | 37.0 | 1036 | 1.6416 | 0.2857 |
| 1.4824 | 38.0 | 1064 | 1.6412 | 0.2857 |
| 1.5225 | 39.0 | 1092 | 1.6408 | 0.2857 |
| 1.4471 | 40.0 | 1120 | 1.6404 | 0.2857 |
| 1.4797 | 41.0 | 1148 | 1.6401 | 0.2857 |
| 1.4455 | 42.0 | 1176 | 1.6399 | 0.2857 |
| 1.4823 | 43.0 | 1204 | 1.6397 | 0.2857 |
| 1.4547 | 44.0 | 1232 | 1.6395 | 0.2857 |
| 1.4676 | 45.0 | 1260 | 1.6394 | 0.2857 |
| 1.4731 | 46.0 | 1288 | 1.6393 | 0.2857 |
| 1.4786 | 47.0 | 1316 | 1.6393 | 0.2857 |
| 1.4805 | 48.0 | 1344 | 1.6392 | 0.2857 |
| 1.5172 | 49.0 | 1372 | 1.6392 | 0.2857 |
| 1.4525 | 50.0 | 1400 | 1.6392 | 0.2857 |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0