swinv2-small-patch4-window16-256

This model is a fine-tuned version of microsoft/swinv2-small-patch4-window16-256 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2761
  • Accuracy: 1.0
  • Precision: 1.0
  • Sensitivity: 1.0
  • Specificity: 1.0
  • F1: 1.0
  • AUC: 1.0
  • MCC: 1.0
  • Youden's J statistic: 1.0
  • Confusion matrix: [[150, 0], [0, 30]]
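
Assuming the confusion matrices follow scikit-learn's `[[TN, FP], [FN, TP]]` layout (which is consistent with the reported numbers), the listed metrics can be recomputed from the confusion matrix alone; a minimal sketch:

```python
import math

def metrics_from_cm(cm):
    # cm assumed in sklearn layout: [[TN, FP], [FN, TP]]
    (tn, fp), (fn, tp) = cm
    acc = (tp + tn) / (tp + tn + fp + fn)
    prec = tp / (tp + fp)
    sens = tp / (tp + fn)                 # recall / sensitivity
    spec = tn / (tn + fp)
    f1 = 2 * prec * sens / (prec + sens)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom     # Matthews correlation coefficient
    j = sens + spec - 1                   # Youden's J statistic
    return {"accuracy": acc, "precision": prec, "sensitivity": sens,
            "specificity": spec, "f1": f1, "mcc": mcc, "j_stat": j}

print(metrics_from_cm([[150, 0], [0, 30]]))
```

For the reported evaluation matrix `[[150, 0], [0, 30]]` every metric comes out to 1.0, matching the values above.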

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6.197470303061579e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.006309410963953089
  • num_epochs: 10
  • label_smoothing_factor: 0.15696760193504863
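
These values imply 1,880 optimizer steps in total (188 per epoch over 10 epochs, as the results table below shows) and roughly 11 warmup steps. A minimal sketch of the linear warmup + linear decay schedule implied by these hyperparameters (an illustration, not the exact `transformers` Trainer internals):

```python
# Reconstruction of the linear schedule implied by the hyperparameters above.
LEARNING_RATE = 6.197470303061579e-05
TOTAL_STEPS = 1880                                      # 188 steps/epoch * 10 epochs
WARMUP_STEPS = int(TOTAL_STEPS * 0.006309410963953089)  # -> 11 warmup steps

def lr_at(step: int) -> float:
    """Learning rate at a given optimizer step under linear warmup/decay."""
    if step < WARMUP_STEPS:
        return LEARNING_RATE * step / WARMUP_STEPS      # ramp up from 0 to the peak
    # decay linearly from the peak back to 0 at the final step
    return LEARNING_RATE * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))

print(lr_at(0), lr_at(WARMUP_STEPS), lr_at(TOTAL_STEPS))
```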

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Sensitivity | Specificity | F1 | AUC | MCC | Youden's J | Confusion Matrix |
|---------------|-------|------|-----------------|----------|-----------|-------------|-------------|----|-----|-----|------------|------------------|
| 0.3854 | 1.0 | 188 | 0.3930 | 0.9261 | 0.9342 | 0.9462 | 0.9191 | 0.9279 | 0.9713 | 0.8235 | 0.8652 | [[1022, 90], [21, 369]] |
| 0.3667 | 2.0 | 376 | 0.3353 | 0.9660 | 0.9665 | 0.8872 | 0.9937 | 0.9655 | 0.9881 | 0.9109 | 0.8809 | [[1105, 7], [44, 346]] |
| 0.3275 | 3.0 | 564 | 0.3080 | 0.9827 | 0.9827 | 0.9513 | 0.9937 | 0.9826 | 0.9939 | 0.9547 | 0.9450 | [[1105, 7], [19, 371]] |
| 0.3221 | 4.0 | 752 | 0.3034 | 0.9847 | 0.9848 | 0.9487 | 0.9973 | 0.9846 | 0.9963 | 0.9600 | 0.9460 | [[1109, 3], [20, 370]] |
| 0.3039 | 5.0 | 940 | 0.2990 | 0.9860 | 0.9860 | 0.9590 | 0.9955 | 0.9860 | 0.9985 | 0.9635 | 0.9545 | [[1107, 5], [16, 374]] |
| 0.3032 | 6.0 | 1128 | 0.2912 | 0.9907 | 0.9907 | 0.9744 | 0.9964 | 0.9907 | 0.9995 | 0.9757 | 0.9708 | [[1108, 4], [10, 380]] |
| 0.2804 | 7.0 | 1316 | 0.2882 | 0.9913 | 0.9914 | 0.9744 | 0.9973 | 0.9913 | 0.9999 | 0.9774 | 0.9717 | [[1109, 3], [10, 380]] |
| 0.2941 | 8.0 | 1504 | 0.2824 | 0.9947 | 0.9947 | 0.9821 | 0.9991 | 0.9947 | 0.9999 | 0.9861 | 0.9812 | [[1111, 1], [7, 383]] |
| 0.2904 | 9.0 | 1692 | 0.2797 | 0.9960 | 0.9960 | 0.9872 | 0.9991 | 0.9960 | 1.0000 | 0.9896 | 0.9863 | [[1111, 1], [5, 385]] |
| 0.2754 | 10.0 | 1880 | 0.2788 | 0.9967 | 0.9967 | 0.9872 | 1.0 | 0.9967 | 1.0000 | 0.9913 | 0.9872 | [[1112, 0], [5, 385]] |

Framework versions

  • Transformers 4.52.4
  • Pytorch 2.6.0+cu124
  • Datasets 4.0.0
  • Tokenizers 0.21.2