5c_4

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 4.4509
  • Accuracy: 0.48

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 23400
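The learning-rate schedule implied by these settings (linear warmup for the first 10% of 23400 steps, then linear decay to zero) can be sketched as follows. This is an illustrative helper, not code from the training run; the function name and signature are mine:

```python
def linear_lr(step, base_lr=1e-05, total_steps=23_400, warmup_ratio=0.1):
    """Linear warmup to base_lr, then linear decay to 0.

    Mirrors lr_scheduler_type=linear with lr_scheduler_warmup_ratio=0.1
    from the hyperparameter list above (warmup_steps = 2340).
    """
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    # Decay linearly from base_lr at the end of warmup to 0 at total_steps.
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)
```

For example, the rate peaks at 1e-05 exactly at step 2340 and returns to 0 at step 23400.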

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9641        | 0.01  | 234   | 1.4838          | 0.4      |
| 1.5439        | 1.01  | 468   | 3.7125          | 0.4      |
| 1.2944        | 2.01  | 702   | 3.6749          | 0.4      |
| 0.9419        | 3.01  | 936   | 3.0422          | 0.4      |
| 2.4333        | 4.01  | 1170  | 2.6803          | 0.4      |
| 1.4646        | 5.01  | 1404  | 3.5355          | 0.4      |
| 2.1201        | 6.01  | 1638  | 3.0479          | 0.4      |
| 2.9021        | 7.01  | 1872  | 2.8181          | 0.4      |
| 2.1527        | 8.01  | 2106  | 2.7605          | 0.4      |
| 1.9428        | 9.01  | 2340  | 2.4513          | 0.4      |
| 1.6949        | 10.01 | 2574  | 3.2310          | 0.4      |
| 0.7839        | 11.01 | 2808  | 3.2372          | 0.4      |
| 0.3228        | 12.01 | 3042  | 4.4588          | 0.4      |
| 3.5377        | 13.01 | 3276  | 2.8621          | 0.4      |
| 0.509         | 14.01 | 3510  | 2.7460          | 0.4      |
| 0.1437        | 15.01 | 3744  | 2.9698          | 0.4      |
| 1.0039        | 16.01 | 3978  | 1.9415          | 0.44     |
| 0.0062        | 17.01 | 4212  | 3.7041          | 0.4      |
| 0.6038        | 18.01 | 4446  | 3.2141          | 0.4      |
| 1.1687        | 19.01 | 4680  | 2.4072          | 0.44     |
| 0.8397        | 20.01 | 4914  | 3.4212          | 0.4      |
| 1.1147        | 21.01 | 5148  | 2.5115          | 0.44     |
| 0.2286        | 22.01 | 5382  | 2.4343          | 0.44     |
| 0.8939        | 23.01 | 5616  | 3.0712          | 0.4      |
| 0.3871        | 24.01 | 5850  | 3.2394          | 0.4      |
| 0.3649        | 25.01 | 6084  | 3.9466          | 0.44     |
| 1.2601        | 26.01 | 6318  | 2.9586          | 0.44     |
| 0.852         | 27.01 | 6552  | 4.6464          | 0.4      |
| 0.6269        | 28.01 | 6786  | 3.1292          | 0.44     |
| 1.0013        | 29.01 | 7020  | 4.6319          | 0.4      |
| 0.02          | 30.01 | 7254  | 4.2514          | 0.4      |
| 0.1333        | 31.01 | 7488  | 4.3310          | 0.4      |
| 0.0005        | 32.01 | 7722  | 4.5354          | 0.4      |
| 0.004         | 33.01 | 7956  | 4.5970          | 0.4      |
| 0.3017        | 34.01 | 8190  | 4.5879          | 0.44     |
| 0.2014        | 35.01 | 8424  | 4.2809          | 0.4      |
| 0.1573        | 36.01 | 8658  | 4.6822          | 0.44     |
| 0.0041        | 37.01 | 8892  | 5.1673          | 0.4      |
| 0.0001        | 38.01 | 9126  | 5.4005          | 0.4      |
| 0.1066        | 39.01 | 9360  | 4.4509          | 0.48     |
| 0.0001        | 40.01 | 9594  | 5.0906          | 0.44     |
| 1.3235        | 41.01 | 9828  | 4.4093          | 0.48     |
| 0.4313        | 42.01 | 10062 | 4.0898          | 0.48     |
| 0.0002        | 43.01 | 10296 | 4.7817          | 0.44     |
| 0.0001        | 44.01 | 10530 | 4.8667          | 0.48     |
| 0.0007        | 45.01 | 10764 | 4.5619          | 0.48     |
| 0.0009        | 46.01 | 10998 | 5.0250          | 0.44     |
| 0.0001        | 47.01 | 11232 | 4.4129          | 0.48     |
| 0.0001        | 48.01 | 11466 | 5.5987          | 0.44     |
| 0.0003        | 49.01 | 11700 | 5.4567          | 0.44     |
| 0.0468        | 50.01 | 11934 | 5.0218          | 0.48     |
| 0.187         | 51.01 | 12168 | 5.3269          | 0.4      |
| 0.0002        | 52.01 | 12402 | 5.4364          | 0.44     |
| 0.0001        | 53.01 | 12636 | 5.7307          | 0.44     |
| 0.0           | 54.01 | 12870 | 5.9781          | 0.44     |
| 0.0001        | 55.01 | 13104 | 4.8221          | 0.44     |
| 0.0001        | 56.01 | 13338 | 5.5808          | 0.4      |
| 0.0           | 57.01 | 13572 | 5.7662          | 0.44     |
| 0.0001        | 58.01 | 13806 | 5.4463          | 0.44     |
| 0.0021        | 59.01 | 14040 | 5.9576          | 0.44     |
| 0.5042        | 60.01 | 14274 | 5.9419          | 0.4      |
| 0.0053        | 61.01 | 14508 | 5.2977          | 0.48     |
| 0.0           | 62.01 | 14742 | 5.8541          | 0.4      |
| 0.1555        | 63.01 | 14976 | 6.5367          | 0.4      |
| 0.0081        | 64.01 | 15210 | 5.4808          | 0.4      |
| 0.0008        | 65.01 | 15444 | 5.8818          | 0.4      |
| 0.0           | 66.01 | 15678 | 6.4378          | 0.4      |
| 0.0           | 67.01 | 15912 | 5.6597          | 0.4      |
| 0.0           | 68.01 | 16146 | 5.8197          | 0.44     |
| 0.0061        | 69.01 | 16380 | 6.0141          | 0.4      |
| 0.0001        | 70.01 | 16614 | 6.2449          | 0.4      |
| 0.0001        | 71.01 | 16848 | 6.2530          | 0.4      |
| 0.0           | 72.01 | 17082 | 5.7655          | 0.4      |
| 0.0           | 73.01 | 17316 | 6.1521          | 0.4      |
| 0.0           | 74.01 | 17550 | 6.1597          | 0.44     |
| 0.6123        | 75.01 | 17784 | 6.4786          | 0.4      |
| 0.0           | 76.01 | 18018 | 6.5528          | 0.4      |
| 0.0           | 77.01 | 18252 | 5.5426          | 0.44     |
| 0.0           | 78.01 | 18486 | 6.4276          | 0.4      |
| 0.0           | 79.01 | 18720 | 6.8676          | 0.4      |
| 0.0           | 80.01 | 18954 | 6.6693          | 0.4      |
| 0.0           | 81.01 | 19188 | 6.7919          | 0.4      |
| 0.0           | 82.01 | 19422 | 6.7520          | 0.4      |
| 0.0           | 83.01 | 19656 | 6.7565          | 0.4      |
| 0.0           | 84.01 | 19890 | 6.8186          | 0.4      |
| 0.0           | 85.01 | 20124 | 6.5549          | 0.4      |
| 0.0           | 86.01 | 20358 | 6.7223          | 0.4      |
| 0.0           | 87.01 | 20592 | 6.9096          | 0.4      |
| 0.0           | 88.01 | 20826 | 6.9918          | 0.4      |
| 0.0           | 89.01 | 21060 | 7.2247          | 0.4      |
| 0.0001        | 90.01 | 21294 | 7.2267          | 0.4      |
| 0.0           | 91.01 | 21528 | 6.9826          | 0.4      |
| 0.0           | 92.01 | 21762 | 6.6385          | 0.4      |
| 0.792         | 93.01 | 21996 | 6.4020          | 0.4      |
| 0.0           | 94.01 | 22230 | 6.4453          | 0.4      |
| 0.0           | 95.01 | 22464 | 6.9102          | 0.4      |
| 0.0           | 96.01 | 22698 | 6.9262          | 0.4      |
| 0.0           | 97.01 | 22932 | 6.7757          | 0.4      |
| 0.0           | 98.01 | 23166 | 6.8298          | 0.4      |
| 0.0           | 99.01 | 23400 | 6.8317          | 0.4      |
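The headline evaluation result (loss 4.4509, accuracy 0.48) matches the epoch-39 row above rather than the final step, which is consistent with checkpoint selection on best accuracy where only a strict improvement replaces the incumbent. A small illustrative helper (my own code, not from the training run) that reproduces this selection over (step, val_loss, accuracy) rows:

```python
def best_checkpoint(rows):
    """Return the (step, val_loss, accuracy) row with the best accuracy.

    Only a strictly higher accuracy replaces the current best, so the
    earliest checkpoint wins ties — which is why step 9360 (loss 4.4509)
    is reported rather than later 0.48 rows with lower loss.
    """
    best = None
    for step, val_loss, acc in rows:
        if best is None or acc > best[2]:
            best = (step, val_loss, acc)
    return best

# A few rows excerpted from the table above.
rows = [
    (234, 1.4838, 0.4),
    (9126, 5.4005, 0.4),
    (9360, 4.4509, 0.48),
    (9828, 4.4093, 0.48),
    (10062, 4.0898, 0.48),
    (23400, 6.8317, 0.4),
]
```

Note that validation loss keeps climbing after roughly epoch 40 while training loss collapses to ~0, a typical overfitting pattern on a small dataset.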

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0