# timesformer-base-finetuned-k400-finetuned-yt_short_classification-3
This model is a fine-tuned version of facebook/timesformer-base-finetuned-k400 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.4771
- Accuracy: 0.8724
- 0 Precision: 0.8472
- 0 Recall: 0.8938
- 0 F1-score: 0.8699
- 0 Support: 24395.0
- 1 Precision: 0.8979
- 1 Recall: 0.8528
- 1 F1-score: 0.8748
- 1 Support: 26720.0
- Accuracy F1-score: 0.8724
- Macro avg Precision: 0.8726
- Macro avg Recall: 0.8733
- Macro avg F1-score: 0.8723
- Macro avg Support: 51115.0
- Weighted avg Precision: 0.8737
- Weighted avg Recall: 0.8724
- Weighted avg F1-score: 0.8725
- Weighted avg Support: 51115.0
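The macro and weighted averages above follow directly from the per-class numbers; a minimal sketch of the arithmetic (pure Python, values copied from the list above):

```python
# Per-class evaluation metrics reported above (class 0 and class 1).
p0, r0, n0 = 0.8472, 0.8938, 24395
p1, r1, n1 = 0.8979, 0.8528, 26720

# F1 is the harmonic mean of precision and recall.
f0 = 2 * p0 * r0 / (p0 + r0)                  # ~0.8699 (matches "0 F1-score")
# Macro average: unweighted mean over the two classes.
macro_p = (p0 + p1) / 2                       # ~0.8726
# Weighted average: each class weighted by its support.
weighted_p = (p0 * n0 + p1 * n1) / (n0 + n1)  # ~0.8737
```

Because the two supports (24,395 vs 26,720) are close, the macro and weighted averages differ only in the fourth decimal place.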
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 39620
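With `lr_scheduler_warmup_ratio: 0.1` over 39,620 training steps, a linear schedule warms the learning rate up for the first 3,962 steps and then decays it linearly to zero. A minimal sketch of that schedule (mirroring, not calling, the `transformers` linear scheduler):

```python
TRAINING_STEPS = 39620
WARMUP_STEPS = int(0.1 * TRAINING_STEPS)  # warmup_ratio * training_steps = 3962
BASE_LR = 5e-5

def lr_at(step: int) -> float:
    """Learning rate at a given step: linear warmup, then linear decay to 0."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / WARMUP_STEPS
    return BASE_LR * (TRAINING_STEPS - step) / (TRAINING_STEPS - WARMUP_STEPS)
```

The peak rate of 5e-05 is reached at step 3,962 and falls back to zero at step 39,620, the final training step shown in the results table.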
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | 0 Precision | 0 Recall | 0 F1-score | 0 Support | 1 Precision | 1 Recall | 1 F1-score | 1 Support | Accuracy F1-score | Macro avg Precision | Macro avg Recall | Macro avg F1-score | Macro avg Support | Weighted avg Precision | Weighted avg Recall | Weighted avg F1-score | Weighted avg Support |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.6564 | 0.0500 | 1982 | 0.4807 | 0.7684 | 0.7250 | 0.8294 | 0.7737 | 24395.0 | 0.8206 | 0.7128 | 0.7629 | 26720.0 | 0.7684 | 0.7728 | 0.7711 | 0.7683 | 51115.0 | 0.7750 | 0.7684 | 0.7680 | 51115.0 |
0.6013 | 1.0500 | 3964 | 0.5336 | 0.7612 | 0.7301 | 0.7929 | 0.7602 | 24395.0 | 0.7948 | 0.7323 | 0.7623 | 26720.0 | 0.7612 | 0.7624 | 0.7626 | 0.7612 | 51115.0 | 0.7639 | 0.7612 | 0.7613 | 51115.0 |
0.4629 | 2.0500 | 5946 | 0.5388 | 0.7692 | 0.7280 | 0.8244 | 0.7732 | 24395.0 | 0.8176 | 0.7188 | 0.7650 | 26720.0 | 0.7692 | 0.7728 | 0.7716 | 0.7691 | 51115.0 | 0.7748 | 0.7692 | 0.7689 | 51115.0 |
0.6739 | 3.0500 | 7928 | 0.4304 | 0.8098 | 0.7891 | 0.8207 | 0.8046 | 24395.0 | 0.8301 | 0.7998 | 0.8147 | 26720.0 | 0.8098 | 0.8096 | 0.8102 | 0.8096 | 51115.0 | 0.8105 | 0.8098 | 0.8099 | 51115.0 |
0.2837 | 4.0500 | 9910 | 0.5067 | 0.8136 | 0.7818 | 0.8455 | 0.8124 | 24395.0 | 0.8476 | 0.7845 | 0.8148 | 26720.0 | 0.8136 | 0.8147 | 0.8150 | 0.8136 | 51115.0 | 0.8162 | 0.8136 | 0.8136 | 51115.0 |
0.6485 | 5.0500 | 11892 | 0.5121 | 0.8072 | 0.8035 | 0.7890 | 0.7962 | 24395.0 | 0.8105 | 0.8238 | 0.8171 | 26720.0 | 0.8072 | 0.8070 | 0.8064 | 0.8066 | 51115.0 | 0.8071 | 0.8072 | 0.8071 | 51115.0 |
0.5415 | 6.0500 | 13874 | 0.8758 | 0.6895 | 0.6096 | 0.9716 | 0.7492 | 24395.0 | 0.9434 | 0.4320 | 0.5926 | 26720.0 | 0.6895 | 0.7765 | 0.7018 | 0.6709 | 51115.0 | 0.7841 | 0.6895 | 0.6673 | 51115.0 |
0.3286 | 7.0500 | 15856 | 0.5110 | 0.8262 | 0.8450 | 0.7788 | 0.8105 | 24395.0 | 0.8115 | 0.8695 | 0.8395 | 26720.0 | 0.8262 | 0.8282 | 0.8242 | 0.8250 | 51115.0 | 0.8275 | 0.8262 | 0.8257 | 51115.0 |
0.3357 | 8.0500 | 17838 | 0.4913 | 0.8278 | 0.8414 | 0.7876 | 0.8136 | 24395.0 | 0.8168 | 0.8644 | 0.8399 | 26720.0 | 0.8278 | 0.8291 | 0.8260 | 0.8268 | 51115.0 | 0.8285 | 0.8278 | 0.8274 | 51115.0 |
0.3095 | 9.0500 | 19820 | 0.5020 | 0.8467 | 0.8483 | 0.8267 | 0.8374 | 24395.0 | 0.8454 | 0.8650 | 0.8551 | 26720.0 | 0.8467 | 0.8468 | 0.8458 | 0.8462 | 51115.0 | 0.8468 | 0.8467 | 0.8466 | 51115.0 |
0.6872 | 10.0500 | 21802 | 0.6839 | 0.7834 | 0.7049 | 0.9395 | 0.8055 | 24395.0 | 0.9207 | 0.6409 | 0.7558 | 26720.0 | 0.7834 | 0.8128 | 0.7902 | 0.7806 | 51115.0 | 0.8177 | 0.7834 | 0.7795 | 51115.0 |
0.2417 | 11.0500 | 23784 | 0.7490 | 0.8001 | 0.7235 | 0.9408 | 0.8180 | 24395.0 | 0.9256 | 0.6717 | 0.7784 | 26720.0 | 0.8001 | 0.8245 | 0.8063 | 0.7982 | 51115.0 | 0.8291 | 0.8001 | 0.7973 | 51115.0 |
0.6484 | 12.0500 | 25766 | 0.4507 | 0.8448 | 0.8540 | 0.8139 | 0.8335 | 24395.0 | 0.8371 | 0.8730 | 0.8547 | 26720.0 | 0.8448 | 0.8456 | 0.8435 | 0.8441 | 51115.0 | 0.8452 | 0.8448 | 0.8446 | 51115.0 |
0.4147 | 13.0500 | 27748 | 0.4223 | 0.8620 | 0.8307 | 0.8927 | 0.8606 | 24395.0 | 0.8949 | 0.8339 | 0.8633 | 26720.0 | 0.8620 | 0.8628 | 0.8633 | 0.8620 | 51115.0 | 0.8643 | 0.8620 | 0.8620 | 51115.0 |
0.6485 | 14.0500 | 29730 | 0.4759 | 0.8548 | 0.8523 | 0.8416 | 0.8470 | 24395.0 | 0.8571 | 0.8669 | 0.8619 | 26720.0 | 0.8548 | 0.8547 | 0.8543 | 0.8545 | 51115.0 | 0.8548 | 0.8548 | 0.8548 | 51115.0 |
0.3193 | 15.0500 | 31712 | 0.5955 | 0.8311 | 0.7551 | 0.9561 | 0.8438 | 24395.0 | 0.9471 | 0.7170 | 0.8161 | 26720.0 | 0.8311 | 0.8511 | 0.8365 | 0.8300 | 51115.0 | 0.8555 | 0.8311 | 0.8293 | 51115.0 |
0.4384 | 16.0500 | 33694 | 0.4914 | 0.8567 | 0.8308 | 0.8788 | 0.8541 | 24395.0 | 0.8832 | 0.8366 | 0.8592 | 26720.0 | 0.8567 | 0.8570 | 0.8577 | 0.8567 | 51115.0 | 0.8582 | 0.8567 | 0.8568 | 51115.0 |
0.2316 | 17.0500 | 35676 | 0.4951 | 0.8621 | 0.8282 | 0.8971 | 0.8613 | 24395.0 | 0.8983 | 0.8301 | 0.8629 | 26720.0 | 0.8621 | 0.8633 | 0.8636 | 0.8621 | 51115.0 | 0.8649 | 0.8621 | 0.8621 | 51115.0 |
0.3014 | 18.0500 | 37658 | 0.5001 | 0.8654 | 0.8245 | 0.9122 | 0.8662 | 24395.0 | 0.9113 | 0.8227 | 0.8647 | 26720.0 | 0.8654 | 0.8679 | 0.8675 | 0.8654 | 51115.0 | 0.8698 | 0.8654 | 0.8654 | 51115.0 |
0.2855 | 19.0495 | 39620 | 0.4771 | 0.8724 | 0.8472 | 0.8938 | 0.8699 | 24395.0 | 0.8979 | 0.8528 | 0.8748 | 26720.0 | 0.8724 | 0.8726 | 0.8733 | 0.8723 | 51115.0 | 0.8737 | 0.8724 | 0.8725 | 51115.0 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.0.0+cu117
- Datasets 3.1.0
- Tokenizers 0.20.3