ucf101_42

This model is a fine-tuned version of MCG-NJU/videomae-large on the ucf101 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3185
  • Accuracy: 0.9345

Model description

More information needed

Intended uses & limitations

More information needed
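
Pending author-provided details, the checkpoint can be used for video classification over the 101 UCF101 action classes. Below is a minimal inference sketch, assuming the repository id from this page (jialicheng/ucf101_videomae-large) and the standard VideoMAE API in transformers; it is an illustration, not code confirmed by the model author.

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

# Checkpoint id taken from this model card; adjust if you host your own copy.
ckpt = "jialicheng/ucf101_videomae-large"
processor = VideoMAEImageProcessor.from_pretrained(ckpt)
model = VideoMAEForVideoClassification.from_pretrained(ckpt)

# VideoMAE consumes fixed-length clips (16 frames by default). Random pixels
# stand in here for 16 decoded RGB frames (H x W x 3, values in 0-255).
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```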

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
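
These settings correspond to the Hugging Face Trainer API. The sketch below reconstructs them as TrainingArguments (valid for Transformers 4.39); the output directory and the checkpoint-selection options are assumptions, not settings stated on this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ucf101_42",              # assumption, taken from the model name
    learning_rate=5e-5,
    per_device_train_batch_size=4,       # train_batch_size: 4
    per_device_eval_batch_size=32,       # eval_batch_size: 32
    gradient_accumulation_steps=8,       # 4 * 8 = total_train_batch_size of 32
    num_train_epochs=20,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",         # the results table logs one eval per epoch
    save_strategy="epoch",
    load_best_model_at_end=True,         # assumption: best checkpoint retained
    metric_for_best_model="accuracy",
)
# The listed Adam settings (betas=(0.9, 0.999), epsilon=1e-8) match the
# Trainer's default AdamW configuration, so no optimizer arguments are needed.
```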

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 298  | 0.5806          | 0.8423   |
| No log        | 2.0   | 596  | 0.5192          | 0.8653   |
| No log        | 3.0   | 894  | 0.4903          | 0.8814   |
| 0.5923        | 4.0   | 1192 | 0.4623          | 0.8901   |
| 0.5923        | 5.0   | 1490 | 0.3949          | 0.9005   |
| 0.5923        | 6.0   | 1788 | 0.4748          | 0.8844   |
| 0.0581        | 7.0   | 2086 | 0.4877          | 0.8820   |
| 0.0581        | 8.0   | 2385 | 0.3976          | 0.9131   |
| 0.0581        | 9.0   | 2683 | 0.3824          | 0.9116   |
| 0.0581        | 10.0  | 2981 | 0.3553          | 0.9171   |
| 0.0221        | 11.0  | 3279 | 0.3557          | 0.9229   |
| 0.0221        | 12.0  | 3577 | 0.3619          | 0.9258   |
| 0.0221        | 13.0  | 3875 | 0.3941          | 0.9214   |
| 0.0112        | 14.0  | 4173 | 0.3989          | 0.9145   |
| 0.0112        | 15.0  | 4471 | 0.3635          | 0.9236   |
| 0.0112        | 16.0  | 4770 | 0.3418          | 0.9285   |
| 0.005         | 17.0  | 5068 | 0.3374          | 0.9261   |
| 0.005         | 18.0  | 5366 | 0.3340          | 0.9333   |
| 0.005         | 19.0  | 5664 | 0.3294          | 0.9338   |
| 0.005         | 19.99 | 5960 | 0.3185          | 0.9345   |
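
The Accuracy column reports top-1 classification accuracy on the evaluation split. With the Trainer, this is typically produced by a compute_metrics hook; the sketch below uses the evaluate library and is an assumption, not necessarily the author's exact code.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # Top-1 accuracy over the evaluation clips.
    predictions = np.argmax(eval_pred.predictions, axis=1)
    return accuracy.compute(predictions=predictions, references=eval_pred.label_ids)
```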

Framework versions

  • Transformers 4.39.3
  • PyTorch 2.2.2+cu118
  • Datasets 2.18.0
  • Tokenizers 0.15.2