---
library_name: transformers
tags:
- generated_from_trainer
model-index:
- name: videomae-diving48-multilabel-finetuned
  results: []
---

# videomae-diving48-multilabel-finetuned

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3969
- F1 Macro: 0.2636
- Precision Macro: 0.2180
- Recall Macro: 0.4145
- Exact Match Ratio: 0.0003
- Hamming Accuracy: 0.7557

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 25544
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | F1 Macro | Precision Macro | Recall Macro | Exact Match Ratio | Hamming Accuracy |
|:-------------:|:------:|:-----:|:---------------:|:--------:|:---------------:|:------------:|:-----------------:|:----------------:|
| 1.3045        | 0.1250 | 3194  | 1.4774          | 0.1609   | 0.1518          | 0.2915       | 0.0               | 0.7280           |
| 0.8881        | 1.1250 | 6388  | 1.4002          | 0.2035   | 0.1952          | 0.3239       | 0.0               | 0.7537           |
| 1.0683        | 2.1250 | 9582  | 1.4017          | 0.2014   | 0.1999          | 0.3470       | 0.0               | 0.7403           |
| 0.7672        | 3.1250 | 12776 | 1.4316          | 0.2280   | 0.1893          | 0.3459       | 0.0022            | 0.7566           |
| 0.8529        | 4.1250 | 15970 | 1.4307          | 0.2333   | 0.2011          | 0.3484       | 0.0003            | 0.7650           |
| 1.0022        | 5.1250 | 19164 | 1.4566          | 0.2367   | 0.2095          | 0.3404       | 0.0005            | 0.7734           |
| 1.3422        | 6.1250 | 22358 | 1.4297          | 0.2602   | 0.2127          | 0.3922       | 0.0005            | 0.7634           |
| 1.0641        | 7.1247 | 25544 | 1.3969          | 0.2636   | 0.2180          | 0.4145       | 0.0003            | 0.7557           |

### Framework versions

- Transformers 4.51.3
- Pytorch 2.1.0+cu118
- Datasets 3.6.0
- Tokenizers 0.21.1
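
The card only lists raw hyperparameter values, so here is a minimal `TrainingArguments` sketch that mirrors them. The `output_dir` and any arguments not listed above (evaluation/save strategy, logging, gradient accumulation, etc.) are assumptions, not details from the original run.

```python
# A sketch of TrainingArguments matching the hyperparameters listed above.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="videomae-diving48-multilabel-finetuned",  # assumed
    learning_rate=2e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=25544,   # training_steps above
    fp16=True,         # "Native AMP" mixed-precision training
)
```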
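The evaluation code is not included in this card; the following is one plausible way to compute the reported multilabel metrics with scikit-learn. The toy `y_true`/`y_pred` arrays are placeholders, and the exact metric definitions used during training may differ.

```python
# A hedged sketch of the multilabel metrics reported above.
import numpy as np
from sklearn.metrics import (
    f1_score, precision_score, recall_score, accuracy_score, hamming_loss
)

# Binary indicator matrices of shape (num_samples, num_labels); toy values.
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])
y_pred = np.array([[1, 0, 0], [0, 1, 0], [1, 0, 1]])

metrics = {
    "f1_macro": f1_score(y_true, y_pred, average="macro", zero_division=0),
    "precision_macro": precision_score(y_true, y_pred, average="macro", zero_division=0),
    "recall_macro": recall_score(y_true, y_pred, average="macro", zero_division=0),
    # Exact match ratio: a sample counts only if every label is correct.
    "exact_match_ratio": accuracy_score(y_true, y_pred),
    # Hamming accuracy: fraction of individual label decisions that are correct.
    "hamming_accuracy": 1.0 - hamming_loss(y_true, y_pred),
}
print(metrics)
```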
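The card also lacks a usage example. Below is a minimal inference sketch, assuming this is a VideoMAE checkpoint with a multilabel classification head; the repo id, frame count, input size, and 0.5 decision threshold are all assumptions.

```python
# Minimal multilabel inference sketch for a VideoMAE classifier.
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

repo_id = "videomae-diving48-multilabel-finetuned"  # hypothetical repo id
processor = VideoMAEImageProcessor.from_pretrained(repo_id)
model = VideoMAEForVideoClassification.from_pretrained(repo_id)

# VideoMAE expects a fixed-length clip (16 frames by default) as a list of
# HxWx3 arrays; random frames stand in for a real video here.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]
inputs = processor(video, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel head: apply a per-label sigmoid and threshold each label
# independently, rather than taking a softmax argmax over classes.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```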