# swin-large-mask2former-finetuned-Mitoliked-Mito-Innertongue
This model is a fine-tuned version of [Dnq2025/swin-large-mask2former-finetuned-ER-Mito-LD8](https://huggingface.co/Dnq2025/swin-large-mask2former-finetuned-ER-Mito-LD8) on the [Dnq2025/Mask2former_Finetune](https://huggingface.co/datasets/Dnq2025/Mask2former_Finetune) dataset. It achieves the following results on the evaluation set:

- Mean IoU: 0.4849
- Loss: 84.8457
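
For reference, the checkpoint can be loaded through the standard Mask2Former semantic-segmentation interface in 🤗 Transformers. The sketch below is a minimal, unofficial example: the repository id comes from this card, but the input filename is a placeholder for an image of your own.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

repo = "Dnq2025/swin-large-mask2former-finetuned-Mitoliked-Mito-Innertongue"
processor = AutoImageProcessor.from_pretrained(repo)
model = Mask2FormerForUniversalSegmentation.from_pretrained(repo)
model.eval()

# Placeholder input: any RGB (or converted grayscale) micrograph.
image = Image.open("slice.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Merge the predicted mask/class pairs into one per-pixel label map,
# resized to the input resolution (PIL size is (W, H), hence [::-1]).
label_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]
print(label_map.shape, label_map.unique())
```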

## Model description
More information needed

## Intended uses & limitations
More information needed

## Training and evaluation data
More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 3225
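
The training script itself is not part of this card; as an illustration, the hyperparameters above map onto 🤗 Transformers `TrainingArguments` roughly as follows (the `output_dir` and the evaluation cadence are assumptions, the latter inferred from the 100-step intervals in the results table below):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="swin-large-mask2former-finetuned-Mitoliked-Mito-Innertongue",  # placeholder
    learning_rate=4e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=1337,
    optim="adamw_torch",            # AdamW with betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="polynomial",
    max_steps=3225,
    eval_strategy="steps",          # assumption: the table reports metrics every 100 steps
    eval_steps=100,
)
```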

### Training results
| Training Loss | Epoch    | Step | Mean IoU | Validation Loss |
|:-------------:|:--------:|:----:|:--------:|:---------------:|
| No log        | 16.6667  | 100  | 0.4737   | 34.6305         |
| 20.404        | 33.3333  | 200  | 0.4715   | 42.0013         |
| 10.0702       | 50.0     | 300  | 0.4774   | 46.1805         |
| 10.0702       | 66.6667  | 400  | 0.4821   | 50.9460         |
| 9.0859        | 83.3333  | 500  | 0.4920   | 48.5783         |
| 8.6094        | 100.0    | 600  | 0.4749   | 52.5121         |
| 8.6094        | 116.6667 | 700  | 0.4865   | 49.8120         |
| 8.4671        | 133.3333 | 800  | 0.4812   | 56.1730         |
| 8.3066        | 150.0    | 900  | 0.4934   | 60.8687         |
| 8.3066        | 166.6667 | 1000 | 0.4916   | 59.8411         |
| 8.1972        | 183.3333 | 1100 | 0.4916   | 63.6835         |
| 8.0926        | 200.0    | 1200 | 0.4905   | 60.2863         |
| 8.0926        | 216.6667 | 1300 | 0.4886   | 63.5495         |
| 8.0059        | 233.3333 | 1400 | 0.4924   | 65.0410         |
| 7.974         | 250.0    | 1500 | 0.4900   | 66.5468         |
| 7.974         | 266.6667 | 1600 | 0.4902   | 67.8736         |
| 8.0177        | 283.3333 | 1700 | 0.4851   | 68.4019         |
| 7.9356        | 300.0    | 1800 | 0.4894   | 74.3255         |
| 7.9356        | 316.6667 | 1900 | 0.4900   | 73.2780         |
| 7.8784        | 333.3333 | 2000 | 0.4899   | 75.2934         |
| 7.8445        | 350.0    | 2100 | 0.4878   | 77.9780         |
| 7.8445        | 366.6667 | 2200 | 0.4868   | 76.0033         |
| 7.7889        | 383.3333 | 2300 | 0.4879   | 78.4704         |
| 7.7828        | 400.0    | 2400 | 0.4860   | 77.1425         |
| 7.7828        | 416.6667 | 2500 | 0.4915   | 78.8530         |
| 7.7601        | 433.3333 | 2600 | 0.4875   | 81.9598         |
| 7.738         | 450.0    | 2700 | 0.4865   | 79.7205         |
| 7.738         | 466.6667 | 2800 | 0.4866   | 79.9854         |
| 7.7139        | 483.3333 | 2900 | 0.4869   | 82.6984         |
| 7.7043        | 500.0    | 3000 | 0.4847   | 81.8537         |
| 7.7043        | 516.6667 | 3100 | 0.4855   | 81.6806         |
| 7.6982        | 533.3333 | 3200 | 0.4850   | 85.2813         |

### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0