custom-object-masking_v4-windows

This model is a fine-tuned version of nvidia/mit-b0 on the sungile/custom-object-masking_v4_windows dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0683
  • Mean Iou: 0.8920
  • Mean Accuracy: 0.9387
  • Overall Accuracy: 0.9781
  • Accuracy Unknown: nan
  • Accuracy Background: 0.9894
  • Accuracy Windows: 0.9326
  • Accuracy Doors: 0.8941
  • Iou Unknown: nan
  • Iou Background: 0.9766
  • Iou Windows: 0.8664
  • Iou Doors: 0.8330
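The checkpoint can be used with the standard transformers SegFormer API. Below is a minimal inference sketch: the `logits_to_mask` helper is our own illustration (SegFormer outputs logits at 1/4 of the input resolution, so they must be upsampled before taking the argmax), and the `facade.jpg` filename is a placeholder. Downloading the checkpoint requires network access, so that part is shown as comments.

```python
import torch
from torch import nn

def logits_to_mask(logits: torch.Tensor, target_size) -> torch.Tensor:
    """Upsample SegFormer logits (B, C, H/4, W/4) to the image size and take the
    per-pixel argmax, yielding a (B, H, W) tensor of class indices."""
    upsampled = nn.functional.interpolate(
        logits, size=target_size, mode="bilinear", align_corners=False
    )
    return upsampled.argmax(dim=1)

# Typical usage (requires network access to fetch the checkpoint):
#
# from transformers import AutoImageProcessor, SegformerForSemanticSegmentation
# from PIL import Image
#
# repo = "sungile/custom-object-masking_v4-windows"
# processor = AutoImageProcessor.from_pretrained(repo)
# model = SegformerForSemanticSegmentation.from_pretrained(repo)
# image = Image.open("facade.jpg")  # illustrative filename
# inputs = processor(images=image, return_tensors="pt")
# with torch.no_grad():
#     logits = model(**inputs).logits
# mask = logits_to_mask(logits, image.size[::-1])  # classes: unknown/background/windows/doors
```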

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 13
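The list above maps onto the Hugging Face `TrainingArguments` roughly as follows. This is a config sketch, not the exact training script: `output_dir` is illustrative, and the betas/epsilon shown are the `adamw_torch` defaults, which match the values listed.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters above; output_dir is illustrative.
training_args = TrainingArguments(
    output_dir="custom-object-masking_v4-windows",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",          # betas=(0.9, 0.999), eps=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=13,
)
```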

Training results

Training Loss Epoch Step Validation Loss Mean Iou Mean Accuracy Overall Accuracy Accuracy Unknown Accuracy Background Accuracy Windows Accuracy Doors Iou Unknown Iou Background Iou Windows Iou Doors
1.1123 0.0889 20 1.2340 0.3693 0.7755 0.7972 nan 0.8019 0.7161 0.8085 0.0 0.7971 0.3814 0.2986
0.8576 0.1778 40 0.8315 0.4604 0.7890 0.8964 nan 0.9248 0.6823 0.7599 0.0 0.9086 0.5196 0.4135
0.6731 0.2667 60 0.6788 0.4918 0.8207 0.9163 nan 0.9450 0.8512 0.6659 0.0 0.9246 0.5809 0.4618
0.6596 0.3556 80 0.5392 0.6508 0.8229 0.9103 nan 0.9352 0.7984 0.7351 nan 0.9185 0.5802 0.4539
0.6217 0.4444 100 0.4434 0.6832 0.8085 0.9291 nan 0.9635 0.7790 0.6831 nan 0.9383 0.6202 0.4909
0.4555 0.5333 120 0.3915 0.6932 0.8107 0.9327 nan 0.9688 0.8269 0.6364 nan 0.9405 0.6506 0.4885
0.4429 0.6222 140 0.3746 0.6948 0.8375 0.9266 nan 0.9516 0.8020 0.7591 nan 0.9344 0.6549 0.4951
0.3978 0.7111 160 0.3440 0.6987 0.8443 0.9318 nan 0.9585 0.8851 0.6895 nan 0.9398 0.6426 0.5138
0.5 0.8 180 0.3016 0.7152 0.8231 0.9397 nan 0.9738 0.8225 0.6731 nan 0.9468 0.6642 0.5345
0.2722 0.8889 200 0.3163 0.7193 0.8670 0.9353 nan 0.9553 0.8699 0.7760 nan 0.9423 0.6688 0.5467
0.327 0.9778 220 0.2595 0.7225 0.8337 0.9399 nan 0.9701 0.8042 0.7270 nan 0.9464 0.6715 0.5498
0.3964 1.0667 240 0.2516 0.7305 0.8162 0.9458 nan 0.9834 0.8069 0.6582 nan 0.9516 0.6757 0.5643
0.2407 1.1556 260 0.2199 0.7422 0.8589 0.9458 nan 0.9715 0.8708 0.7343 nan 0.9526 0.6906 0.5833
0.3972 1.2444 280 0.2210 0.7424 0.8679 0.9442 nan 0.9667 0.8737 0.7634 nan 0.9511 0.6975 0.5786
0.1568 1.3333 300 0.2047 0.7442 0.8495 0.9465 nan 0.9747 0.8460 0.7277 nan 0.9525 0.6979 0.5822
0.2474 1.4222 320 0.2032 0.7476 0.8592 0.9458 nan 0.9699 0.8170 0.7908 nan 0.9522 0.7013 0.5895
0.183 1.5111 340 0.2112 0.7454 0.8717 0.9423 nan 0.9618 0.8341 0.8193 nan 0.9480 0.7119 0.5762
0.2524 1.6 360 0.1827 0.7435 0.8450 0.9477 nan 0.9779 0.8526 0.7045 nan 0.9546 0.6965 0.5792
0.1665 1.6889 380 0.1746 0.7608 0.8626 0.9514 nan 0.9767 0.8368 0.7743 nan 0.9582 0.7052 0.6191
0.618 1.7778 400 0.1765 0.7565 0.8753 0.9486 nan 0.9696 0.8609 0.7955 nan 0.9556 0.7051 0.6088
0.3693 1.8667 420 0.1629 0.7653 0.8720 0.9505 nan 0.9727 0.8434 0.7999 nan 0.9563 0.7243 0.6152
0.2788 1.9556 440 0.1523 0.7681 0.8712 0.9526 nan 0.9757 0.8483 0.7897 nan 0.9591 0.7216 0.6235
0.09 2.0444 460 0.1746 0.7438 0.8712 0.9422 nan 0.9605 0.7828 0.8704 nan 0.9494 0.6970 0.5850
0.1173 2.1333 480 0.1417 0.7871 0.8677 0.9580 nan 0.9834 0.8333 0.7864 nan 0.9623 0.7330 0.6659
0.1552 2.2222 500 0.1581 0.7588 0.8610 0.9546 nan 0.9835 0.9186 0.6810 nan 0.9639 0.6937 0.6186
0.2021 2.3111 520 0.1400 0.7786 0.8658 0.9581 nan 0.9860 0.9008 0.7105 nan 0.9646 0.7254 0.6459
0.2156 2.4 540 0.1332 0.7945 0.8729 0.9596 nan 0.9849 0.8701 0.7638 nan 0.9638 0.7560 0.6636
0.3223 2.4889 560 0.1371 0.7754 0.8454 0.9579 nan 0.9919 0.8833 0.6609 nan 0.9636 0.7414 0.6213
0.1204 2.5778 580 0.1370 0.8055 0.9040 0.9582 nan 0.9733 0.8797 0.8589 nan 0.9602 0.7812 0.6750
0.1454 2.6667 600 0.1290 0.8106 0.8990 0.9602 nan 0.9775 0.8800 0.8395 nan 0.9621 0.7856 0.6842
0.2149 2.7556 620 0.1375 0.7664 0.8532 0.9559 nan 0.9869 0.8897 0.6830 nan 0.9639 0.7244 0.6110
0.0912 2.8444 640 0.1251 0.7898 0.8694 0.9594 nan 0.9854 0.8598 0.7631 nan 0.9648 0.7437 0.6609
0.0827 2.9333 660 0.1242 0.7959 0.8868 0.9584 nan 0.9789 0.8726 0.8088 nan 0.9625 0.7624 0.6628
0.1986 3.0222 680 0.1324 0.7794 0.8818 0.9568 nan 0.9800 0.9269 0.7387 nan 0.9633 0.7294 0.6456
0.0993 3.1111 700 0.1158 0.8141 0.8824 0.9643 nan 0.9875 0.8582 0.8014 nan 0.9677 0.7681 0.7065
0.1709 3.2 720 0.1128 0.8159 0.9045 0.9629 nan 0.9800 0.9043 0.8293 nan 0.9657 0.7750 0.7070
0.1923 3.2889 740 0.1125 0.8201 0.8913 0.9651 nan 0.9865 0.8874 0.7998 nan 0.9680 0.7802 0.7120
0.0559 3.3778 760 0.1138 0.8247 0.9117 0.9635 nan 0.9780 0.8874 0.8696 nan 0.9652 0.7963 0.7127
0.2231 3.4667 780 0.1084 0.8270 0.9147 0.9642 nan 0.9782 0.9006 0.8654 nan 0.9659 0.7969 0.7184
0.0587 3.5556 800 0.1042 0.8296 0.9132 0.9662 nan 0.9817 0.9159 0.8419 nan 0.9687 0.7893 0.7309
0.0754 3.6444 820 0.1063 0.8324 0.8984 0.9669 nan 0.9858 0.8566 0.8530 nan 0.9689 0.7947 0.7336
0.2001 3.7333 840 0.1188 0.8069 0.9188 0.9595 nan 0.9716 0.9253 0.8594 nan 0.9635 0.7752 0.6820
0.0732 3.8222 860 0.1036 0.8321 0.8903 0.9669 nan 0.9894 0.8944 0.7872 nan 0.9681 0.8110 0.7172
0.1668 3.9111 880 0.1343 0.7993 0.9242 0.9515 nan 0.9588 0.9001 0.9138 nan 0.9518 0.8042 0.6419
0.1307 4.0 900 0.1041 0.8276 0.8938 0.9658 nan 0.9870 0.8990 0.7954 nan 0.9676 0.8055 0.7099
0.0929 4.0889 920 0.1091 0.8127 0.8989 0.9643 nan 0.9843 0.9304 0.7819 nan 0.9690 0.7628 0.7062
0.0936 4.1778 940 0.0985 0.8428 0.9230 0.9680 nan 0.9808 0.9077 0.8804 nan 0.9695 0.8142 0.7448
0.0645 4.2667 960 0.1019 0.8312 0.9123 0.9665 nan 0.9824 0.9162 0.8382 nan 0.9691 0.7985 0.7259
0.1122 4.3556 980 0.1075 0.8140 0.9010 0.9643 nan 0.9836 0.9308 0.7886 nan 0.9687 0.7659 0.7073
0.3607 4.4444 1000 0.1186 0.7965 0.8587 0.9619 nan 0.9932 0.8998 0.6830 nan 0.9658 0.7641 0.6595
0.0704 4.5333 1020 0.0950 0.8454 0.9081 0.9694 nan 0.9873 0.9105 0.8265 nan 0.9700 0.8137 0.7524
0.0472 4.6222 1040 0.1128 0.8245 0.9303 0.9609 nan 0.9697 0.9247 0.8965 nan 0.9610 0.8096 0.7029
0.1245 4.7111 1060 0.0963 0.8363 0.8919 0.9684 nan 0.9911 0.9038 0.7808 nan 0.9700 0.8114 0.7275
0.0474 4.8 1080 0.0948 0.8379 0.9114 0.9686 nan 0.9861 0.9408 0.8074 nan 0.9711 0.8092 0.7335
0.0584 4.8889 1100 0.0835 0.8650 0.9217 0.9734 nan 0.9887 0.9271 0.8494 nan 0.9736 0.8367 0.7847
0.0584 4.9778 1120 0.0965 0.8343 0.9248 0.9669 nan 0.9796 0.9366 0.8583 nan 0.9693 0.7901 0.7436
0.3062 5.0667 1140 0.0948 0.8471 0.8990 0.9698 nan 0.9912 0.9240 0.7819 nan 0.9698 0.8270 0.7445
0.0905 5.1556 1160 0.0866 0.8609 0.9318 0.9719 nan 0.9839 0.9407 0.8709 nan 0.9721 0.8302 0.7804
0.0655 5.2444 1180 0.0855 0.8616 0.9268 0.9730 nan 0.9870 0.9435 0.8500 nan 0.9740 0.8257 0.7852
0.0897 5.3333 1200 0.0820 0.8718 0.9249 0.9741 nan 0.9886 0.9286 0.8576 nan 0.9731 0.8442 0.7981
0.1252 5.4222 1220 0.0859 0.8590 0.9322 0.9714 nan 0.9830 0.9375 0.8759 nan 0.9716 0.8288 0.7767
0.0787 5.5111 1240 0.0942 0.8528 0.9303 0.9687 nan 0.9796 0.9162 0.8952 nan 0.9687 0.8424 0.7472
0.1752 5.6 1260 0.0880 0.8530 0.9197 0.9710 nan 0.9861 0.9240 0.8490 nan 0.9718 0.8143 0.7728
0.12 5.6889 1280 0.0856 0.8568 0.9222 0.9705 nan 0.9837 0.8913 0.8914 nan 0.9700 0.8290 0.7715
0.0789 5.7778 1300 0.0836 0.8612 0.9284 0.9716 nan 0.9837 0.9101 0.8913 nan 0.9713 0.8310 0.7814
0.1557 5.8667 1320 0.0829 0.8664 0.9300 0.9722 nan 0.9841 0.9136 0.8925 nan 0.9708 0.8366 0.7918
0.1868 5.9556 1340 0.0839 0.8640 0.9222 0.9722 nan 0.9870 0.9259 0.8537 nan 0.9711 0.8349 0.7859
0.1028 6.0444 1360 0.0835 0.8647 0.9297 0.9725 nan 0.9851 0.9320 0.8720 nan 0.9719 0.8302 0.7919
0.1021 6.1333 1380 0.0860 0.8572 0.9267 0.9717 nan 0.9847 0.9233 0.8722 nan 0.9724 0.8194 0.7796
0.116 6.2222 1400 0.0883 0.8487 0.9263 0.9704 nan 0.9838 0.9442 0.8508 nan 0.9720 0.8029 0.7712
0.0726 6.3111 1420 0.0856 0.8669 0.9392 0.9718 nan 0.9809 0.9245 0.9122 nan 0.9705 0.8425 0.7878
0.0746 6.4 1440 0.0902 0.8406 0.9123 0.9699 nan 0.9873 0.9347 0.8149 nan 0.9726 0.7909 0.7584
0.0596 6.4889 1460 0.0823 0.8620 0.9262 0.9724 nan 0.9860 0.9285 0.8641 nan 0.9725 0.8303 0.7832
0.127 6.5778 1480 0.0830 0.8667 0.9384 0.9722 nan 0.9821 0.9397 0.8933 nan 0.9711 0.8394 0.7896
0.0511 6.6667 1500 0.0800 0.8746 0.9345 0.9744 nan 0.9862 0.9391 0.8783 nan 0.9733 0.8470 0.8037
0.1304 6.7556 1520 0.0829 0.8658 0.9339 0.9723 nan 0.9834 0.9275 0.8907 nan 0.9717 0.8382 0.7875
0.0768 6.8444 1540 0.0850 0.8595 0.9074 0.9724 nan 0.9915 0.9135 0.8173 nan 0.9722 0.8349 0.7714
0.2556 6.9333 1560 0.0792 0.8681 0.9243 0.9738 nan 0.9883 0.9252 0.8594 nan 0.9737 0.8408 0.7899
0.0446 7.0222 1580 0.0819 0.8497 0.9251 0.9712 nan 0.9853 0.9505 0.8394 nan 0.9732 0.7965 0.7795
0.0593 7.1111 1600 0.0791 0.8774 0.9290 0.9747 nan 0.9881 0.9297 0.8691 nan 0.9726 0.8486 0.8110
0.0997 7.2 1620 0.0786 0.8763 0.9426 0.9743 nan 0.9837 0.9473 0.8969 nan 0.9727 0.8386 0.8176
0.2284 7.2889 1640 0.0815 0.8654 0.9390 0.9724 nan 0.9819 0.9316 0.9034 nan 0.9719 0.8306 0.7937
0.0501 7.3778 1660 0.0833 0.8658 0.9416 0.9726 nan 0.9820 0.9518 0.8912 nan 0.9723 0.8280 0.7972
0.0553 7.4667 1680 0.0754 0.8819 0.9420 0.9755 nan 0.9849 0.9251 0.9160 nan 0.9741 0.8524 0.8191
0.1392 7.5556 1700 0.0739 0.8882 0.9416 0.9772 nan 0.9879 0.9538 0.8831 nan 0.9756 0.8611 0.8278
0.0422 7.6444 1720 0.0753 0.8764 0.9408 0.9749 nan 0.9850 0.9460 0.8915 nan 0.9739 0.8377 0.8176
0.0591 7.7333 1740 0.0766 0.8744 0.9346 0.9744 nan 0.9863 0.9436 0.8741 nan 0.9731 0.8402 0.8100
0.0515 7.8222 1760 0.0766 0.8725 0.9392 0.9738 nan 0.9836 0.9255 0.9084 nan 0.9730 0.8395 0.8051
0.025 7.9111 1780 0.0750 0.8770 0.9358 0.9751 nan 0.9861 0.9211 0.9003 nan 0.9742 0.8442 0.8126
0.2387 8.0 1800 0.0771 0.8714 0.9332 0.9737 nan 0.9850 0.9144 0.9003 nan 0.9731 0.8443 0.7969
0.1592 8.0889 1820 0.0772 0.8734 0.9327 0.9744 nan 0.9859 0.9113 0.9008 nan 0.9741 0.8491 0.7969
0.0835 8.1778 1840 0.0764 0.8705 0.9377 0.9742 nan 0.9850 0.9435 0.8846 nan 0.9745 0.8396 0.7972
0.0639 8.2667 1860 0.0743 0.8828 0.9303 0.9765 nan 0.9898 0.9249 0.8762 nan 0.9758 0.8624 0.8103
0.0841 8.3556 1880 0.0735 0.8812 0.9328 0.9762 nan 0.9883 0.9149 0.8952 nan 0.9754 0.8522 0.8159
0.0946 8.4444 1900 0.0784 0.8643 0.9344 0.9737 nan 0.9856 0.9495 0.8682 nan 0.9749 0.8236 0.7946
0.3774 8.5333 1920 0.0726 0.8799 0.9349 0.9761 nan 0.9875 0.9119 0.9054 nan 0.9757 0.8478 0.8162
0.0636 8.6222 1940 0.0745 0.8739 0.9375 0.9747 nan 0.9856 0.9378 0.8892 nan 0.9744 0.8392 0.8081
0.0278 8.7111 1960 0.0804 0.8627 0.9189 0.9736 nan 0.9898 0.9295 0.8373 nan 0.9744 0.8272 0.7864
0.147 8.8 1980 0.0780 0.8670 0.9356 0.9735 nan 0.9848 0.9443 0.8778 nan 0.9736 0.8280 0.7996
0.0254 8.8889 2000 0.0734 0.8792 0.9338 0.9756 nan 0.9876 0.9245 0.8894 nan 0.9747 0.8488 0.8140
0.0302 8.9778 2020 0.0750 0.8792 0.9436 0.9753 nan 0.9849 0.9549 0.8910 nan 0.9740 0.8433 0.8204
0.0245 9.0667 2040 0.0744 0.8834 0.9357 0.9764 nan 0.9886 0.9488 0.8698 nan 0.9750 0.8559 0.8193
0.1541 9.1556 2060 0.0733 0.8835 0.9418 0.9759 nan 0.9858 0.9406 0.8988 nan 0.9743 0.8581 0.8181
0.0589 9.2444 2080 0.0740 0.8764 0.9425 0.9745 nan 0.9835 0.9304 0.9137 nan 0.9733 0.8434 0.8123
0.0289 9.3333 2100 0.0750 0.8771 0.9321 0.9754 nan 0.9884 0.9408 0.8671 nan 0.9747 0.8456 0.8111
0.1783 9.4222 2120 0.0756 0.8711 0.9326 0.9746 nan 0.9872 0.9455 0.8652 nan 0.9746 0.8368 0.8018
0.0317 9.5111 2140 0.0733 0.8812 0.9345 0.9761 nan 0.9884 0.9375 0.8775 nan 0.9753 0.8561 0.8122
0.0838 9.6 2160 0.0720 0.8839 0.9408 0.9764 nan 0.9867 0.9380 0.8978 nan 0.9754 0.8576 0.8187
0.0384 9.6889 2180 0.0710 0.8858 0.9383 0.9770 nan 0.9879 0.9249 0.9021 nan 0.9763 0.8626 0.8186
0.0713 9.7778 2200 0.0705 0.8875 0.9338 0.9776 nan 0.9901 0.9261 0.8853 nan 0.9768 0.8640 0.8216
0.033 9.8667 2220 0.0695 0.8869 0.9405 0.9772 nan 0.9875 0.9279 0.9061 nan 0.9762 0.8586 0.8260
0.0204 9.9556 2240 0.0705 0.8894 0.9422 0.9775 nan 0.9878 0.9414 0.8972 nan 0.9762 0.8632 0.8288
0.3245 10.0444 2260 0.0690 0.8900 0.9355 0.9780 nan 0.9901 0.9254 0.8910 nan 0.9770 0.8653 0.8275
0.0184 10.1333 2280 0.0679 0.8922 0.9441 0.9781 nan 0.9878 0.9395 0.9049 nan 0.9767 0.8646 0.8353
0.0205 10.2222 2300 0.0710 0.8867 0.9450 0.9768 nan 0.9861 0.9444 0.9044 nan 0.9753 0.8526 0.8322
0.0526 10.3111 2320 0.0726 0.8875 0.9395 0.9770 nan 0.9881 0.9461 0.8841 nan 0.9753 0.8605 0.8266
0.0489 10.4 2340 0.0709 0.8892 0.9355 0.9774 nan 0.9896 0.9321 0.8847 nan 0.9758 0.8616 0.8303
0.0243 10.4889 2360 0.0693 0.8908 0.9393 0.9777 nan 0.9887 0.9326 0.8965 nan 0.9761 0.8640 0.8324
0.0283 10.5778 2380 0.0704 0.8927 0.9382 0.9781 nan 0.9898 0.9399 0.8849 nan 0.9764 0.8723 0.8294
0.0422 10.6667 2400 0.0698 0.8904 0.9416 0.9775 nan 0.9878 0.9351 0.9020 nan 0.9759 0.8636 0.8316
0.0571 10.7556 2420 0.0706 0.8883 0.9488 0.9768 nan 0.9848 0.9423 0.9194 nan 0.9753 0.8606 0.8290
0.0175 10.8444 2440 0.0687 0.8933 0.9401 0.9782 nan 0.9892 0.9322 0.8989 nan 0.9766 0.8678 0.8355
0.0684 10.9333 2460 0.0681 0.8930 0.9464 0.9780 nan 0.9873 0.9478 0.9040 nan 0.9764 0.8660 0.8365
0.0248 11.0222 2480 0.0693 0.8920 0.9473 0.9778 nan 0.9868 0.9527 0.9024 nan 0.9761 0.8638 0.8362
0.0499 11.1111 2500 0.0725 0.8840 0.9326 0.9769 nan 0.9902 0.9451 0.8625 nan 0.9760 0.8568 0.8193
0.072 11.2 2520 0.0706 0.8901 0.9404 0.9775 nan 0.9884 0.9432 0.8895 nan 0.9758 0.8625 0.8318
0.0304 11.2889 2540 0.0696 0.8899 0.9450 0.9773 nan 0.9866 0.9415 0.9069 nan 0.9755 0.8602 0.8340
0.0397 11.3778 2560 0.0695 0.8902 0.9425 0.9774 nan 0.9876 0.9400 0.8999 nan 0.9757 0.8606 0.8343
0.1503 11.4667 2580 0.0709 0.8892 0.9462 0.9771 nan 0.9863 0.9492 0.9031 nan 0.9753 0.8584 0.8339
0.0692 11.5556 2600 0.0708 0.8899 0.9454 0.9772 nan 0.9866 0.9456 0.9040 nan 0.9755 0.8624 0.8319
0.0771 11.6444 2620 0.0710 0.8864 0.9399 0.9771 nan 0.9881 0.9458 0.8857 nan 0.9759 0.8555 0.8277
0.0751 11.7333 2640 0.0692 0.8897 0.9395 0.9776 nan 0.9884 0.9280 0.9020 nan 0.9762 0.8608 0.8320
0.0356 11.8222 2660 0.0716 0.8855 0.9430 0.9767 nan 0.9867 0.9485 0.8939 nan 0.9755 0.8545 0.8266
0.0707 11.9111 2680 0.0705 0.8908 0.9405 0.9777 nan 0.9886 0.9422 0.8909 nan 0.9761 0.8669 0.8293
0.015 12.0 2700 0.0687 0.8915 0.9390 0.9780 nan 0.9891 0.9297 0.8981 nan 0.9766 0.8657 0.8321
0.0334 12.0889 2720 0.0691 0.8862 0.9417 0.9769 nan 0.9868 0.9258 0.9124 nan 0.9759 0.8552 0.8274
0.0161 12.1778 2740 0.0690 0.8897 0.9456 0.9774 nan 0.9866 0.9415 0.9089 nan 0.9759 0.8593 0.8340
0.0728 12.2667 2760 0.0686 0.8917 0.9406 0.9780 nan 0.9888 0.9357 0.8974 nan 0.9765 0.8646 0.8339
0.0517 12.3556 2780 0.0696 0.8918 0.9398 0.9780 nan 0.9892 0.9419 0.8884 nan 0.9764 0.8661 0.8328
0.0445 12.4444 2800 0.0689 0.8916 0.9444 0.9778 nan 0.9874 0.9403 0.9055 nan 0.9762 0.8641 0.8344
0.0376 12.5333 2820 0.0696 0.8907 0.9454 0.9775 nan 0.9869 0.9438 0.9055 nan 0.9759 0.8621 0.8341
0.0233 12.6222 2840 0.0688 0.8911 0.9436 0.9777 nan 0.9875 0.9380 0.9054 nan 0.9762 0.8632 0.8338
0.2039 12.7111 2860 0.0684 0.8919 0.9383 0.9781 nan 0.9895 0.9317 0.8937 nan 0.9766 0.8659 0.8332
0.1103 12.8 2880 0.0679 0.8921 0.9386 0.9781 nan 0.9894 0.9283 0.8980 nan 0.9767 0.8659 0.8336
0.0528 12.8889 2900 0.0686 0.8918 0.9409 0.9780 nan 0.9887 0.9380 0.8959 nan 0.9765 0.8652 0.8337
0.0323 12.9778 2920 0.0683 0.8920 0.9387 0.9781 nan 0.9894 0.9326 0.8941 nan 0.9766 0.8664 0.8330
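For reference, the per-class IoU columns above (with `nan` for the absent Unknown class) follow the standard intersection-over-union computation, and Mean IoU is the nan-ignoring mean over classes. A minimal NumPy sketch of that computation, not the exact evaluation code used during training:

```python
import numpy as np

def per_class_iou(pred: np.ndarray, label: np.ndarray, num_classes: int) -> np.ndarray:
    """Per-class IoU between integer masks. Classes absent from both pred and
    label get nan, matching the nan entries for the Unknown class above."""
    ious = np.full(num_classes, np.nan)
    for c in range(num_classes):
        inter = np.logical_and(pred == c, label == c).sum()
        union = np.logical_or(pred == c, label == c).sum()
        if union > 0:
            ious[c] = inter / union
    return ious

# Mean IoU is then the nan-mean over the per-class values:
# mean_iou = np.nanmean(per_class_iou(pred, label, num_classes=4))
```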

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0