mit-b0-03-04-25-15-21_necrosis

This model is a fine-tuned version of nvidia/mit-b0 for three-class semantic segmentation (background, necrosis, root); the training dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.1459
  • Mean Iou: 0.7491
  • Mean Accuracy: 0.8053
  • Overall Accuracy: 0.9465
  • Accuracy Background: 0.9950
  • Accuracy Necrosis: 0.5999
  • Accuracy Root: 0.8210
  • Iou Background: 0.9514
  • Iou Necrosis: 0.5271
  • Iou Root: 0.7687
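
As a sanity check, the mean metrics above are consistent with unweighted averages of the three per-class values. A minimal sketch (values copied from the list above):

```python
# Mean IoU and Mean Accuracy are unweighted averages over the three classes
# (values copied from the evaluation results above).
iou = {"background": 0.9514, "necrosis": 0.5271, "root": 0.7687}
acc = {"background": 0.9950, "necrosis": 0.5999, "root": 0.8210}

mean_iou = sum(iou.values()) / len(iou)  # ~0.7491
mean_acc = sum(acc.values()) / len(acc)  # ~0.8053
print(f"Mean IoU: {mean_iou:.4f}  Mean Accuracy: {mean_acc:.4f}")
```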

Model description

More information needed

Intended uses & limitations

More information needed
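
Since no usage documentation is provided, here is a minimal inference sketch. It assumes the checkpoint id from the model tree below, and that the class names/order match the metrics above; verify against model.config.id2label. The input filename is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

ckpt = "mujerry/mit-b0-03-04-25-15-21_necrosis"  # hub id from the model tree below
processor = AutoImageProcessor.from_pretrained(ckpt)
model = SegformerForSemanticSegmentation.from_pretrained(ckpt)
model.eval()

image = Image.open("example.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # per-pixel class ids (check model.config.id2label)
```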

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 50
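
For reference, these settings map onto transformers TrainingArguments roughly as sketched below, assuming the reported batch sizes are per device; output_dir is a placeholder, not from the card.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="mit-b0-necrosis",  # hypothetical placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=50,
)
```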

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Necrosis | Accuracy Root | Iou Background | Iou Necrosis | Iou Root |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-----------------:|:-------------:|:--------------:|:------------:|:--------:|
| 1.0286 | 0.625 | 20 | 1.0682 | 0.2971 | 0.5482 | 0.4605 | 0.3950 | 0.5573 | 0.6921 | 0.3931 | 0.0404 | 0.4577 |
| 0.8087 | 1.25 | 40 | 0.9066 | 0.4491 | 0.6758 | 0.6734 | 0.6231 | 0.5144 | 0.8901 | 0.6221 | 0.0597 | 0.6654 |
| 0.5207 | 1.875 | 60 | 0.7452 | 0.5011 | 0.7573 | 0.7185 | 0.6773 | 0.7209 | 0.8737 | 0.6762 | 0.0877 | 0.7393 |
| 0.4503 | 2.5 | 80 | 0.5908 | 0.5902 | 0.8478 | 0.8390 | 0.8422 | 0.8815 | 0.8199 | 0.8346 | 0.1719 | 0.7639 |
| 0.3617 | 3.125 | 100 | 0.7355 | 0.4588 | 0.7862 | 0.7203 | 0.6706 | 0.7915 | 0.8965 | 0.6659 | 0.1379 | 0.5726 |
| 0.2934 | 3.75 | 120 | 0.3994 | 0.6875 | 0.8473 | 0.9220 | 0.9489 | 0.7428 | 0.8502 | 0.9244 | 0.3873 | 0.7507 |
| 0.3384 | 4.375 | 140 | 0.3423 | 0.6961 | 0.8540 | 0.9276 | 0.9587 | 0.7662 | 0.8371 | 0.9332 | 0.3891 | 0.7659 |
| 0.3014 | 5.0 | 160 | 0.3397 | 0.6933 | 0.8418 | 0.9209 | 0.9532 | 0.7439 | 0.8282 | 0.9239 | 0.4393 | 0.7167 |
| 0.3316 | 5.625 | 180 | 0.3360 | 0.6717 | 0.8395 | 0.9088 | 0.9339 | 0.7433 | 0.8414 | 0.9076 | 0.4098 | 0.6979 |
| 0.2285 | 6.25 | 200 | 0.2552 | 0.7329 | 0.8140 | 0.9439 | 0.9727 | 0.5721 | 0.8973 | 0.9498 | 0.4681 | 0.7810 |
| 0.1486 | 6.875 | 220 | 0.2143 | 0.7410 | 0.8183 | 0.9470 | 0.9837 | 0.6061 | 0.8651 | 0.9526 | 0.4807 | 0.7898 |
| 0.2264 | 7.5 | 240 | 0.2154 | 0.7168 | 0.7719 | 0.9352 | 0.9956 | 0.5491 | 0.7710 | 0.9366 | 0.4871 | 0.7265 |
| 0.172 | 8.125 | 260 | 0.1926 | 0.7330 | 0.7920 | 0.9444 | 0.9906 | 0.5496 | 0.8358 | 0.9491 | 0.4799 | 0.7700 |
| 0.1841 | 8.75 | 280 | 0.2086 | 0.7252 | 0.8199 | 0.9352 | 0.9926 | 0.7112 | 0.7559 | 0.9405 | 0.5125 | 0.7225 |
| 0.1422 | 9.375 | 300 | 0.1668 | 0.7427 | 0.8002 | 0.9473 | 0.9922 | 0.5676 | 0.8409 | 0.9528 | 0.4972 | 0.7781 |
| 0.1747 | 10.0 | 320 | 0.1877 | 0.7257 | 0.8219 | 0.9401 | 0.9943 | 0.6956 | 0.7756 | 0.9496 | 0.4864 | 0.7409 |
| 0.1492 | 10.625 | 340 | 0.2011 | 0.7252 | 0.8346 | 0.9387 | 0.9887 | 0.7309 | 0.7842 | 0.9485 | 0.4869 | 0.7402 |
| 0.1835 | 11.25 | 360 | 0.1758 | 0.7429 | 0.8284 | 0.9438 | 0.9953 | 0.7006 | 0.7893 | 0.9512 | 0.5228 | 0.7547 |
| 0.2357 | 11.875 | 380 | 0.1770 | 0.7439 | 0.8264 | 0.9441 | 0.9941 | 0.6874 | 0.7978 | 0.9508 | 0.5223 | 0.7587 |
| 0.1232 | 12.5 | 400 | 0.1619 | 0.7534 | 0.8204 | 0.9504 | 0.9918 | 0.6205 | 0.8490 | 0.9575 | 0.5127 | 0.7901 |
| 0.1584 | 13.125 | 420 | 0.1911 | 0.7264 | 0.8091 | 0.9366 | 0.9947 | 0.6714 | 0.7613 | 0.9416 | 0.5111 | 0.7265 |
| 0.2099 | 13.75 | 440 | 0.1593 | 0.7495 | 0.8354 | 0.9485 | 0.9868 | 0.6695 | 0.8500 | 0.9563 | 0.5053 | 0.7869 |
| 0.2723 | 14.375 | 460 | 0.1668 | 0.7353 | 0.8029 | 0.9426 | 0.9906 | 0.5995 | 0.8185 | 0.9477 | 0.4999 | 0.7583 |
| 0.1406 | 15.0 | 480 | 0.1714 | 0.7404 | 0.8172 | 0.9426 | 0.9918 | 0.6555 | 0.8043 | 0.9477 | 0.5170 | 0.7564 |
| 0.1624 | 15.625 | 500 | 0.1527 | 0.7445 | 0.8007 | 0.9511 | 0.9913 | 0.5435 | 0.8674 | 0.9591 | 0.4811 | 0.7934 |
| 0.1519 | 16.25 | 520 | 0.1503 | 0.7459 | 0.8044 | 0.9495 | 0.9936 | 0.5743 | 0.8455 | 0.9568 | 0.4973 | 0.7837 |
| 0.0996 | 16.875 | 540 | 0.1437 | 0.7551 | 0.8216 | 0.9524 | 0.9898 | 0.6065 | 0.8685 | 0.9607 | 0.5066 | 0.7981 |
| 0.3985 | 17.5 | 560 | 0.1520 | 0.7355 | 0.7891 | 0.9472 | 0.9931 | 0.5310 | 0.8431 | 0.9535 | 0.4759 | 0.7771 |
| 0.0979 | 18.125 | 580 | 0.1586 | 0.7447 | 0.8072 | 0.9453 | 0.9940 | 0.6105 | 0.8171 | 0.9502 | 0.5178 | 0.7661 |
| 0.1027 | 18.75 | 600 | 0.1482 | 0.7551 | 0.8261 | 0.9482 | 0.9937 | 0.6605 | 0.8242 | 0.9544 | 0.5341 | 0.7768 |
| 0.1372 | 19.375 | 620 | 0.1504 | 0.7487 | 0.8107 | 0.9463 | 0.9947 | 0.6195 | 0.8180 | 0.9512 | 0.5256 | 0.7693 |
| 0.1219 | 20.0 | 640 | 0.1473 | 0.7558 | 0.8297 | 0.9488 | 0.9944 | 0.6723 | 0.8223 | 0.9556 | 0.5331 | 0.7786 |
| 0.1045 | 20.625 | 660 | 0.1827 | 0.7228 | 0.8050 | 0.9333 | 0.9962 | 0.6811 | 0.7375 | 0.9373 | 0.5231 | 0.7080 |
| 0.1034 | 21.25 | 680 | 0.1534 | 0.7489 | 0.8431 | 0.9451 | 0.9939 | 0.7407 | 0.7946 | 0.9537 | 0.5347 | 0.7583 |
| 0.2095 | 21.875 | 700 | 0.1469 | 0.7401 | 0.7944 | 0.9477 | 0.9935 | 0.5486 | 0.8411 | 0.9543 | 0.4900 | 0.7760 |
| 0.2314 | 22.5 | 720 | 0.1474 | 0.7529 | 0.8222 | 0.9493 | 0.9933 | 0.6387 | 0.8345 | 0.9566 | 0.5206 | 0.7814 |
| 0.1133 | 23.125 | 740 | 0.1645 | 0.7352 | 0.8301 | 0.9404 | 0.9956 | 0.7276 | 0.7671 | 0.9486 | 0.5204 | 0.7366 |
| 0.0817 | 23.75 | 760 | 0.1341 | 0.7563 | 0.8089 | 0.9537 | 0.9936 | 0.5653 | 0.8678 | 0.9612 | 0.5056 | 0.8020 |
| 0.1422 | 24.375 | 780 | 0.1359 | 0.7622 | 0.8287 | 0.9521 | 0.9919 | 0.6404 | 0.8537 | 0.9595 | 0.5333 | 0.7937 |
| 0.0963 | 25.0 | 800 | 0.1450 | 0.7558 | 0.8365 | 0.9487 | 0.9945 | 0.6977 | 0.8172 | 0.9568 | 0.5359 | 0.7748 |
| 0.085 | 25.625 | 820 | 0.1543 | 0.7370 | 0.7929 | 0.9441 | 0.9952 | 0.5704 | 0.8132 | 0.9491 | 0.5024 | 0.7596 |
| 0.2654 | 26.25 | 840 | 0.1599 | 0.7338 | 0.7903 | 0.9424 | 0.9957 | 0.5727 | 0.8027 | 0.9472 | 0.5030 | 0.7511 |
| 0.1506 | 26.875 | 860 | 0.1503 | 0.7478 | 0.8091 | 0.9466 | 0.9940 | 0.6096 | 0.8236 | 0.9526 | 0.5227 | 0.7681 |
| 0.114 | 27.5 | 880 | 0.1440 | 0.7463 | 0.8029 | 0.9488 | 0.9938 | 0.5738 | 0.8412 | 0.9556 | 0.5037 | 0.7795 |
| 0.0748 | 28.125 | 900 | 0.1611 | 0.7362 | 0.7949 | 0.9416 | 0.9956 | 0.5935 | 0.7957 | 0.9461 | 0.5162 | 0.7463 |
| 0.1368 | 28.75 | 920 | 0.1457 | 0.7496 | 0.8074 | 0.9477 | 0.9948 | 0.5993 | 0.8281 | 0.9535 | 0.5214 | 0.7737 |
| 0.0836 | 29.375 | 940 | 0.1622 | 0.7170 | 0.7663 | 0.9408 | 0.9951 | 0.4931 | 0.8106 | 0.9464 | 0.4579 | 0.7469 |
| 0.215 | 30.0 | 960 | 0.1340 | 0.7450 | 0.7959 | 0.9511 | 0.9945 | 0.5366 | 0.8566 | 0.9588 | 0.4857 | 0.7905 |
| 0.1397 | 30.625 | 980 | 0.1370 | 0.7417 | 0.7931 | 0.9495 | 0.9945 | 0.5367 | 0.8480 | 0.9568 | 0.4853 | 0.7830 |
| 0.1468 | 31.25 | 1000 | 0.1373 | 0.7591 | 0.8221 | 0.9507 | 0.9944 | 0.6332 | 0.8387 | 0.9574 | 0.5337 | 0.7863 |
| 0.0733 | 31.875 | 1020 | 0.1380 | 0.7439 | 0.7946 | 0.9500 | 0.9943 | 0.5380 | 0.8515 | 0.9573 | 0.4894 | 0.7851 |
| 0.1454 | 32.5 | 1040 | 0.1414 | 0.7522 | 0.8106 | 0.9487 | 0.9947 | 0.6048 | 0.8324 | 0.9548 | 0.5235 | 0.7781 |
| 0.1203 | 33.125 | 1060 | 0.1459 | 0.7498 | 0.8197 | 0.9467 | 0.9952 | 0.6512 | 0.8126 | 0.9531 | 0.5282 | 0.7681 |
| 0.2697 | 33.75 | 1080 | 0.1381 | 0.7541 | 0.8128 | 0.9494 | 0.9944 | 0.6080 | 0.8359 | 0.9554 | 0.5256 | 0.7815 |
| 0.0884 | 34.375 | 1100 | 0.1629 | 0.7276 | 0.7795 | 0.9403 | 0.9961 | 0.5474 | 0.7950 | 0.9443 | 0.4961 | 0.7426 |
| 0.1911 | 35.0 | 1120 | 0.1395 | 0.7585 | 0.8293 | 0.9501 | 0.9935 | 0.6603 | 0.8341 | 0.9574 | 0.5354 | 0.7827 |
| 0.129 | 35.625 | 1140 | 0.1709 | 0.7278 | 0.7804 | 0.9399 | 0.9955 | 0.5512 | 0.7946 | 0.9440 | 0.4991 | 0.7402 |
| 0.0965 | 36.25 | 1160 | 0.1409 | 0.7472 | 0.7995 | 0.9490 | 0.9948 | 0.5633 | 0.8403 | 0.9552 | 0.5057 | 0.7806 |
| 0.0956 | 36.875 | 1180 | 0.1403 | 0.7389 | 0.7885 | 0.9489 | 0.9942 | 0.5223 | 0.8491 | 0.9559 | 0.4793 | 0.7817 |
| 0.1023 | 37.5 | 1200 | 0.1512 | 0.7438 | 0.8022 | 0.9447 | 0.9952 | 0.6001 | 0.8113 | 0.9495 | 0.5208 | 0.7610 |
| 0.1341 | 38.125 | 1220 | 0.1527 | 0.7422 | 0.7961 | 0.9450 | 0.9953 | 0.5771 | 0.8159 | 0.9496 | 0.5143 | 0.7627 |
| 0.0669 | 38.75 | 1240 | 0.1340 | 0.7459 | 0.7950 | 0.9516 | 0.9938 | 0.5282 | 0.8630 | 0.9591 | 0.4851 | 0.7933 |
| 0.2038 | 39.375 | 1260 | 0.1347 | 0.7572 | 0.8120 | 0.9509 | 0.9946 | 0.5967 | 0.8446 | 0.9568 | 0.5269 | 0.7879 |
| 0.1287 | 40.0 | 1280 | 0.1554 | 0.7415 | 0.7959 | 0.9433 | 0.9957 | 0.5872 | 0.8046 | 0.9471 | 0.5224 | 0.7550 |
| 0.1505 | 40.625 | 1300 | 0.1353 | 0.7470 | 0.7972 | 0.9499 | 0.9946 | 0.5491 | 0.8477 | 0.9563 | 0.4997 | 0.7851 |
| 0.0827 | 41.25 | 1320 | 0.1408 | 0.7522 | 0.8089 | 0.9481 | 0.9947 | 0.6021 | 0.8300 | 0.9535 | 0.5274 | 0.7757 |
| 0.1537 | 41.875 | 1340 | 0.1469 | 0.7468 | 0.8007 | 0.9458 | 0.9950 | 0.5874 | 0.8196 | 0.9504 | 0.5244 | 0.7656 |
| 0.1328 | 42.5 | 1360 | 0.1415 | 0.7490 | 0.8030 | 0.9477 | 0.9948 | 0.5835 | 0.8306 | 0.9531 | 0.5199 | 0.7741 |
| 0.0971 | 43.125 | 1380 | 0.1330 | 0.7578 | 0.8133 | 0.9515 | 0.9943 | 0.5966 | 0.8489 | 0.9580 | 0.5252 | 0.7903 |
| 0.1021 | 43.75 | 1400 | 0.1332 | 0.7520 | 0.8033 | 0.9512 | 0.9945 | 0.5634 | 0.8521 | 0.9579 | 0.5086 | 0.7894 |
| 0.0707 | 44.375 | 1420 | 0.1404 | 0.7496 | 0.8029 | 0.9481 | 0.9952 | 0.5825 | 0.8311 | 0.9536 | 0.5201 | 0.7753 |
| 0.0767 | 45.0 | 1440 | 0.1388 | 0.7520 | 0.8066 | 0.9486 | 0.9949 | 0.5916 | 0.8332 | 0.9543 | 0.5243 | 0.7775 |
| 0.0747 | 45.625 | 1460 | 0.1351 | 0.7594 | 0.8200 | 0.9504 | 0.9944 | 0.6274 | 0.8383 | 0.9567 | 0.5369 | 0.7846 |
| 0.2155 | 46.25 | 1480 | 0.1413 | 0.7549 | 0.8147 | 0.9478 | 0.9949 | 0.6254 | 0.8237 | 0.9531 | 0.5384 | 0.7732 |
| 0.0757 | 46.875 | 1500 | 0.1379 | 0.7560 | 0.8147 | 0.9495 | 0.9944 | 0.6137 | 0.8359 | 0.9555 | 0.5314 | 0.7810 |
| 0.1457 | 47.5 | 1520 | 0.1528 | 0.7459 | 0.8057 | 0.9441 | 0.9955 | 0.6174 | 0.8042 | 0.9484 | 0.5323 | 0.7570 |
| 0.0952 | 48.125 | 1540 | 0.1542 | 0.7467 | 0.8072 | 0.9438 | 0.9955 | 0.6246 | 0.8015 | 0.9479 | 0.5368 | 0.7556 |
| 0.1606 | 48.75 | 1560 | 0.1465 | 0.7526 | 0.8136 | 0.9464 | 0.9950 | 0.6303 | 0.8154 | 0.9512 | 0.5393 | 0.7672 |
| 0.1153 | 49.375 | 1580 | 0.1411 | 0.7511 | 0.8063 | 0.9483 | 0.9946 | 0.5916 | 0.8328 | 0.9539 | 0.5225 | 0.7768 |
| 0.065 | 50.0 | 1600 | 0.1459 | 0.7491 | 0.8053 | 0.9465 | 0.9950 | 0.5999 | 0.8210 | 0.9514 | 0.5271 | 0.7687 |

Framework versions

  • Transformers 4.49.0
  • Pytorch 2.6.0+cu124
  • Datasets 3.3.2
  • Tokenizers 0.21.0

Model size

  • 3.72M parameters (Safetensors, F32)

Model tree for mujerry/mit-b0-03-04-25-15-21_necrosis

  • Base model: nvidia/mit-b0