# segformer-finetuned-tt-1000-2k
This model is a fine-tuned version of nvidia/mit-b0 on the Saumya-Mundra/text255 dataset. It achieves the following results on the evaluation set:
- Loss: 0.1042
- Mean Iou: 0.4902
- Mean Accuracy: 0.9804
- Overall Accuracy: 0.9804
- Accuracy Text: nan
- Accuracy No Text: 0.9804
- Iou Text: 0.0
- Iou No Text: 0.9804
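The Mean Iou of 0.4902 follows directly from the per-class IoUs above: Iou Text is 0.0 (the model never predicts the text class on the evaluation set, which is also why Accuracy Text is nan), and the unweighted mean of 0.9804 and 0.0 is 0.4902. A minimal sketch of that arithmetic:

```python
# Per-class IoUs as reported on the evaluation set above.
iou_per_class = {"text": 0.0, "no text": 0.9804}

# Mean IoU is the unweighted average over classes; a class that is never
# predicted (and therefore never correct) contributes 0 and halves the score.
mean_iou = sum(iou_per_class.values()) / len(iou_per_class)
print(round(mean_iou, 4))  # 0.4902
```

In other words, the high Mean/Overall Accuracy here reflects only the dominant "no text" class, not useful text segmentation.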
## Model description
More information needed
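For reference, a minimal inference sketch using the 🤗 Transformers API (assuming the checkpoint is public on the Hub and that `transformers`, `torch`, and `Pillow` are installed; this is not the author's script):

```python
import torch
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

CHECKPOINT = "Saumya-Mundra/segformer-finetuned-tt-1000-2k"


def logits_to_mask(logits: torch.Tensor, size: tuple) -> torch.Tensor:
    """Upsample (N, C, h, w) logits to `size` and take the per-pixel argmax."""
    upsampled = torch.nn.functional.interpolate(
        logits, size=size, mode="bilinear", align_corners=False
    )
    return upsampled.argmax(dim=1)


def segment(image_path: str) -> torch.Tensor:
    """Run the fine-tuned SegFormer on one image; downloads the checkpoint."""
    from PIL import Image

    image = Image.open(image_path).convert("RGB")
    processor = AutoImageProcessor.from_pretrained(CHECKPOINT)
    model = SegformerForSemanticSegmentation.from_pretrained(CHECKPOINT)
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)
    return logits_to_mask(logits, size=(image.height, image.width))[0]
```

SegFormer emits logits at 1/4 of the input resolution, so the bilinear upsampling step before the argmax is needed to get a full-resolution label map.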
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-07
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 20000
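The hyperparameters above map onto the Trainer API roughly as follows (a hedged sketch, not the author's actual training script; `output_dir` and every unlisted setting are assumptions left at their defaults):

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameters listed above; `output_dir` and all
# unlisted settings are assumptions, not taken from the original run.
training_args = TrainingArguments(
    output_dir="segformer-finetuned-tt-1000-2k",  # assumed
    learning_rate=6e-7,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    optim="adamw_torch",            # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="polynomial",
    max_steps=20_000,
)
```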
### Training results
Training Loss | Epoch | Step | Accuracy No Text | Accuracy Text | Iou No Text | Iou Text | Validation Loss | Mean Accuracy | Mean Iou | Overall Accuracy |
---|---|---|---|---|---|---|---|---|---|---|
0.3719 | 1.0 | 125 | 0.9684 | nan | 0.9684 | 0.0 | 0.1986 | 0.9684 | 0.4842 | 0.9684 |
0.2348 | 2.0 | 250 | 0.9864 | nan | 0.9864 | 0.0 | 0.1336 | 0.9864 | 0.4932 | 0.9864 |
0.183 | 3.0 | 375 | 0.9747 | nan | 0.9747 | 0.0 | 0.1268 | 0.9747 | 0.4874 | 0.9747 |
0.1485 | 4.0 | 500 | 0.9802 | nan | 0.9802 | 0.0 | 0.1114 | 0.9802 | 0.4901 | 0.9802 |
0.1429 | 5.0 | 625 | 0.9757 | nan | 0.9757 | 0.0 | 0.1122 | 0.9757 | 0.4878 | 0.9757 |
0.1367 | 6.0 | 750 | 0.9834 | nan | 0.9834 | 0.0 | 0.1075 | 0.9834 | 0.4917 | 0.9834 |
0.1333 | 7.0 | 875 | 0.9793 | nan | 0.9793 | 0.0 | 0.1048 | 0.9793 | 0.4897 | 0.9793 |
0.1199 | 8.0 | 1000 | 0.9776 | nan | 0.9776 | 0.0 | 0.1009 | 0.9776 | 0.4888 | 0.9776 |
0.1201 | 9.0 | 1125 | 0.9806 | nan | 0.9806 | 0.0 | 0.1000 | 0.9806 | 0.4903 | 0.9806 |
0.1111 | 10.0 | 1250 | 0.9807 | nan | 0.9807 | 0.0 | 0.0998 | 0.9807 | 0.4904 | 0.9807 |
0.1128 | 11.0 | 1375 | 0.9792 | nan | 0.9792 | 0.0 | 0.0984 | 0.9792 | 0.4896 | 0.9792 |
0.1055 | 12.0 | 1500 | 0.9835 | nan | 0.9835 | 0.0 | 0.0941 | 0.9835 | 0.4918 | 0.9835 |
0.0988 | 13.0 | 1625 | 0.9815 | nan | 0.9815 | 0.0 | 0.0972 | 0.9815 | 0.4907 | 0.9815 |
0.0983 | 14.0 | 1750 | 0.9843 | nan | 0.9843 | 0.0 | 0.0947 | 0.9843 | 0.4921 | 0.9843 |
0.1045 | 15.0 | 1875 | 0.9794 | nan | 0.9794 | 0.0 | 0.0960 | 0.9794 | 0.4897 | 0.9794 |
0.1002 | 16.0 | 2000 | 0.9790 | nan | 0.9790 | 0.0 | 0.0976 | 0.9790 | 0.4895 | 0.9790 |
0.1072 | 17.0 | 2125 | 0.9776 | nan | 0.9776 | 0.0 | 0.1006 | 0.9776 | 0.4888 | 0.9776 |
0.1046 | 18.0 | 2250 | 0.9800 | nan | 0.9800 | 0.0 | 0.0938 | 0.9800 | 0.4900 | 0.9800 |
0.1072 | 19.0 | 2375 | 0.9800 | nan | 0.9800 | 0.0 | 0.0962 | 0.9800 | 0.4900 | 0.9800 |
0.1127 | 20.0 | 2500 | 0.9840 | nan | 0.9840 | 0.0 | 0.0918 | 0.9840 | 0.4920 | 0.9840 |
0.1017 | 21.0 | 2625 | 0.9782 | nan | 0.9782 | 0.0 | 0.0940 | 0.9782 | 0.4891 | 0.9782 |
0.0961 | 22.0 | 2750 | 0.9784 | nan | 0.9784 | 0.0 | 0.0964 | 0.9784 | 0.4892 | 0.9784 |
0.0951 | 23.0 | 2875 | 0.9821 | nan | 0.9821 | 0.0 | 0.0940 | 0.9821 | 0.4910 | 0.9821 |
0.0938 | 24.0 | 3000 | 0.9836 | nan | 0.9836 | 0.0 | 0.1005 | 0.9836 | 0.4918 | 0.9836 |
0.0949 | 25.0 | 3125 | 0.9803 | nan | 0.9803 | 0.0 | 0.1003 | 0.9803 | 0.4901 | 0.9803 |
0.0949 | 26.0 | 3250 | 0.9815 | nan | 0.9815 | 0.0 | 0.1015 | 0.9815 | 0.4908 | 0.9815 |
0.0949 | 27.0 | 3375 | 0.9780 | nan | 0.9780 | 0.0 | 0.0970 | 0.9780 | 0.4890 | 0.9780 |
0.0883 | 28.0 | 3500 | 0.9779 | nan | 0.9779 | 0.0 | 0.0967 | 0.9779 | 0.4890 | 0.9779 |
0.0846 | 29.0 | 3625 | 0.9849 | nan | 0.9849 | 0.0 | 0.0973 | 0.9849 | 0.4924 | 0.9849 |
0.0842 | 30.0 | 3750 | 0.9820 | nan | 0.9820 | 0.0 | 0.0946 | 0.9820 | 0.4910 | 0.9820 |
0.0814 | 31.0 | 3875 | 0.9819 | nan | 0.9819 | 0.0 | 0.0936 | 0.9819 | 0.4909 | 0.9819 |
0.0813 | 32.0 | 4000 | 0.9813 | nan | 0.9813 | 0.0 | 0.0938 | 0.9813 | 0.4906 | 0.9813 |
0.0817 | 33.0 | 4125 | 0.9812 | nan | 0.9812 | 0.0 | 0.0946 | 0.9812 | 0.4906 | 0.9812 |
0.0836 | 34.0 | 4250 | 0.9775 | nan | 0.9775 | 0.0 | 0.0940 | 0.9775 | 0.4888 | 0.9775 |
0.0836 | 35.0 | 4375 | 0.9811 | nan | 0.9811 | 0.0 | 0.0915 | 0.9811 | 0.4906 | 0.9811 |
0.0785 | 36.0 | 4500 | 0.9816 | nan | 0.9816 | 0.0 | 0.0951 | 0.9816 | 0.4908 | 0.9816 |
0.0746 | 37.0 | 4625 | 0.9757 | nan | 0.9757 | 0.0 | 0.0951 | 0.9757 | 0.4879 | 0.9757 |
0.0819 | 38.0 | 4750 | 0.9800 | nan | 0.9800 | 0.0 | 0.0952 | 0.9800 | 0.4900 | 0.9800 |
0.0731 | 39.0 | 4875 | 0.9797 | nan | 0.9797 | 0.0 | 0.0922 | 0.9797 | 0.4899 | 0.9797 |
0.0745 | 40.0 | 5000 | 0.9798 | nan | 0.9798 | 0.0 | 0.0939 | 0.9798 | 0.4899 | 0.9798 |
0.0755 | 41.0 | 5125 | 0.9802 | nan | 0.9802 | 0.0 | 0.0946 | 0.9802 | 0.4901 | 0.9802 |
0.0692 | 42.0 | 5250 | 0.9757 | nan | 0.9757 | 0.0 | 0.0976 | 0.9757 | 0.4879 | 0.9757 |
0.0798 | 43.0 | 5375 | 0.9804 | nan | 0.9804 | 0.0 | 0.0988 | 0.9804 | 0.4902 | 0.9804 |
0.076 | 44.0 | 5500 | 0.9798 | nan | 0.9798 | 0.0 | 0.0965 | 0.9798 | 0.4899 | 0.9798 |
0.0757 | 45.0 | 5625 | 0.9823 | nan | 0.9823 | 0.0 | 0.0914 | 0.9823 | 0.4912 | 0.9823 |
0.0702 | 46.0 | 5750 | 0.9781 | nan | 0.9781 | 0.0 | 0.0935 | 0.9781 | 0.4890 | 0.9781 |
0.0765 | 47.0 | 5875 | 0.9809 | nan | 0.9809 | 0.0 | 0.0966 | 0.9809 | 0.4905 | 0.9809 |
0.0724 | 48.0 | 6000 | 0.9833 | nan | 0.9833 | 0.0 | 0.0937 | 0.9833 | 0.4916 | 0.9833 |
0.0713 | 49.0 | 6125 | 0.9762 | nan | 0.9762 | 0.0 | 0.1017 | 0.9762 | 0.4881 | 0.9762 |
0.0677 | 50.0 | 6250 | 0.9804 | nan | 0.9804 | 0.0 | 0.0932 | 0.9804 | 0.4902 | 0.9804 |
0.0715 | 51.0 | 6375 | 0.9781 | nan | 0.9781 | 0.0 | 0.0975 | 0.9781 | 0.4891 | 0.9781 |
0.0713 | 52.0 | 6500 | 0.9833 | nan | 0.9833 | 0.0 | 0.0945 | 0.9833 | 0.4917 | 0.9833 |
0.0695 | 53.0 | 6625 | 0.9819 | nan | 0.9819 | 0.0 | 0.0951 | 0.9819 | 0.4910 | 0.9819 |
0.0648 | 54.0 | 6750 | 0.9825 | nan | 0.9825 | 0.0 | 0.0965 | 0.9825 | 0.4912 | 0.9825 |
0.0694 | 55.0 | 6875 | 0.9809 | nan | 0.9809 | 0.0 | 0.0946 | 0.9809 | 0.4905 | 0.9809 |
0.0665 | 56.0 | 7000 | 0.9824 | nan | 0.9824 | 0.0 | 0.1007 | 0.9824 | 0.4912 | 0.9824 |
0.0635 | 57.0 | 7125 | 0.9831 | nan | 0.9831 | 0.0 | 0.0971 | 0.9831 | 0.4916 | 0.9831 |
0.0628 | 58.0 | 7250 | 0.9785 | nan | 0.9785 | 0.0 | 0.1002 | 0.9785 | 0.4893 | 0.9785 |
0.0668 | 59.0 | 7375 | 0.9813 | nan | 0.9813 | 0.0 | 0.0960 | 0.9813 | 0.4906 | 0.9813 |
0.0648 | 60.0 | 7500 | 0.9796 | nan | 0.9796 | 0.0 | 0.0939 | 0.9796 | 0.4898 | 0.9796 |
0.064 | 61.0 | 7625 | 0.9786 | nan | 0.9786 | 0.0 | 0.0947 | 0.9786 | 0.4893 | 0.9786 |
0.0636 | 62.0 | 7750 | 0.9788 | nan | 0.9788 | 0.0 | 0.0985 | 0.9788 | 0.4894 | 0.9788 |
0.0653 | 63.0 | 7875 | 0.9812 | nan | 0.9812 | 0.0 | 0.0914 | 0.9812 | 0.4906 | 0.9812 |
0.0594 | 64.0 | 8000 | 0.9782 | nan | 0.9782 | 0.0 | 0.0966 | 0.9782 | 0.4891 | 0.9782 |
0.0608 | 65.0 | 8125 | 0.9794 | nan | 0.9794 | 0.0 | 0.0961 | 0.9794 | 0.4897 | 0.9794 |
0.0625 | 66.0 | 8250 | 0.9814 | nan | 0.9814 | 0.0 | 0.0954 | 0.9814 | 0.4907 | 0.9814 |
0.0646 | 67.0 | 8375 | 0.9801 | nan | 0.9801 | 0.0 | 0.0981 | 0.9801 | 0.4900 | 0.9801 |
0.0634 | 68.0 | 8500 | 0.9823 | nan | 0.9823 | 0.0 | 0.0996 | 0.9823 | 0.4911 | 0.9823 |
0.0611 | 69.0 | 8625 | 0.9810 | nan | 0.9810 | 0.0 | 0.1007 | 0.9810 | 0.4905 | 0.9810 |
0.0599 | 70.0 | 8750 | 0.9793 | nan | 0.9793 | 0.0 | 0.0929 | 0.9793 | 0.4896 | 0.9793 |
0.0583 | 71.0 | 8875 | 0.9825 | nan | 0.9825 | 0.0 | 0.0988 | 0.9825 | 0.4913 | 0.9825 |
0.0596 | 72.0 | 9000 | 0.9790 | nan | 0.9790 | 0.0 | 0.0955 | 0.9790 | 0.4895 | 0.9790 |
0.0598 | 73.0 | 9125 | 0.9800 | nan | 0.9800 | 0.0 | 0.1025 | 0.9800 | 0.4900 | 0.9800 |
0.0623 | 74.0 | 9250 | 0.9836 | nan | 0.9836 | 0.0 | 0.0997 | 0.9836 | 0.4918 | 0.9836 |
0.0637 | 75.0 | 9375 | 0.9782 | nan | 0.9782 | 0.0 | 0.0971 | 0.9782 | 0.4891 | 0.9782 |
0.0627 | 76.0 | 9500 | 0.9806 | nan | 0.9806 | 0.0 | 0.0934 | 0.9806 | 0.4903 | 0.9806 |
0.0566 | 77.0 | 9625 | 0.9830 | nan | 0.9830 | 0.0 | 0.1016 | 0.9830 | 0.4915 | 0.9830 |
0.0585 | 78.0 | 9750 | 0.9817 | nan | 0.9817 | 0.0 | 0.0915 | 0.9817 | 0.4908 | 0.9817 |
0.0574 | 79.0 | 9875 | 0.9814 | nan | 0.9814 | 0.0 | 0.0939 | 0.9814 | 0.4907 | 0.9814 |
0.0579 | 80.0 | 10000 | 0.9797 | nan | 0.9797 | 0.0 | 0.0996 | 0.9797 | 0.4899 | 0.9797 |
0.0564 | 81.0 | 10125 | 0.9801 | nan | 0.9801 | 0.0 | 0.0988 | 0.9801 | 0.4901 | 0.9801 |
0.0614 | 82.0 | 10250 | 0.9836 | nan | 0.9836 | 0.0 | 0.1011 | 0.9836 | 0.4918 | 0.9836 |
0.0556 | 83.0 | 10375 | 0.9817 | nan | 0.9817 | 0.0 | 0.0984 | 0.9817 | 0.4908 | 0.9817 |
0.0582 | 84.0 | 10500 | 0.9811 | nan | 0.9811 | 0.0 | 0.0964 | 0.9811 | 0.4906 | 0.9811 |
0.057 | 85.0 | 10625 | 0.9821 | nan | 0.9821 | 0.0 | 0.0956 | 0.9821 | 0.4911 | 0.9821 |
0.0552 | 86.0 | 10750 | 0.9804 | nan | 0.9804 | 0.0 | 0.1000 | 0.9804 | 0.4902 | 0.9804 |
0.059 | 87.0 | 10875 | 0.9828 | nan | 0.9828 | 0.0 | 0.0990 | 0.9828 | 0.4914 | 0.9828 |
0.0547 | 88.0 | 11000 | 0.9811 | nan | 0.9811 | 0.0 | 0.0959 | 0.9811 | 0.4905 | 0.9811 |
0.0532 | 89.0 | 11125 | 0.9819 | nan | 0.9819 | 0.0 | 0.0980 | 0.9819 | 0.4909 | 0.9819 |
0.0578 | 90.0 | 11250 | 0.9829 | nan | 0.9829 | 0.0 | 0.0954 | 0.9829 | 0.4915 | 0.9829 |
0.0552 | 91.0 | 11375 | 0.9817 | nan | 0.9817 | 0.0 | 0.1013 | 0.9817 | 0.4909 | 0.9817 |
0.0584 | 92.0 | 11500 | 0.9802 | nan | 0.9802 | 0.0 | 0.0986 | 0.9802 | 0.4901 | 0.9802 |
0.0528 | 93.0 | 11625 | 0.9806 | nan | 0.9806 | 0.0 | 0.1009 | 0.9806 | 0.4903 | 0.9806 |
0.0566 | 94.0 | 11750 | 0.9802 | nan | 0.9802 | 0.0 | 0.0983 | 0.9802 | 0.4901 | 0.9802 |
0.0541 | 95.0 | 11875 | 0.9806 | nan | 0.9806 | 0.0 | 0.1032 | 0.9806 | 0.4903 | 0.9806 |
0.0577 | 96.0 | 12000 | 0.9800 | nan | 0.9800 | 0.0 | 0.1030 | 0.9800 | 0.4900 | 0.9800 |
0.0567 | 97.0 | 12125 | 0.9796 | nan | 0.9796 | 0.0 | 0.1039 | 0.9796 | 0.4898 | 0.9796 |
0.056 | 98.0 | 12250 | 0.9789 | nan | 0.9789 | 0.0 | 0.1020 | 0.9789 | 0.4894 | 0.9789 |
0.0517 | 99.0 | 12375 | 0.9819 | nan | 0.9819 | 0.0 | 0.1004 | 0.9819 | 0.4910 | 0.9819 |
0.051 | 100.0 | 12500 | 0.9826 | nan | 0.9826 | 0.0 | 0.0990 | 0.9826 | 0.4913 | 0.9826 |
0.0523 | 101.0 | 12625 | 0.9826 | nan | 0.9826 | 0.0 | 0.0984 | 0.9826 | 0.4913 | 0.9826 |
0.0521 | 102.0 | 12750 | 0.9799 | nan | 0.9799 | 0.0 | 0.0987 | 0.9799 | 0.4900 | 0.9799 |
0.0518 | 103.0 | 12875 | 0.9819 | nan | 0.9819 | 0.0 | 0.1065 | 0.9819 | 0.4909 | 0.9819 |
0.0521 | 104.0 | 13000 | 0.9809 | nan | 0.9809 | 0.0 | 0.1052 | 0.9809 | 0.4904 | 0.9809 |
0.0556 | 105.0 | 13125 | 0.9818 | nan | 0.9818 | 0.0 | 0.1006 | 0.9818 | 0.4909 | 0.9818 |
0.0544 | 106.0 | 13250 | 0.9809 | nan | 0.9809 | 0.0 | 0.1045 | 0.9809 | 0.4904 | 0.9809 |
0.0549 | 107.0 | 13375 | 0.9823 | nan | 0.9823 | 0.0 | 0.1014 | 0.9823 | 0.4912 | 0.9823 |
0.054 | 108.0 | 13500 | 0.9809 | nan | 0.9809 | 0.0 | 0.1026 | 0.9809 | 0.4904 | 0.9809 |
0.0526 | 109.0 | 13625 | 0.9837 | nan | 0.9837 | 0.0 | 0.1052 | 0.9837 | 0.4918 | 0.9837 |
0.0524 | 110.0 | 13750 | 0.9830 | nan | 0.9830 | 0.0 | 0.0987 | 0.9830 | 0.4915 | 0.9830 |
0.0487 | 111.0 | 13875 | 0.9801 | nan | 0.9801 | 0.0 | 0.1028 | 0.9801 | 0.4900 | 0.9801 |
0.054 | 112.0 | 14000 | 0.9829 | nan | 0.9829 | 0.0 | 0.1070 | 0.9829 | 0.4915 | 0.9829 |
0.0531 | 113.0 | 14125 | 0.9806 | nan | 0.9806 | 0.0 | 0.1046 | 0.9806 | 0.4903 | 0.9806 |
0.0478 | 114.0 | 14250 | 0.9831 | nan | 0.9831 | 0.0 | 0.1036 | 0.9831 | 0.4915 | 0.9831 |
0.0511 | 115.0 | 14375 | 0.9807 | nan | 0.9807 | 0.0 | 0.1040 | 0.9807 | 0.4904 | 0.9807 |
0.05 | 116.0 | 14500 | 0.9826 | nan | 0.9826 | 0.0 | 0.1038 | 0.9826 | 0.4913 | 0.9826 |
0.0522 | 117.0 | 14625 | 0.9814 | nan | 0.9814 | 0.0 | 0.1051 | 0.9814 | 0.4907 | 0.9814 |
0.0492 | 118.0 | 14750 | 0.9817 | nan | 0.9817 | 0.0 | 0.1012 | 0.9817 | 0.4908 | 0.9817 |
0.0526 | 119.0 | 14875 | 0.9811 | nan | 0.9811 | 0.0 | 0.1041 | 0.9811 | 0.4905 | 0.9811 |
0.0483 | 120.0 | 15000 | 0.9836 | nan | 0.9836 | 0.0 | 0.1048 | 0.9836 | 0.4918 | 0.9836 |
0.0496 | 121.0 | 15125 | 0.9807 | nan | 0.9807 | 0.0 | 0.1067 | 0.9807 | 0.4904 | 0.9807 |
0.0486 | 122.0 | 15250 | 0.9799 | nan | 0.9799 | 0.0 | 0.1090 | 0.9799 | 0.4900 | 0.9799 |
0.0539 | 123.0 | 15375 | 0.9797 | nan | 0.9797 | 0.0 | 0.1029 | 0.9797 | 0.4898 | 0.9797 |
0.0507 | 124.0 | 15500 | 0.9804 | nan | 0.9804 | 0.0 | 0.1043 | 0.9804 | 0.4902 | 0.9804 |
0.0482 | 125.0 | 15625 | 0.9791 | nan | 0.9791 | 0.0 | 0.1064 | 0.9791 | 0.4896 | 0.9791 |
0.0487 | 126.0 | 15750 | 0.9813 | nan | 0.9813 | 0.0 | 0.1070 | 0.9813 | 0.4907 | 0.9813 |
0.0492 | 127.0 | 15875 | 0.9836 | nan | 0.9836 | 0.0 | 0.1101 | 0.9836 | 0.4918 | 0.9836 |
0.0479 | 128.0 | 16000 | 0.9800 | nan | 0.9800 | 0.0 | 0.1045 | 0.9800 | 0.4900 | 0.9800 |
0.0514 | 129.0 | 16125 | 0.9820 | nan | 0.9820 | 0.0 | 0.1043 | 0.9820 | 0.4910 | 0.9820 |
0.0505 | 130.0 | 16250 | 0.9821 | nan | 0.9821 | 0.0 | 0.1070 | 0.9821 | 0.4911 | 0.9821 |
0.0491 | 131.0 | 16375 | 0.9811 | nan | 0.9811 | 0.0 | 0.1019 | 0.9811 | 0.4905 | 0.9811 |
0.0477 | 132.0 | 16500 | 0.9808 | nan | 0.9808 | 0.0 | 0.1009 | 0.9808 | 0.4904 | 0.9808 |
0.0476 | 133.0 | 16625 | 0.9818 | nan | 0.9818 | 0.0 | 0.1015 | 0.9818 | 0.4909 | 0.9818 |
0.0462 | 134.0 | 16750 | 0.9804 | nan | 0.9804 | 0.0 | 0.1060 | 0.9804 | 0.4902 | 0.9804 |
0.0485 | 135.0 | 16875 | 0.9795 | nan | 0.9795 | 0.0 | 0.1018 | 0.9795 | 0.4898 | 0.9795 |
0.0483 | 136.0 | 17000 | 0.9796 | nan | 0.9796 | 0.0 | 0.1056 | 0.9796 | 0.4898 | 0.9796 |
0.0503 | 137.0 | 17125 | 0.9820 | nan | 0.9820 | 0.0 | 0.1044 | 0.9820 | 0.4910 | 0.9820 |
0.0514 | 138.0 | 17250 | 0.9813 | nan | 0.9813 | 0.0 | 0.1053 | 0.9813 | 0.4906 | 0.9813 |
0.0446 | 139.0 | 17375 | 0.9808 | nan | 0.9808 | 0.0 | 0.1051 | 0.9808 | 0.4904 | 0.9808 |
0.047 | 140.0 | 17500 | 0.9807 | nan | 0.9807 | 0.0 | 0.1071 | 0.9807 | 0.4903 | 0.9807 |
0.0467 | 141.0 | 17625 | 0.9828 | nan | 0.9828 | 0.0 | 0.1085 | 0.9828 | 0.4914 | 0.9828 |
0.0476 | 142.0 | 17750 | 0.9832 | nan | 0.9832 | 0.0 | 0.1077 | 0.9832 | 0.4916 | 0.9832 |
0.0472 | 143.0 | 17875 | 0.9818 | nan | 0.9818 | 0.0 | 0.1122 | 0.9818 | 0.4909 | 0.9818 |
0.0477 | 144.0 | 18000 | 0.9808 | nan | 0.9808 | 0.0 | 0.1043 | 0.9808 | 0.4904 | 0.9808 |
0.0467 | 145.0 | 18125 | 0.9797 | nan | 0.9797 | 0.0 | 0.1051 | 0.9797 | 0.4898 | 0.9797 |
0.0493 | 146.0 | 18250 | 0.9795 | nan | 0.9795 | 0.0 | 0.1049 | 0.9795 | 0.4897 | 0.9795 |
0.0485 | 147.0 | 18375 | 0.9810 | nan | 0.9810 | 0.0 | 0.1059 | 0.9810 | 0.4905 | 0.9810 |
0.0462 | 148.0 | 18500 | 0.9787 | nan | 0.9787 | 0.0 | 0.1057 | 0.9787 | 0.4893 | 0.9787 |
0.0474 | 149.0 | 18625 | 0.9800 | nan | 0.9800 | 0.0 | 0.1037 | 0.9800 | 0.4900 | 0.9800 |
0.0506 | 150.0 | 18750 | 0.9814 | nan | 0.9814 | 0.0 | 0.1052 | 0.9814 | 0.4907 | 0.9814 |
0.0479 | 151.0 | 18875 | 0.9805 | nan | 0.9805 | 0.0 | 0.1069 | 0.9805 | 0.4903 | 0.9805 |
0.0439 | 152.0 | 19000 | 0.9816 | nan | 0.9816 | 0.0 | 0.1080 | 0.9816 | 0.4908 | 0.9816 |
0.0492 | 153.0 | 19125 | 0.9808 | nan | 0.9808 | 0.0 | 0.1019 | 0.9808 | 0.4904 | 0.9808 |
0.0442 | 154.0 | 19250 | 0.9821 | nan | 0.9821 | 0.0 | 0.1053 | 0.9821 | 0.4910 | 0.9821 |
0.0484 | 155.0 | 19375 | 0.9819 | nan | 0.9819 | 0.0 | 0.1032 | 0.9819 | 0.4909 | 0.9819 |
0.0466 | 156.0 | 19500 | 0.9812 | nan | 0.9812 | 0.0 | 0.1039 | 0.9812 | 0.4906 | 0.9812 |
0.0444 | 157.0 | 19625 | 0.9809 | nan | 0.9809 | 0.0 | 0.1038 | 0.9809 | 0.4904 | 0.9809 |
0.0463 | 158.0 | 19750 | 0.9814 | nan | 0.9814 | 0.0 | 0.1038 | 0.9814 | 0.4907 | 0.9814 |
0.0465 | 159.0 | 19875 | 0.9815 | nan | 0.9815 | 0.0 | 0.1054 | 0.9815 | 0.4907 | 0.9815 |
0.046 | 160.0 | 20000 | 0.9804 | nan | 0.9804 | 0.0 | 0.1042 | 0.9804 | 0.4902 | 0.9804 |
### Framework versions
- Transformers 4.49.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0