Saumya-Mundra committed
Commit 698510f · verified · 1 parent: 53c69bd

Model save

Files changed (1):
  1. README.md (+171 -29)
README.md CHANGED
@@ -3,8 +3,6 @@ library_name: transformers
  license: other
  base_model: nvidia/mit-b0
  tags:
- - image-segmentation
- - vision
  - generated_from_trainer
  model-index:
  - name: segformer-finetuned-tt-1000-2k
@@ -16,16 +14,16 @@ should probably proofread and complete it, then remove this comment. -->

  # segformer-finetuned-tt-1000-2k

- This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the Saumya-Mundra/text255 dataset.
+ This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.0976
- - Mean Iou: 0.4895
- - Mean Accuracy: 0.9790
- - Overall Accuracy: 0.9790
+ - Loss: 0.1042
+ - Mean Iou: 0.4902
+ - Mean Accuracy: 0.9804
+ - Overall Accuracy: 0.9804
  - Accuracy Text: nan
- - Accuracy No Text: 0.9790
+ - Accuracy No Text: 0.9804
  - Iou Text: 0.0
- - Iou No Text: 0.9790
+ - Iou No Text: 0.9804

  ## Model description
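
For reference, the metric names above imply a two-class (text / no-text) semantic-segmentation head on top of mit-b0. A minimal inference sketch follows; the repo id `Saumya-Mundra/segformer-finetuned-tt-1000-2k` and the binary label mapping are assumptions inferred from the model-index name and the metric names, not stated in the card:

```python
# Minimal inference sketch (assumed repo id and 2-class text / no-text labels).
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "Saumya-Mundra/segformer-finetuned-tt-1000-2k"  # assumed
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("page.png").convert("RGB")  # any document image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # per-pixel class ids
```
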
@@ -44,34 +42,178 @@ More information needed
  ### Training hyperparameters

  The following hyperparameters were used during training:
- - learning_rate: 6e-05
+ - learning_rate: 6e-07
  - train_batch_size: 8
  - eval_batch_size: 8
  - seed: 1337
  - optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  - lr_scheduler_type: polynomial
- - training_steps: 2000
+ - training_steps: 20000

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Text | Accuracy No Text | Iou Text | Iou No Text |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------:|:----------------:|:--------:|:-----------:|
- | 0.3719 | 1.0 | 125 | 0.1986 | 0.4842 | 0.9684 | 0.9684 | nan | 0.9684 | 0.0 | 0.9684 |
- | 0.2348 | 2.0 | 250 | 0.1336 | 0.4932 | 0.9864 | 0.9864 | nan | 0.9864 | 0.0 | 0.9864 |
- | 0.183 | 3.0 | 375 | 0.1268 | 0.4874 | 0.9747 | 0.9747 | nan | 0.9747 | 0.0 | 0.9747 |
- | 0.1485 | 4.0 | 500 | 0.1114 | 0.4901 | 0.9802 | 0.9802 | nan | 0.9802 | 0.0 | 0.9802 |
- | 0.1429 | 5.0 | 625 | 0.1122 | 0.4878 | 0.9757 | 0.9757 | nan | 0.9757 | 0.0 | 0.9757 |
- | 0.1367 | 6.0 | 750 | 0.1075 | 0.4917 | 0.9834 | 0.9834 | nan | 0.9834 | 0.0 | 0.9834 |
- | 0.1333 | 7.0 | 875 | 0.1048 | 0.4897 | 0.9793 | 0.9793 | nan | 0.9793 | 0.0 | 0.9793 |
- | 0.1199 | 8.0 | 1000 | 0.1009 | 0.4888 | 0.9776 | 0.9776 | nan | 0.9776 | 0.0 | 0.9776 |
- | 0.1201 | 9.0 | 1125 | 0.1000 | 0.4903 | 0.9806 | 0.9806 | nan | 0.9806 | 0.0 | 0.9806 |
- | 0.1111 | 10.0 | 1250 | 0.0998 | 0.4904 | 0.9807 | 0.9807 | nan | 0.9807 | 0.0 | 0.9807 |
- | 0.1128 | 11.0 | 1375 | 0.0984 | 0.4896 | 0.9792 | 0.9792 | nan | 0.9792 | 0.0 | 0.9792 |
- | 0.1055 | 12.0 | 1500 | 0.0941 | 0.4918 | 0.9835 | 0.9835 | nan | 0.9835 | 0.0 | 0.9835 |
- | 0.0988 | 13.0 | 1625 | 0.0972 | 0.4907 | 0.9815 | 0.9815 | nan | 0.9815 | 0.0 | 0.9815 |
- | 0.0983 | 14.0 | 1750 | 0.0947 | 0.4921 | 0.9843 | 0.9843 | nan | 0.9843 | 0.0 | 0.9843 |
- | 0.1045 | 15.0 | 1875 | 0.0960 | 0.4897 | 0.9794 | 0.9794 | nan | 0.9794 | 0.0 | 0.9794 |
- | 0.1002 | 16.0 | 2000 | 0.0976 | 0.4895 | 0.9790 | 0.9790 | nan | 0.9790 | 0.0 | 0.9790 |
+ | Training Loss | Epoch | Step | Accuracy No Text | Accuracy Text | Iou No Text | Iou Text | Validation Loss | Mean Accuracy | Mean Iou | Overall Accuracy |
+ |:-------------:|:-----:|:-----:|:----------------:|:-------------:|:-----------:|:--------:|:---------------:|:-------------:|:--------:|:----------------:|
+ | 0.3719 | 1.0 | 125 | 0.9684 | nan | 0.9684 | 0.0 | 0.1986 | 0.9684 | 0.4842 | 0.9684 |
+ | 0.2348 | 2.0 | 250 | 0.9864 | nan | 0.9864 | 0.0 | 0.1336 | 0.9864 | 0.4932 | 0.9864 |
+ | 0.183 | 3.0 | 375 | 0.9747 | nan | 0.9747 | 0.0 | 0.1268 | 0.9747 | 0.4874 | 0.9747 |
+ | 0.1485 | 4.0 | 500 | 0.9802 | nan | 0.9802 | 0.0 | 0.1114 | 0.9802 | 0.4901 | 0.9802 |
+ | 0.1429 | 5.0 | 625 | 0.9757 | nan | 0.9757 | 0.0 | 0.1122 | 0.9757 | 0.4878 | 0.9757 |
+ | 0.1367 | 6.0 | 750 | 0.9834 | nan | 0.9834 | 0.0 | 0.1075 | 0.9834 | 0.4917 | 0.9834 |
+ | 0.1333 | 7.0 | 875 | 0.9793 | nan | 0.9793 | 0.0 | 0.1048 | 0.9793 | 0.4897 | 0.9793 |
+ | 0.1199 | 8.0 | 1000 | 0.9776 | nan | 0.9776 | 0.0 | 0.1009 | 0.9776 | 0.4888 | 0.9776 |
+ | 0.1201 | 9.0 | 1125 | 0.9806 | nan | 0.9806 | 0.0 | 0.1000 | 0.9806 | 0.4903 | 0.9806 |
+ | 0.1111 | 10.0 | 1250 | 0.9807 | nan | 0.9807 | 0.0 | 0.0998 | 0.9807 | 0.4904 | 0.9807 |
+ | 0.1128 | 11.0 | 1375 | 0.9792 | nan | 0.9792 | 0.0 | 0.0984 | 0.9792 | 0.4896 | 0.9792 |
+ | 0.1055 | 12.0 | 1500 | 0.9835 | nan | 0.9835 | 0.0 | 0.0941 | 0.9835 | 0.4918 | 0.9835 |
+ | 0.0988 | 13.0 | 1625 | 0.9815 | nan | 0.9815 | 0.0 | 0.0972 | 0.9815 | 0.4907 | 0.9815 |
+ | 0.0983 | 14.0 | 1750 | 0.9843 | nan | 0.9843 | 0.0 | 0.0947 | 0.9843 | 0.4921 | 0.9843 |
+ | 0.1045 | 15.0 | 1875 | 0.9794 | nan | 0.9794 | 0.0 | 0.0960 | 0.9794 | 0.4897 | 0.9794 |
+ | 0.1002 | 16.0 | 2000 | 0.9790 | nan | 0.9790 | 0.0 | 0.0976 | 0.9790 | 0.4895 | 0.9790 |
+ | 0.1072 | 17.0 | 2125 | 0.9776 | nan | 0.9776 | 0.0 | 0.1006 | 0.9776 | 0.4888 | 0.9776 |
+ | 0.1046 | 18.0 | 2250 | 0.9800 | nan | 0.9800 | 0.0 | 0.0938 | 0.9800 | 0.4900 | 0.9800 |
+ | 0.1072 | 19.0 | 2375 | 0.9800 | nan | 0.9800 | 0.0 | 0.0962 | 0.9800 | 0.4900 | 0.9800 |
+ | 0.1127 | 20.0 | 2500 | 0.9840 | nan | 0.9840 | 0.0 | 0.0918 | 0.9840 | 0.4920 | 0.9840 |
+ | 0.1017 | 21.0 | 2625 | 0.9782 | nan | 0.9782 | 0.0 | 0.0940 | 0.9782 | 0.4891 | 0.9782 |
+ | 0.0961 | 22.0 | 2750 | 0.9784 | nan | 0.9784 | 0.0 | 0.0964 | 0.9784 | 0.4892 | 0.9784 |
+ | 0.0951 | 23.0 | 2875 | 0.9821 | nan | 0.9821 | 0.0 | 0.0940 | 0.9821 | 0.4910 | 0.9821 |
+ | 0.0938 | 24.0 | 3000 | 0.9836 | nan | 0.9836 | 0.0 | 0.1005 | 0.9836 | 0.4918 | 0.9836 |
+ | 0.0949 | 25.0 | 3125 | 0.9803 | nan | 0.9803 | 0.0 | 0.1003 | 0.9803 | 0.4901 | 0.9803 |
+ | 0.0949 | 26.0 | 3250 | 0.9815 | nan | 0.9815 | 0.0 | 0.1015 | 0.9815 | 0.4908 | 0.9815 |
+ | 0.0949 | 27.0 | 3375 | 0.9780 | nan | 0.9780 | 0.0 | 0.0970 | 0.9780 | 0.4890 | 0.9780 |
+ | 0.0883 | 28.0 | 3500 | 0.9779 | nan | 0.9779 | 0.0 | 0.0967 | 0.9779 | 0.4890 | 0.9779 |
+ | 0.0846 | 29.0 | 3625 | 0.9849 | nan | 0.9849 | 0.0 | 0.0973 | 0.9849 | 0.4924 | 0.9849 |
+ | 0.0842 | 30.0 | 3750 | 0.9820 | nan | 0.9820 | 0.0 | 0.0946 | 0.9820 | 0.4910 | 0.9820 |
+ | 0.0814 | 31.0 | 3875 | 0.9819 | nan | 0.9819 | 0.0 | 0.0936 | 0.9819 | 0.4909 | 0.9819 |
+ | 0.0813 | 32.0 | 4000 | 0.9813 | nan | 0.9813 | 0.0 | 0.0938 | 0.9813 | 0.4906 | 0.9813 |
+ | 0.0817 | 33.0 | 4125 | 0.9812 | nan | 0.9812 | 0.0 | 0.0946 | 0.9812 | 0.4906 | 0.9812 |
+ | 0.0836 | 34.0 | 4250 | 0.9775 | nan | 0.9775 | 0.0 | 0.0940 | 0.9775 | 0.4888 | 0.9775 |
+ | 0.0836 | 35.0 | 4375 | 0.9811 | nan | 0.9811 | 0.0 | 0.0915 | 0.9811 | 0.4906 | 0.9811 |
+ | 0.0785 | 36.0 | 4500 | 0.9816 | nan | 0.9816 | 0.0 | 0.0951 | 0.9816 | 0.4908 | 0.9816 |
+ | 0.0746 | 37.0 | 4625 | 0.9757 | nan | 0.9757 | 0.0 | 0.0951 | 0.9757 | 0.4879 | 0.9757 |
+ | 0.0819 | 38.0 | 4750 | 0.9800 | nan | 0.9800 | 0.0 | 0.0952 | 0.9800 | 0.4900 | 0.9800 |
+ | 0.0731 | 39.0 | 4875 | 0.9797 | nan | 0.9797 | 0.0 | 0.0922 | 0.9797 | 0.4899 | 0.9797 |
+ | 0.0745 | 40.0 | 5000 | 0.9798 | nan | 0.9798 | 0.0 | 0.0939 | 0.9798 | 0.4899 | 0.9798 |
+ | 0.0755 | 41.0 | 5125 | 0.9802 | nan | 0.9802 | 0.0 | 0.0946 | 0.9802 | 0.4901 | 0.9802 |
+ | 0.0692 | 42.0 | 5250 | 0.9757 | nan | 0.9757 | 0.0 | 0.0976 | 0.9757 | 0.4879 | 0.9757 |
+ | 0.0798 | 43.0 | 5375 | 0.9804 | nan | 0.9804 | 0.0 | 0.0988 | 0.9804 | 0.4902 | 0.9804 |
+ | 0.076 | 44.0 | 5500 | 0.9798 | nan | 0.9798 | 0.0 | 0.0965 | 0.9798 | 0.4899 | 0.9798 |
+ | 0.0757 | 45.0 | 5625 | 0.9823 | nan | 0.9823 | 0.0 | 0.0914 | 0.9823 | 0.4912 | 0.9823 |
+ | 0.0702 | 46.0 | 5750 | 0.9781 | nan | 0.9781 | 0.0 | 0.0935 | 0.9781 | 0.4890 | 0.9781 |
+ | 0.0765 | 47.0 | 5875 | 0.9809 | nan | 0.9809 | 0.0 | 0.0966 | 0.9809 | 0.4905 | 0.9809 |
+ | 0.0724 | 48.0 | 6000 | 0.9833 | nan | 0.9833 | 0.0 | 0.0937 | 0.9833 | 0.4916 | 0.9833 |
+ | 0.0713 | 49.0 | 6125 | 0.9762 | nan | 0.9762 | 0.0 | 0.1017 | 0.9762 | 0.4881 | 0.9762 |
+ | 0.0677 | 50.0 | 6250 | 0.9804 | nan | 0.9804 | 0.0 | 0.0932 | 0.9804 | 0.4902 | 0.9804 |
+ | 0.0715 | 51.0 | 6375 | 0.9781 | nan | 0.9781 | 0.0 | 0.0975 | 0.9781 | 0.4891 | 0.9781 |
+ | 0.0713 | 52.0 | 6500 | 0.9833 | nan | 0.9833 | 0.0 | 0.0945 | 0.9833 | 0.4917 | 0.9833 |
+ | 0.0695 | 53.0 | 6625 | 0.9819 | nan | 0.9819 | 0.0 | 0.0951 | 0.9819 | 0.4910 | 0.9819 |
+ | 0.0648 | 54.0 | 6750 | 0.9825 | nan | 0.9825 | 0.0 | 0.0965 | 0.9825 | 0.4912 | 0.9825 |
+ | 0.0694 | 55.0 | 6875 | 0.9809 | nan | 0.9809 | 0.0 | 0.0946 | 0.9809 | 0.4905 | 0.9809 |
+ | 0.0665 | 56.0 | 7000 | 0.9824 | nan | 0.9824 | 0.0 | 0.1007 | 0.9824 | 0.4912 | 0.9824 |
+ | 0.0635 | 57.0 | 7125 | 0.9831 | nan | 0.9831 | 0.0 | 0.0971 | 0.9831 | 0.4916 | 0.9831 |
+ | 0.0628 | 58.0 | 7250 | 0.9785 | nan | 0.9785 | 0.0 | 0.1002 | 0.9785 | 0.4893 | 0.9785 |
+ | 0.0668 | 59.0 | 7375 | 0.9813 | nan | 0.9813 | 0.0 | 0.0960 | 0.9813 | 0.4906 | 0.9813 |
+ | 0.0648 | 60.0 | 7500 | 0.9796 | nan | 0.9796 | 0.0 | 0.0939 | 0.9796 | 0.4898 | 0.9796 |
+ | 0.064 | 61.0 | 7625 | 0.9786 | nan | 0.9786 | 0.0 | 0.0947 | 0.9786 | 0.4893 | 0.9786 |
+ | 0.0636 | 62.0 | 7750 | 0.9788 | nan | 0.9788 | 0.0 | 0.0985 | 0.9788 | 0.4894 | 0.9788 |
+ | 0.0653 | 63.0 | 7875 | 0.9812 | nan | 0.9812 | 0.0 | 0.0914 | 0.9812 | 0.4906 | 0.9812 |
+ | 0.0594 | 64.0 | 8000 | 0.9782 | nan | 0.9782 | 0.0 | 0.0966 | 0.9782 | 0.4891 | 0.9782 |
+ | 0.0608 | 65.0 | 8125 | 0.9794 | nan | 0.9794 | 0.0 | 0.0961 | 0.9794 | 0.4897 | 0.9794 |
+ | 0.0625 | 66.0 | 8250 | 0.9814 | nan | 0.9814 | 0.0 | 0.0954 | 0.9814 | 0.4907 | 0.9814 |
+ | 0.0646 | 67.0 | 8375 | 0.9801 | nan | 0.9801 | 0.0 | 0.0981 | 0.9801 | 0.4900 | 0.9801 |
+ | 0.0634 | 68.0 | 8500 | 0.9823 | nan | 0.9823 | 0.0 | 0.0996 | 0.9823 | 0.4911 | 0.9823 |
+ | 0.0611 | 69.0 | 8625 | 0.9810 | nan | 0.9810 | 0.0 | 0.1007 | 0.9810 | 0.4905 | 0.9810 |
+ | 0.0599 | 70.0 | 8750 | 0.9793 | nan | 0.9793 | 0.0 | 0.0929 | 0.9793 | 0.4896 | 0.9793 |
+ | 0.0583 | 71.0 | 8875 | 0.9825 | nan | 0.9825 | 0.0 | 0.0988 | 0.9825 | 0.4913 | 0.9825 |
+ | 0.0596 | 72.0 | 9000 | 0.9790 | nan | 0.9790 | 0.0 | 0.0955 | 0.9790 | 0.4895 | 0.9790 |
+ | 0.0598 | 73.0 | 9125 | 0.9800 | nan | 0.9800 | 0.0 | 0.1025 | 0.9800 | 0.4900 | 0.9800 |
+ | 0.0623 | 74.0 | 9250 | 0.9836 | nan | 0.9836 | 0.0 | 0.0997 | 0.9836 | 0.4918 | 0.9836 |
+ | 0.0637 | 75.0 | 9375 | 0.9782 | nan | 0.9782 | 0.0 | 0.0971 | 0.9782 | 0.4891 | 0.9782 |
+ | 0.0627 | 76.0 | 9500 | 0.9806 | nan | 0.9806 | 0.0 | 0.0934 | 0.9806 | 0.4903 | 0.9806 |
+ | 0.0566 | 77.0 | 9625 | 0.9830 | nan | 0.9830 | 0.0 | 0.1016 | 0.9830 | 0.4915 | 0.9830 |
+ | 0.0585 | 78.0 | 9750 | 0.9817 | nan | 0.9817 | 0.0 | 0.0915 | 0.9817 | 0.4908 | 0.9817 |
+ | 0.0574 | 79.0 | 9875 | 0.9814 | nan | 0.9814 | 0.0 | 0.0939 | 0.9814 | 0.4907 | 0.9814 |
+ | 0.0579 | 80.0 | 10000 | 0.9797 | nan | 0.9797 | 0.0 | 0.0996 | 0.9797 | 0.4899 | 0.9797 |
+ | 0.0564 | 81.0 | 10125 | 0.9801 | nan | 0.9801 | 0.0 | 0.0988 | 0.9801 | 0.4901 | 0.9801 |
+ | 0.0614 | 82.0 | 10250 | 0.9836 | nan | 0.9836 | 0.0 | 0.1011 | 0.9836 | 0.4918 | 0.9836 |
+ | 0.0556 | 83.0 | 10375 | 0.9817 | nan | 0.9817 | 0.0 | 0.0984 | 0.9817 | 0.4908 | 0.9817 |
+ | 0.0582 | 84.0 | 10500 | 0.9811 | nan | 0.9811 | 0.0 | 0.0964 | 0.9811 | 0.4906 | 0.9811 |
+ | 0.057 | 85.0 | 10625 | 0.9821 | nan | 0.9821 | 0.0 | 0.0956 | 0.9821 | 0.4911 | 0.9821 |
+ | 0.0552 | 86.0 | 10750 | 0.9804 | nan | 0.9804 | 0.0 | 0.1000 | 0.9804 | 0.4902 | 0.9804 |
+ | 0.059 | 87.0 | 10875 | 0.9828 | nan | 0.9828 | 0.0 | 0.0990 | 0.9828 | 0.4914 | 0.9828 |
+ | 0.0547 | 88.0 | 11000 | 0.9811 | nan | 0.9811 | 0.0 | 0.0959 | 0.9811 | 0.4905 | 0.9811 |
+ | 0.0532 | 89.0 | 11125 | 0.9819 | nan | 0.9819 | 0.0 | 0.0980 | 0.9819 | 0.4909 | 0.9819 |
+ | 0.0578 | 90.0 | 11250 | 0.9829 | nan | 0.9829 | 0.0 | 0.0954 | 0.9829 | 0.4915 | 0.9829 |
+ | 0.0552 | 91.0 | 11375 | 0.9817 | nan | 0.9817 | 0.0 | 0.1013 | 0.9817 | 0.4909 | 0.9817 |
+ | 0.0584 | 92.0 | 11500 | 0.9802 | nan | 0.9802 | 0.0 | 0.0986 | 0.9802 | 0.4901 | 0.9802 |
+ | 0.0528 | 93.0 | 11625 | 0.9806 | nan | 0.9806 | 0.0 | 0.1009 | 0.9806 | 0.4903 | 0.9806 |
+ | 0.0566 | 94.0 | 11750 | 0.9802 | nan | 0.9802 | 0.0 | 0.0983 | 0.9802 | 0.4901 | 0.9802 |
+ | 0.0541 | 95.0 | 11875 | 0.9806 | nan | 0.9806 | 0.0 | 0.1032 | 0.9806 | 0.4903 | 0.9806 |
+ | 0.0577 | 96.0 | 12000 | 0.9800 | nan | 0.9800 | 0.0 | 0.1030 | 0.9800 | 0.4900 | 0.9800 |
+ | 0.0567 | 97.0 | 12125 | 0.9796 | nan | 0.9796 | 0.0 | 0.1039 | 0.9796 | 0.4898 | 0.9796 |
+ | 0.056 | 98.0 | 12250 | 0.9789 | nan | 0.9789 | 0.0 | 0.1020 | 0.9789 | 0.4894 | 0.9789 |
+ | 0.0517 | 99.0 | 12375 | 0.9819 | nan | 0.9819 | 0.0 | 0.1004 | 0.9819 | 0.4910 | 0.9819 |
+ | 0.051 | 100.0 | 12500 | 0.9826 | nan | 0.9826 | 0.0 | 0.0990 | 0.9826 | 0.4913 | 0.9826 |
+ | 0.0523 | 101.0 | 12625 | 0.9826 | nan | 0.9826 | 0.0 | 0.0984 | 0.9826 | 0.4913 | 0.9826 |
+ | 0.0521 | 102.0 | 12750 | 0.9799 | nan | 0.9799 | 0.0 | 0.0987 | 0.9799 | 0.4900 | 0.9799 |
+ | 0.0518 | 103.0 | 12875 | 0.9819 | nan | 0.9819 | 0.0 | 0.1065 | 0.9819 | 0.4909 | 0.9819 |
+ | 0.0521 | 104.0 | 13000 | 0.9809 | nan | 0.9809 | 0.0 | 0.1052 | 0.9809 | 0.4904 | 0.9809 |
+ | 0.0556 | 105.0 | 13125 | 0.9818 | nan | 0.9818 | 0.0 | 0.1006 | 0.9818 | 0.4909 | 0.9818 |
+ | 0.0544 | 106.0 | 13250 | 0.9809 | nan | 0.9809 | 0.0 | 0.1045 | 0.9809 | 0.4904 | 0.9809 |
+ | 0.0549 | 107.0 | 13375 | 0.9823 | nan | 0.9823 | 0.0 | 0.1014 | 0.9823 | 0.4912 | 0.9823 |
+ | 0.054 | 108.0 | 13500 | 0.9809 | nan | 0.9809 | 0.0 | 0.1026 | 0.9809 | 0.4904 | 0.9809 |
+ | 0.0526 | 109.0 | 13625 | 0.9837 | nan | 0.9837 | 0.0 | 0.1052 | 0.9837 | 0.4918 | 0.9837 |
+ | 0.0524 | 110.0 | 13750 | 0.9830 | nan | 0.9830 | 0.0 | 0.0987 | 0.9830 | 0.4915 | 0.9830 |
+ | 0.0487 | 111.0 | 13875 | 0.9801 | nan | 0.9801 | 0.0 | 0.1028 | 0.9801 | 0.4900 | 0.9801 |
+ | 0.054 | 112.0 | 14000 | 0.9829 | nan | 0.9829 | 0.0 | 0.1070 | 0.9829 | 0.4915 | 0.9829 |
+ | 0.0531 | 113.0 | 14125 | 0.9806 | nan | 0.9806 | 0.0 | 0.1046 | 0.9806 | 0.4903 | 0.9806 |
+ | 0.0478 | 114.0 | 14250 | 0.9831 | nan | 0.9831 | 0.0 | 0.1036 | 0.9831 | 0.4915 | 0.9831 |
+ | 0.0511 | 115.0 | 14375 | 0.9807 | nan | 0.9807 | 0.0 | 0.1040 | 0.9807 | 0.4904 | 0.9807 |
+ | 0.05 | 116.0 | 14500 | 0.9826 | nan | 0.9826 | 0.0 | 0.1038 | 0.9826 | 0.4913 | 0.9826 |
+ | 0.0522 | 117.0 | 14625 | 0.9814 | nan | 0.9814 | 0.0 | 0.1051 | 0.9814 | 0.4907 | 0.9814 |
+ | 0.0492 | 118.0 | 14750 | 0.9817 | nan | 0.9817 | 0.0 | 0.1012 | 0.9817 | 0.4908 | 0.9817 |
+ | 0.0526 | 119.0 | 14875 | 0.9811 | nan | 0.9811 | 0.0 | 0.1041 | 0.9811 | 0.4905 | 0.9811 |
+ | 0.0483 | 120.0 | 15000 | 0.9836 | nan | 0.9836 | 0.0 | 0.1048 | 0.9836 | 0.4918 | 0.9836 |
+ | 0.0496 | 121.0 | 15125 | 0.9807 | nan | 0.9807 | 0.0 | 0.1067 | 0.9807 | 0.4904 | 0.9807 |
+ | 0.0486 | 122.0 | 15250 | 0.9799 | nan | 0.9799 | 0.0 | 0.1090 | 0.9799 | 0.4900 | 0.9799 |
+ | 0.0539 | 123.0 | 15375 | 0.9797 | nan | 0.9797 | 0.0 | 0.1029 | 0.9797 | 0.4898 | 0.9797 |
+ | 0.0507 | 124.0 | 15500 | 0.9804 | nan | 0.9804 | 0.0 | 0.1043 | 0.9804 | 0.4902 | 0.9804 |
+ | 0.0482 | 125.0 | 15625 | 0.9791 | nan | 0.9791 | 0.0 | 0.1064 | 0.9791 | 0.4896 | 0.9791 |
+ | 0.0487 | 126.0 | 15750 | 0.9813 | nan | 0.9813 | 0.0 | 0.1070 | 0.9813 | 0.4907 | 0.9813 |
+ | 0.0492 | 127.0 | 15875 | 0.9836 | nan | 0.9836 | 0.0 | 0.1101 | 0.9836 | 0.4918 | 0.9836 |
+ | 0.0479 | 128.0 | 16000 | 0.9800 | nan | 0.9800 | 0.0 | 0.1045 | 0.9800 | 0.4900 | 0.9800 |
+ | 0.0514 | 129.0 | 16125 | 0.9820 | nan | 0.9820 | 0.0 | 0.1043 | 0.9820 | 0.4910 | 0.9820 |
+ | 0.0505 | 130.0 | 16250 | 0.9821 | nan | 0.9821 | 0.0 | 0.1070 | 0.9821 | 0.4911 | 0.9821 |
+ | 0.0491 | 131.0 | 16375 | 0.9811 | nan | 0.9811 | 0.0 | 0.1019 | 0.9811 | 0.4905 | 0.9811 |
+ | 0.0477 | 132.0 | 16500 | 0.9808 | nan | 0.9808 | 0.0 | 0.1009 | 0.9808 | 0.4904 | 0.9808 |
+ | 0.0476 | 133.0 | 16625 | 0.9818 | nan | 0.9818 | 0.0 | 0.1015 | 0.9818 | 0.4909 | 0.9818 |
+ | 0.0462 | 134.0 | 16750 | 0.9804 | nan | 0.9804 | 0.0 | 0.1060 | 0.9804 | 0.4902 | 0.9804 |
+ | 0.0485 | 135.0 | 16875 | 0.9795 | nan | 0.9795 | 0.0 | 0.1018 | 0.9795 | 0.4898 | 0.9795 |
+ | 0.0483 | 136.0 | 17000 | 0.9796 | nan | 0.9796 | 0.0 | 0.1056 | 0.9796 | 0.4898 | 0.9796 |
+ | 0.0503 | 137.0 | 17125 | 0.9820 | nan | 0.9820 | 0.0 | 0.1044 | 0.9820 | 0.4910 | 0.9820 |
+ | 0.0514 | 138.0 | 17250 | 0.9813 | nan | 0.9813 | 0.0 | 0.1053 | 0.9813 | 0.4906 | 0.9813 |
+ | 0.0446 | 139.0 | 17375 | 0.9808 | nan | 0.9808 | 0.0 | 0.1051 | 0.9808 | 0.4904 | 0.9808 |
+ | 0.047 | 140.0 | 17500 | 0.9807 | nan | 0.9807 | 0.0 | 0.1071 | 0.9807 | 0.4903 | 0.9807 |
+ | 0.0467 | 141.0 | 17625 | 0.9828 | nan | 0.9828 | 0.0 | 0.1085 | 0.9828 | 0.4914 | 0.9828 |
+ | 0.0476 | 142.0 | 17750 | 0.9832 | nan | 0.9832 | 0.0 | 0.1077 | 0.9832 | 0.4916 | 0.9832 |
+ | 0.0472 | 143.0 | 17875 | 0.9818 | nan | 0.9818 | 0.0 | 0.1122 | 0.9818 | 0.4909 | 0.9818 |
+ | 0.0477 | 144.0 | 18000 | 0.9808 | nan | 0.9808 | 0.0 | 0.1043 | 0.9808 | 0.4904 | 0.9808 |
+ | 0.0467 | 145.0 | 18125 | 0.9797 | nan | 0.9797 | 0.0 | 0.1051 | 0.9797 | 0.4898 | 0.9797 |
+ | 0.0493 | 146.0 | 18250 | 0.9795 | nan | 0.9795 | 0.0 | 0.1049 | 0.9795 | 0.4897 | 0.9795 |
+ | 0.0485 | 147.0 | 18375 | 0.9810 | nan | 0.9810 | 0.0 | 0.1059 | 0.9810 | 0.4905 | 0.9810 |
+ | 0.0462 | 148.0 | 18500 | 0.9787 | nan | 0.9787 | 0.0 | 0.1057 | 0.9787 | 0.4893 | 0.9787 |
+ | 0.0474 | 149.0 | 18625 | 0.9800 | nan | 0.9800 | 0.0 | 0.1037 | 0.9800 | 0.4900 | 0.9800 |
+ | 0.0506 | 150.0 | 18750 | 0.9814 | nan | 0.9814 | 0.0 | 0.1052 | 0.9814 | 0.4907 | 0.9814 |
+ | 0.0479 | 151.0 | 18875 | 0.9805 | nan | 0.9805 | 0.0 | 0.1069 | 0.9805 | 0.4903 | 0.9805 |
+ | 0.0439 | 152.0 | 19000 | 0.9816 | nan | 0.9816 | 0.0 | 0.1080 | 0.9816 | 0.4908 | 0.9816 |
+ | 0.0492 | 153.0 | 19125 | 0.9808 | nan | 0.9808 | 0.0 | 0.1019 | 0.9808 | 0.4904 | 0.9808 |
+ | 0.0442 | 154.0 | 19250 | 0.9821 | nan | 0.9821 | 0.0 | 0.1053 | 0.9821 | 0.4910 | 0.9821 |
+ | 0.0484 | 155.0 | 19375 | 0.9819 | nan | 0.9819 | 0.0 | 0.1032 | 0.9819 | 0.4909 | 0.9819 |
+ | 0.0466 | 156.0 | 19500 | 0.9812 | nan | 0.9812 | 0.0 | 0.1039 | 0.9812 | 0.4906 | 0.9812 |
+ | 0.0444 | 157.0 | 19625 | 0.9809 | nan | 0.9809 | 0.0 | 0.1038 | 0.9809 | 0.4904 | 0.9809 |
+ | 0.0463 | 158.0 | 19750 | 0.9814 | nan | 0.9814 | 0.0 | 0.1038 | 0.9814 | 0.4907 | 0.9814 |
+ | 0.0465 | 159.0 | 19875 | 0.9815 | nan | 0.9815 | 0.0 | 0.1054 | 0.9815 | 0.4907 | 0.9815 |
+ | 0.046 | 160.0 | 20000 | 0.9804 | nan | 0.9804 | 0.0 | 0.1042 | 0.9804 | 0.4902 | 0.9804 |
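
A note on reading this table: Accuracy Text is nan in every row (the evaluation masks appear to contain no text-class pixels to score), Iou Text stays at 0.0, and Iou No Text tracks Overall Accuracy. Mean Iou is therefore just the average of the two per-class IoUs (for the final step, (0.0 + 0.9804) / 2 ≈ 0.4902), so it is capped near 0.5 regardless of how high the no-text IoU climbs. Below is a small NumPy sketch of the standard per-class IoU computation, as an illustration rather than the exact evaluation code used for this card:

```python
# Per-class IoU and mean IoU from integer label masks (illustration only).
import numpy as np

def per_class_iou(pred: np.ndarray, target: np.ndarray, num_classes: int) -> np.ndarray:
    """Return IoU per class id; nan where a class appears in neither mask."""
    ious = np.full(num_classes, np.nan)
    for c in range(num_classes):
        intersection = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union > 0:
            ious[c] = intersection / union
    return ious

# A model that never correctly predicts class 1 ("text") scores IoU 0.0 for it,
# so the mean over both classes stays near half of the "no text" IoU,
# which is the pattern visible throughout the table above.
target = np.random.randint(0, 2, size=(512, 512))  # toy ground truth
pred = np.zeros_like(target)                       # always predicts "no text"
ious = per_class_iou(pred, target, num_classes=2)
print(ious, np.nanmean(ious))
```
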
  ### Framework versions
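
For completeness, the hyperparameters listed above map naturally onto a transformers `Trainer` setup. A sketch under stated assumptions (the original training script is not part of this card; dataset loading and metric wiring are omitted, and `num_labels=2` is inferred from the metric names):

```python
# Sketch of a Trainer configuration matching the listed hyperparameters.
# This is a reconstruction, not the author's script.
from transformers import (
    SegformerForSemanticSegmentation,
    Trainer,
    TrainingArguments,
)

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=2,  # assumed: text / no-text
)

args = TrainingArguments(
    output_dir="segformer-finetuned-tt-1000-2k",
    learning_rate=6e-07,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    optim="adamw_torch",             # AdamW, default betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="polynomial",
    max_steps=20_000,                # "training_steps" above
    eval_strategy="epoch",           # assumed from the one-eval-per-epoch table;
                                     # spelled evaluation_strategy in older releases
)

trainer = Trainer(
    model=model,
    args=args,
    # train_dataset=..., eval_dataset=..., compute_metrics=...  (dataset unknown)
)
```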