Update README.md
feat: Add full documentation and standalone inference script to model card
This commit significantly improves the model card to make the custom model usable by others. The previous README was a default template.
**Key Changes:**
1. **Completed All Sections**: Filled in the `Model description` and `Intended uses & limitations` sections with specifics about the multi-head architecture and its purpose in disaster response.
2. **Added "How to Use" Section**: This is the most critical addition. It contains a complete, self-contained Python script that successfully loads the model and runs inference.
3. **Inline Documentation**: The script is segmented into logical blocks (Architecture, Labels, Setup, Prediction), with clear documentation explaining why each part is necessary. This is crucial for a custom model that cannot use the standard Hugging Face `pipeline`.
4. **Added Sample Output**: A raw JSON output block is included to show users the exact format of the prediction results.
5. **Cleaned Up Metadata**: Minor formatting improvements to the training hyperparameter list.
---
library_name: transformers
base_model: cardiffnlp/twitter-xlm-roberta-base-sentiment
tags:
- disaster
- twitter
- social
- media
model-index:
- name: xlm-roberta-sentiment-requests
  results: []
datasets:
- community-datasets/disaster_response_messages
pipeline_tag: text-classification
language:
- multilingual
- en
---

# xlm-roberta-sentiment-requests

This model is a fine-tuned version of [cardiffnlp/twitter-xlm-roberta-base-sentiment](https://huggingface.co/cardiffnlp/twitter-xlm-roberta-base-sentiment) on the community-datasets/disaster_response_messages dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1465
- F1 Micro: 0.7240

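F1 Micro is the headline metric here because, in a multi-label setup, it pools every (example, label) decision into one binary tally before computing precision and recall. A minimal, dependency-free sketch of the computation (the toy vectors are illustrative, not model output):

```python
def f1_micro(y_true, y_pred):
    """y_true, y_pred: lists of equal-length 0/1 label vectors."""
    tp = fp = fn = 0
    for truth, pred in zip(y_true, y_pred):
        for t, p in zip(truth, pred):
            tp += t and p              # predicted 1, truly 1
            fp += (not t) and p        # predicted 1, truly 0
            fn += t and (not p)        # predicted 0, truly 1
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

# Toy multi-label example: 2 messages, 3 categories each.
y_true = [[1, 0, 1], [0, 1, 1]]
y_pred = [[1, 0, 0], [0, 1, 1]]
print(f1_micro(y_true, y_pred))  # ≈ 0.8571
```

Because all labels share one tally, frequent categories dominate the score; the macro average (an unweighted mean of per-label F1) is typically much lower on imbalanced disaster data.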
## Model description

The model adds a multi-head classification architecture on top of the multilingual [cardiffnlp/twitter-xlm-roberta-base-sentiment](https://huggingface.co/cardiffnlp/twitter-xlm-roberta-base-sentiment) backbone and is used to categorize disaster response messages (e.g. tweets and direct requests) for triage in disaster response.

## Intended uses & limitations

Intended for classifying disaster-related messages from social media and direct reports into response categories. Because of the custom multi-head architecture, the model cannot be loaded through the standard Hugging Face `pipeline`; inference requires the custom model definition documented in this card.

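## How to use

The commit adds a complete, self-contained inference script, but its code was not preserved in this excerpt. The sketch below illustrates the documented pattern — a shared XLM-RoBERTa encoder with one binary head per category. The class name, head layout, and label subset are illustrative assumptions (the card's own script is authoritative), and the encoder is a tiny randomly initialised one so the example runs without downloading the fine-tuned weights:

```python
# Illustrative sketch of a multi-head classifier on an XLM-RoBERTa encoder.
# NOTE: MultiHeadClassifier, the head layout, and LABELS are assumptions for
# demonstration; they are not the repository's actual definitions.
import torch
import torch.nn as nn
from transformers import XLMRobertaConfig, XLMRobertaModel

# --- Labels: illustrative subset of the disaster_response_messages categories.
LABELS = ["related", "request", "offer", "aid_related", "medical_help"]

# --- Architecture: shared encoder, one independent binary head per category.
class MultiHeadClassifier(nn.Module):
    def __init__(self, encoder: XLMRobertaModel):
        super().__init__()
        self.encoder = encoder
        hidden = encoder.config.hidden_size
        self.heads = nn.ModuleDict(
            {label: nn.Linear(hidden, 1) for label in LABELS}
        )

    def forward(self, input_ids, attention_mask=None):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # embedding of the <s> token
        return {label: head(cls).squeeze(-1) for label, head in self.heads.items()}

# --- Setup: tiny randomly initialised encoder so the sketch runs offline;
# real inference would load the fine-tuned weights published with this card.
config = XLMRobertaConfig(vocab_size=1000, hidden_size=32, num_hidden_layers=2,
                          num_attention_heads=2, intermediate_size=64,
                          max_position_embeddings=64)
model = MultiHeadClassifier(XLMRobertaModel(config))
model.eval()

# --- Prediction: independent sigmoid per head -> multi-label probabilities.
input_ids = torch.tensor([[0, 5, 6, 7, 2]])  # dummy token ids
with torch.no_grad():
    logits = model(input_ids)
prediction = {label: round(torch.sigmoid(t).item(), 4) for label, t in logits.items()}
print(prediction)
```

With the real tokenizer and weights, `input_ids` would come from the repository's tokenizer, and the `prediction` dict corresponds to the kind of raw JSON output the card's sample-output block shows.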
## Training and evaluation data

Fine-tuned and evaluated on the [community-datasets/disaster_response_messages](https://huggingface.co/datasets/community-datasets/disaster_response_messages) dataset of categorized disaster response messages.

## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1000
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:--------:|
| 0.4267 | 1.0 | 658 | 0.2727 | 0.4953 | 0.0722 | 0.1053 |
| 0.2662 | 2.0 | 1316 | 0.2291 | 0.5446 | 0.0906 | 0.1123 |
| 0.2366 | 3.0 | 1974 | 0.2143 | 0.5682 | 0.1031 | 0.1279 |
| 0.2234 | 4.0 | 2632 | 0.2058 | 0.5878 | 0.1160 | 0.1333 |
| 0.2156 | 5.0 | 3290 | 0.1997 | 0.6022 | 0.1255 | 0.1380 |
| 0.2094 | 6.0 | 3948 | 0.1949 | 0.6116 | 0.1311 | 0.1438 |
| 0.2047 | 7.0 | 4606 | 0.1911 | 0.6189 | 0.1372 | 0.1438 |
| 0.201 | 8.0 | 5264 | 0.1879 | 0.6226 | 0.1415 | 0.1446 |
| 0.1974 | 9.0 | 5922 | 0.1852 | 0.6283 | 0.1494 | 0.1500 |
| 0.1953 | 10.0 | 6580 | 0.1829 | 0.6357 | 0.1595 | 0.1524 |
| 0.1926 | 11.0 | 7238 | 0.1808 | 0.6397 | 0.1677 | 0.1593 |
| 0.1909 | 12.0 | 7896 | 0.1791 | 0.6419 | 0.1716 | 0.1609 |
| 0.189 | 13.0 | 8554 | 0.1775 | 0.6459 | 0.1789 | 0.1656 |
| 0.1877 | 14.0 | 9212 | 0.1762 | 0.6496 | 0.1864 | 0.1675 |
| 0.1859 | 15.0 | 9870 | 0.1749 | 0.6531 | 0.1920 | 0.1667 |
| 0.185 | 16.0 | 10528 | 0.1737 | 0.6550 | 0.1974 | 0.1737 |
| 0.1838 | 17.0 | 11186 | 0.1727 | 0.6580 | 0.2019 | 0.1784 |
| 0.1824 | 18.0 | 11844 | 0.1719 | 0.6602 | 0.2055 | 0.1749 |
| 0.1816 | 19.0 | 12502 | 0.1709 | 0.6615 | 0.2089 | 0.1866 |
| 0.1809 | 20.0 | 13160 | 0.1702 | 0.6630 | 0.2127 | 0.1823 |
| 0.1799 | 21.0 | 13818 | 0.1695 | 0.6657 | 0.2173 | 0.1877 |
| 0.1798 | 22.0 | 14476 | 0.1687 | 0.6675 | 0.2215 | 0.1904 |
| 0.1788 | 23.0 | 15134 | 0.1681 | 0.6690 | 0.2228 | 0.1916 |
| 0.1778 | 24.0 | 15792 | 0.1677 | 0.6706 | 0.2282 | 0.1904 |
| 0.1773 | 25.0 | 16450 | 0.1670 | 0.6714 | 0.2305 | 0.1955 |
| 0.1768 | 26.0 | 17108 | 0.1665 | 0.6727 | 0.2327 | 0.1967 |
| 0.1766 | 27.0 | 17766 | 0.1660 | 0.6749 | 0.2358 | 0.2033 |
| 0.1756 | 28.0 | 18424 | 0.1656 | 0.6750 | 0.2368 | 0.2002 |
| 0.1752 | 29.0 | 19082 | 0.1650 | 0.6752 | 0.2371 | 0.2017 |
| 0.175 | 30.0 | 19740 | 0.1647 | 0.6777 | 0.2423 | 0.2048 |
| 0.1748 | 31.0 | 20398 | 0.1643 | 0.6796 | 0.2468 | 0.2044 |
| 0.1742 | 32.0 | 21056 | 0.1640 | 0.6798 | 0.2473 | 0.2048 |
| 0.1738 | 33.0 | 21714 | 0.1637 | 0.6807 | 0.2489 | 0.2037 |
| 0.1738 | 34.0 | 22372 | 0.1632 | 0.6818 | 0.2497 | 0.2079 |
| 0.1733 | 35.0 | 23030 | 0.1630 | 0.6831 | 0.2545 | 0.2087 |
| 0.1726 | 36.0 | 23688 | 0.1627 | 0.6831 | 0.2538 | 0.2087 |
| 0.1728 | 37.0 | 24346 | 0.1623 | 0.6834 | 0.2555 | 0.2130 |
| 0.1722 | 38.0 | 25004 | 0.1621 | 0.6840 | 0.2557 | 0.2106 |
| 0.172 | 39.0 | 25662 | 0.1617 | 0.6854 | 0.2571 | 0.2130 |
| 0.1716 | 40.0 | 26320 | 0.1616 | 0.6857 | 0.2595 | 0.2134 |
| 0.1715 | 41.0 | 26978 | 0.1613 | 0.6868 | 0.2627 | 0.2157 |
| 0.1708 | 42.0 | 27636 | 0.1609 | 0.6872 | 0.2629 | 0.2153 |
| 0.1708 | 43.0 | 28294 | 0.1608 | 0.6876 | 0.2645 | 0.2176 |
| 0.1707 | 44.0 | 28952 | 0.1605 | 0.6879 | 0.2645 | 0.2176 |
| 0.1706 | 45.0 | 29610 | 0.1602 | 0.6889 | 0.2656 | 0.2184 |
| 0.1704 | 46.0 | 30268 | 0.1600 | 0.6894 | 0.2673 | 0.2192 |
| 0.1701 | 47.0 | 30926 | 0.1599 | 0.6899 | 0.2676 | 0.2184 |
| 0.1699 | 48.0 | 31584 | 0.1597 | 0.6907 | 0.2695 | 0.2200 |
| 0.1696 | 49.0 | 32242 | 0.1595 | 0.6913 | 0.2694 | 0.2200 |
| 0.1694 | 50.0 | 32900 | 0.1592 | 0.6911 | 0.2701 | 0.2223 |
| 0.1691 | 51.0 | 33558 | 0.1591 | 0.6929 | 0.2738 | 0.2215 |
| 0.1689 | 52.0 | 34216 | 0.1589 | 0.6924 | 0.2724 | 0.2227 |
| 0.1689 | 53.0 | 34874 | 0.1588 | 0.6937 | 0.2761 | 0.2246 |
| 0.1689 | 54.0 | 35532 | 0.1585 | 0.6936 | 0.2770 | 0.2243 |
| 0.1683 | 55.0 | 36190 | 0.1583 | 0.6949 | 0.2786 | 0.2231 |
| 0.1684 | 56.0 | 36848 | 0.1582 | 0.6946 | 0.2792 | 0.2250 |
| 0.1677 | 57.0 | 37506 | 0.1582 | 0.6952 | 0.2832 | 0.2243 |
| 0.168 | 58.0 | 38164 | 0.1581 | 0.6965 | 0.2855 | 0.2258 |
| 0.1678 | 59.0 | 38822 | 0.1578 | 0.6966 | 0.2861 | 0.2277 |
| 0.1678 | 60.0 | 39480 | 0.1576 | 0.6975 | 0.2863 | 0.2285 |
| 0.1676 | 61.0 | 40138 | 0.1575 | 0.6975 | 0.2873 | 0.2289 |
| 0.1676 | 62.0 | 40796 | 0.1573 | 0.6979 | 0.2880 | 0.2301 |
| 0.1677 | 63.0 | 41454 | 0.1573 | 0.6977 | 0.2885 | 0.2285 |
| 0.1672 | 64.0 | 42112 | 0.1571 | 0.6983 | 0.2892 | 0.2328 |
| 0.1671 | 65.0 | 42770 | 0.1570 | 0.6990 | 0.2901 | 0.2312 |
| 0.167 | 66.0 | 43428 | 0.1567 | 0.6984 | 0.2898 | 0.2328 |
| 0.1671 | 67.0 | 44086 | 0.1567 | 0.6985 | 0.2892 | 0.2285 |
| 0.1668 | 68.0 | 44744 | 0.1566 | 0.6995 | 0.2908 | 0.2305 |
| 0.1667 | 69.0 | 45402 | 0.1564 | 0.6999 | 0.2928 | 0.2312 |
| 0.1661 | 70.0 | 46060 | 0.1563 | 0.7003 | 0.2944 | 0.2324 |
| 0.1665 | 71.0 | 46718 | 0.1563 | 0.7005 | 0.2932 | 0.2324 |
| 0.1665 | 72.0 | 47376 | 0.1561 | 0.7006 | 0.2934 | 0.2340 |
| 0.1664 | 73.0 | 48034 | 0.1560 | 0.7013 | 0.2949 | 0.2336 |
| 0.1663 | 74.0 | 48692 | 0.1559 | 0.7012 | 0.2932 | 0.2328 |
| 0.1662 | 75.0 | 49350 | 0.1558 | 0.7018 | 0.2960 | 0.2309 |
| 0.1662 | 76.0 | 50008 | 0.1557 | 0.7022 | 0.2958 | 0.2316 |
| 0.166 | 77.0 | 50666 | 0.1556 | 0.7012 | 0.2943 | 0.2340 |
| 0.1655 | 78.0 | 51324 | 0.1555 | 0.7024 | 0.2976 | 0.2332 |
| 0.1659 | 79.0 | 51982 | 0.1553 | 0.7020 | 0.2953 | 0.2344 |
| 0.1657 | 80.0 | 52640 | 0.1553 | 0.7029 | 0.2972 | 0.2328 |
| 0.1655 | 81.0 | 53298 | 0.1551 | 0.7027 | 0.2956 | 0.2347 |
| 0.1647 | 82.0 | 53956 | 0.1551 | 0.7038 | 0.3002 | 0.2355 |
| 0.1652 | 83.0 | 54614 | 0.1551 | 0.7036 | 0.2993 | 0.2336 |
| 0.1649 | 84.0 | 55272 | 0.1548 | 0.7028 | 0.2977 | 0.2367 |
| 0.1648 | 85.0 | 55930 | 0.1548 | 0.7035 | 0.2997 | 0.2367 |
| 0.1643 | 86.0 | 56588 | 0.1548 | 0.7034 | 0.3003 | 0.2375 |
| 0.1649 | 87.0 | 57246 | 0.1547 | 0.7041 | 0.3043 | 0.2390 |
| 0.1647 | 88.0 | 57904 | 0.1547 | 0.7052 | 0.3051 | 0.2344 |
| 0.1649 | 89.0 | 58562 | 0.1545 | 0.7043 | 0.3041 | 0.2390 |
| 0.1646 | 90.0 | 59220 | 0.1544 | 0.7053 | 0.3054 | 0.2390 |
| 0.1647 | 91.0 | 59878 | 0.1542 | 0.7055 | 0.3060 | 0.2414 |
| 0.1646 | 92.0 | 60536 | 0.1542 | 0.7054 | 0.3062 | 0.2410 |
| 0.1645 | 93.0 | 61194 | 0.1541 | 0.7053 | 0.3060 | 0.2425 |
| 0.1643 | 94.0 | 61852 | 0.1541 | 0.7054 | 0.3064 | 0.2421 |
| 0.1647 | 95.0 | 62510 | 0.1539 | 0.7069 | 0.3084 | 0.2417 |
| 0.1645 | 96.0 | 63168 | 0.1540 | 0.7058 | 0.3068 | 0.2410 |
| 0.1646 | 97.0 | 63826 | 0.1538 | 0.7056 | 0.3063 | 0.2414 |
| 0.1638 | 98.0 | 64484 | 0.1539 | 0.7054 | 0.3072 | 0.2379 |
| 0.1638 | 99.0 | 65142 | 0.1537 | 0.7060 | 0.3079 | 0.2437 |
| 0.164 | 100.0 | 65800 | 0.1537 | 0.7077 | 0.3098 | 0.2425 |
| 0.1636 | 101.0 | 66458 | 0.1536 | 0.7067 | 0.3087 | 0.2437 |
| 0.1643 | 102.0 | 67116 | 0.1536 | 0.7069 | 0.3095 | 0.2425 |
| 0.1642 | 103.0 | 67774 | 0.1535 | 0.7071 | 0.3096 | 0.2433 |
| 0.1634 | 104.0 | 68432 | 0.1534 | 0.7074 | 0.3108 | 0.2425 |
| 0.1639 | 105.0 | 69090 | 0.1533 | 0.7079 | 0.3111 | 0.2445 |
| 0.1638 | 106.0 | 69748 | 0.1532 | 0.7079 | 0.3145 | 0.2437 |
| 0.1634 | 107.0 | 70406 | 0.1532 | 0.7078 | 0.3135 | 0.2452 |
| 0.163 | 108.0 | 71064 | 0.1531 | 0.7079 | 0.3129 | 0.2449 |
| 0.1637 | 109.0 | 71722 | 0.1530 | 0.7086 | 0.3148 | 0.2437 |
| 0.1636 | 110.0 | 72380 | 0.1530 | 0.7089 | 0.3149 | 0.2449 |
| 0.1631 | 111.0 | 73038 | 0.1530 | 0.7089 | 0.3157 | 0.2441 |
| 0.1638 | 112.0 | 73696 | 0.1530 | 0.7092 | 0.3153 | 0.2414 |
| 0.1632 | 113.0 | 74354 | 0.1528 | 0.7087 | 0.3152 | 0.2433 |
| 0.1631 | 114.0 | 75012 | 0.1528 | 0.7091 | 0.3143 | 0.2449 |
| 0.1631 | 115.0 | 75670 | 0.1527 | 0.7088 | 0.3153 | 0.2437 |
| 0.1631 | 116.0 | 76328 | 0.1526 | 0.7095 | 0.3172 | 0.2452 |
| 0.163 | 117.0 | 76986 | 0.1526 | 0.7089 | 0.3156 | 0.2476 |
| 0.163 | 118.0 | 77644 | 0.1524 | 0.7095 | 0.3167 | 0.2483 |
| 0.1633 | 119.0 | 78302 | 0.1525 | 0.7101 | 0.3182 | 0.2449 |
| 0.1628 | 120.0 | 78960 | 0.1524 | 0.7097 | 0.3181 | 0.2476 |
| 0.163 | 121.0 | 79618 | 0.1525 | 0.7098 | 0.3172 | 0.2437 |
| 0.163 | 122.0 | 80276 | 0.1523 | 0.7099 | 0.3184 | 0.2460 |
| 0.1629 | 123.0 | 80934 | 0.1524 | 0.7098 | 0.3168 | 0.2425 |
| 0.1628 | 124.0 | 81592 | 0.1523 | 0.7104 | 0.3201 | 0.2437 |
| 0.1627 | 125.0 | 82250 | 0.1522 | 0.7104 | 0.3184 | 0.2449 |
| 0.1627 | 126.0 | 82908 | 0.1523 | 0.7104 | 0.3203 | 0.2433 |
| 0.1626 | 127.0 | 83566 | 0.1522 | 0.7110 | 0.3218 | 0.2441 |
| 0.1624 | 128.0 | 84224 | 0.1521 | 0.7108 | 0.3195 | 0.2441 |
| 0.1624 | 129.0 | 84882 | 0.1521 | 0.7105 | 0.3199 | 0.2445 |
| 0.1627 | 130.0 | 85540 | 0.1520 | 0.7108 | 0.3192 | 0.2468 |
| 0.1625 | 131.0 | 86198 | 0.1519 | 0.7105 | 0.3190 | 0.2460 |
| 0.1626 | 132.0 | 86856 | 0.1520 | 0.7113 | 0.3204 | 0.2437 |
| 0.1617 | 133.0 | 87514 | 0.1519 | 0.7109 | 0.3198 | 0.2449 |
| 0.1624 | 134.0 | 88172 | 0.1518 | 0.7109 | 0.3199 | 0.2433 |
| 0.1622 | 135.0 | 88830 | 0.1517 | 0.7102 | 0.3199 | 0.2464 |
| 0.1619 | 136.0 | 89488 | 0.1517 | 0.7114 | 0.3212 | 0.2464 |
| 0.1624 | 137.0 | 90146 | 0.1517 | 0.7121 | 0.3235 | 0.2429 |
| 0.1622 | 138.0 | 90804 | 0.1515 | 0.7121 | 0.3220 | 0.2483 |
| 0.1621 | 139.0 | 91462 | 0.1516 | 0.7120 | 0.3229 | 0.2456 |
| 0.1621 | 140.0 | 92120 | 0.1516 | 0.7117 | 0.3218 | 0.2456 |
| 0.1617 | 141.0 | 92778 | 0.1515 | 0.7116 | 0.3223 | 0.2449 |
| 0.1622 | 142.0 | 93436 | 0.1514 | 0.7111 | 0.3230 | 0.2476 |
| 0.162 | 143.0 | 94094 | 0.1514 | 0.7117 | 0.3214 | 0.2452 |
| 0.162 | 144.0 | 94752 | 0.1514 | 0.7121 | 0.3232 | 0.2495 |
| 0.1621 | 145.0 | 95410 | 0.1514 | 0.7123 | 0.3233 | 0.2476 |
| 0.1621 | 146.0 | 96068 | 0.1513 | 0.7117 | 0.3206 | 0.2495 |
| 0.1616 | 147.0 | 96726 | 0.1513 | 0.7125 | 0.3233 | 0.2464 |
| 0.1619 | 148.0 | 97384 | 0.1512 | 0.7128 | 0.3241 | 0.2511 |
| 0.1619 | 149.0 | 98042 | 0.1513 | 0.7128 | 0.3249 | 0.2487 |
| 0.1617 | 150.0 | 98700 | 0.1513 | 0.7130 | 0.3243 | 0.2449 |
| 0.1616 | 151.0 | 99358 | 0.1511 | 0.7122 | 0.3235 | 0.2472 |
| 0.1616 | 152.0 | 100016 | 0.1511 | 0.7123 | 0.3246 | 0.2468 |
| 0.1616 | 153.0 | 100674 | 0.1510 | 0.7129 | 0.3253 | 0.2503 |
| 0.1615 | 154.0 | 101332 | 0.1510 | 0.7124 | 0.3232 | 0.2480 |
| 0.1615 | 155.0 | 101990 | 0.1509 | 0.7125 | 0.3238 | 0.2483 |
| 0.1618 | 156.0 | 102648 | 0.1509 | 0.7131 | 0.3251 | 0.2464 |
| 0.1616 | 157.0 | 103306 | 0.1509 | 0.7137 | 0.3250 | 0.2495 |
| 0.1615 | 158.0 | 103964 | 0.1508 | 0.7137 | 0.3251 | 0.2483 |
| 0.1616 | 159.0 | 104622 | 0.1509 | 0.7138 | 0.3273 | 0.2476 |
| 0.1617 | 160.0 | 105280 | 0.1508 | 0.7134 | 0.3238 | 0.2499 |
| 0.1616 | 161.0 | 105938 | 0.1508 | 0.7138 | 0.3280 | 0.2491 |
| 0.1612 | 162.0 | 106596 | 0.1508 | 0.7134 | 0.3260 | 0.2468 |
| 0.1611 | 163.0 | 107254 | 0.1507 | 0.7136 | 0.3266 | 0.2499 |
| 0.1614 | 164.0 | 107912 | 0.1506 | 0.7134 | 0.3258 | 0.2515 |
| 0.1614 | 165.0 | 108570 | 0.1508 | 0.7139 | 0.3265 | 0.2483 |
| 0.1612 | 166.0 | 109228 | 0.1506 | 0.7138 | 0.3253 | 0.2495 |
| 0.1614 | 167.0 | 109886 | 0.1507 | 0.7144 | 0.3288 | 0.2487 |
| 0.1615 | 168.0 | 110544 | 0.1506 | 0.7143 | 0.3294 | 0.2483 |
| 0.1617 | 169.0 | 111202 | 0.1506 | 0.7138 | 0.3276 | 0.2464 |
| 0.161 | 170.0 | 111860 | 0.1505 | 0.7144 | 0.3290 | 0.2487 |
| 0.1612 | 171.0 | 112518 | 0.1505 | 0.7141 | 0.3263 | 0.2476 |
| 0.161 | 172.0 | 113176 | 0.1504 | 0.7135 | 0.3258 | 0.2511 |
| 0.1615 | 173.0 | 113834 | 0.1504 | 0.7142 | 0.3270 | 0.2503 |
| 0.1609 | 174.0 | 114492 | 0.1504 | 0.7145 | 0.3278 | 0.2507 |
| 0.1612 | 175.0 | 115150 | 0.1504 | 0.7143 | 0.3285 | 0.2499 |
| 0.1613 | 176.0 | 115808 | 0.1503 | 0.7144 | 0.3277 | 0.2507 |
| 0.161 | 177.0 | 116466 | 0.1504 | 0.7146 | 0.3297 | 0.2487 |
| 0.161 | 178.0 | 117124 | 0.1504 | 0.7146 | 0.3290 | 0.2483 |
| 0.1611 | 179.0 | 117782 | 0.1502 | 0.7146 | 0.3285 | 0.2530 |
| 0.1609 | 180.0 | 118440 | 0.1503 | 0.7147 | 0.3301 | 0.2503 |
| 0.1612 | 181.0 | 119098 | 0.1501 | 0.7145 | 0.3294 | 0.2507 |
| 0.161 | 182.0 | 119756 | 0.1502 | 0.7143 | 0.3286 | 0.2503 |
| 0.1606 | 183.0 | 120414 | 0.1501 | 0.7151 | 0.3296 | 0.2530 |
| 0.161 | 184.0 | 121072 | 0.1501 | 0.7148 | 0.3285 | 0.2522 |
| 0.1609 | 185.0 | 121730 | 0.1501 | 0.7148 | 0.3298 | 0.2511 |
| 0.1609 | 186.0 | 122388 | 0.1500 | 0.7149 | 0.3285 | 0.2518 |
| 0.1612 | 187.0 | 123046 | 0.1501 | 0.7146 | 0.3288 | 0.2483 |
| 0.1606 | 188.0 | 123704 | 0.1500 | 0.7157 | 0.3302 | 0.2550 |
| 0.1609 | 189.0 | 124362 | 0.1500 | 0.7156 | 0.3302 | 0.2518 |
| 0.1605 | 190.0 | 125020 | 0.1500 | 0.7152 | 0.3295 | 0.2491 |
| 0.161 | 191.0 | 125678 | 0.1500 | 0.7166 | 0.3332 | 0.2499 |
| 0.1606 | 192.0 | 126336 | 0.1499 | 0.7160 | 0.3324 | 0.2495 |
| 0.1604 | 193.0 | 126994 | 0.1500 | 0.7153 | 0.3304 | 0.2483 |
| 0.1609 | 194.0 | 127652 | 0.1498 | 0.7161 | 0.3315 | 0.2518 |
| 0.1609 | 195.0 | 128310 | 0.1498 | 0.7163 | 0.3304 | 0.2522 |
| 0.1609 | 196.0 | 128968 | 0.1498 | 0.7155 | 0.3296 | 0.2518 |
| 0.1604 | 197.0 | 129626 | 0.1498 | 0.7159 | 0.3302 | 0.2530 |
| 0.161 | 198.0 | 130284 | 0.1498 | 0.7160 | 0.3308 | 0.2522 |
| 0.1602 | 199.0 | 130942 | 0.1498 | 0.7161 | 0.3320 | 0.2507 |
| 0.1606 | 200.0 | 131600 | 0.1498 | 0.7161 | 0.3314 | 0.2515 |
| 0.1609 | 201.0 | 132258 | 0.1498 | 0.7168 | 0.3338 | 0.2483 |
| 0.1603 | 202.0 | 132916 | 0.1497 | 0.7163 | 0.3323 | 0.2507 |
| 0.1603 | 203.0 | 133574 | 0.1497 | 0.7169 | 0.3341 | 0.2511 |
| 0.1603 | 204.0 | 134232 | 0.1496 | 0.7165 | 0.3338 | 0.2495 |
| 0.1605 | 205.0 | 134890 | 0.1496 | 0.7163 | 0.3331 | 0.2518 |
| 0.1605 | 206.0 | 135548 | 0.1495 | 0.7169 | 0.3305 | 0.2553 |
| 0.1605 | 207.0 | 136206 | 0.1495 | 0.7166 | 0.3317 | 0.2518 |
| 0.1601 | 208.0 | 136864 | 0.1496 | 0.7162 | 0.3318 | 0.2511 |
| 0.1604 | 209.0 | 137522 | 0.1496 | 0.7167 | 0.3333 | 0.2526 |
| 0.1602 | 210.0 | 138180 | 0.1494 | 0.7172 | 0.3346 | 0.2542 |
| 0.1604 | 211.0 | 138838 | 0.1495 | 0.7170 | 0.3364 | 0.2511 |
| 0.1608 | 212.0 | 139496 | 0.1494 | 0.7173 | 0.3355 | 0.2546 |
| 0.16 | 213.0 | 140154 | 0.1494 | 0.7177 | 0.3370 | 0.2515 |
| 0.1601 | 214.0 | 140812 | 0.1494 | 0.7178 | 0.3363 | 0.2546 |
| 0.1604 | 215.0 | 141470 | 0.1493 | 0.7175 | 0.3363 | 0.2565 |
| 0.1607 | 216.0 | 142128 | 0.1493 | 0.7168 | 0.3354 | 0.2534 |
| 0.1598 | 217.0 | 142786 | 0.1493 | 0.7175 | 0.3354 | 0.2534 |
| 0.1603 | 218.0 | 143444 | 0.1494 | 0.7173 | 0.3365 | 0.2518 |
| 0.1603 | 219.0 | 144102 | 0.1493 | 0.7174 | 0.3347 | 0.2542 |
| 0.1603 | 220.0 | 144760 | 0.1493 | 0.7174 | 0.3357 | 0.2546 |
| 0.1602 | 221.0 | 145418 | 0.1494 | 0.7170 | 0.3373 | 0.2507 |
| 0.1601 | 222.0 | 146076 | 0.1493 | 0.7172 | 0.3361 | 0.2542 |
| 0.1601 | 223.0 | 146734 | 0.1492 | 0.7176 | 0.3369 | 0.2557 |
| 0.1597 | 224.0 | 147392 | 0.1493 | 0.7177 | 0.3379 | 0.2526 |
| 0.1602 | 225.0 | 148050 | 0.1492 | 0.7176 | 0.3377 | 0.2507 |
| 0.16 | 226.0 | 148708 | 0.1492 | 0.7175 | 0.3359 | 0.2546 |
| 0.1603 | 227.0 | 149366 | 0.1492 | 0.7180 | 0.3374 | 0.2522 |
| 0.1604 | 228.0 | 150024 | 0.1492 | 0.7180 | 0.3391 | 0.2522 |
| 0.1599 | 229.0 | 150682 | 0.1491 | 0.7177 | 0.3366 | 0.2550 |
| 0.1596 | 230.0 | 151340 | 0.1490 | 0.7178 | 0.3368 | 0.2542 |
| 0.1602 | 231.0 | 151998 | 0.1490 | 0.7177 | 0.3363 | 0.2553 |
| 0.1602 | 232.0 | 152656 | 0.1491 | 0.7179 | 0.3365 | 0.2553 |
| 0.16 | 233.0 | 153314 | 0.1491 | 0.7187 | 0.3385 | 0.2553 |
| 0.1601 | 234.0 | 153972 | 0.1490 | 0.7176 | 0.3366 | 0.2538 |
| 0.16 | 235.0 | 154630 | 0.1490 | 0.7175 | 0.3357 | 0.2526 |
| 0.1603 | 236.0 | 155288 | 0.1490 | 0.7179 | 0.3375 | 0.2518 |
| 0.1592 | 237.0 | 155946 | 0.1490 | 0.7180 | 0.3369 | 0.2534 |
| 0.1602 | 238.0 | 156604 | 0.1490 | 0.7174 | 0.3366 | 0.2526 |
| 0.16 | 239.0 | 157262 | 0.1490 | 0.7178 | 0.3377 | 0.2511 |
| 0.1601 | 240.0 | 157920 | 0.1489 | 0.7180 | 0.3367 | 0.2534 |
| 0.1602 | 241.0 | 158578 | 0.1490 | 0.7184 | 0.3381 | 0.2515 |
| 0.16 | 242.0 | 159236 | 0.1488 | 0.7186 | 0.3375 | 0.2553 |
| 0.1598 | 243.0 | 159894 | 0.1489 | 0.7182 | 0.3370 | 0.2553 |
| 0.1595 | 244.0 | 160552 | 0.1489 | 0.7183 | 0.3381 | 0.2530 |
| 0.1596 | 245.0 | 161210 | 0.1488 | 0.7175 | 0.3364 | 0.2534 |
| 0.1599 | 246.0 | 161868 | 0.1489 | 0.7183 | 0.3374 | 0.2522 |
| 0.1597 | 247.0 | 162526 | 0.1488 | 0.7187 | 0.3379 | 0.2542 |
| 0.1598 | 248.0 | 163184 | 0.1488 | 0.7184 | 0.3370 | 0.2569 |
| 0.1595 | 249.0 | 163842 | 0.1488 | 0.7181 | 0.3373 | 0.2511 |
| 0.16 | 250.0 | 164500 | 0.1488 | 0.7183 | 0.3383 | 0.2538 |
| 0.1598 | 251.0 | 165158 | 0.1488 | 0.7192 | 0.3387 | 0.2542 |
| 0.1594 | 252.0 | 165816 | 0.1487 | 0.7184 | 0.3376 | 0.2561 |
| 0.1598 | 253.0 | 166474 | 0.1487 | 0.7186 | 0.3392 | 0.2550 |
| 0.1599 | 254.0 | 167132 | 0.1488 | 0.7183 | 0.3374 | 0.2507 |
| 0.1595 | 255.0 | 167790 | 0.1487 | 0.7183 | 0.3377 | 0.2530 |
| 0.1592 | 256.0 | 168448 | 0.1487 | 0.7188 | 0.3387 | 0.2515 |
| 0.1601 | 257.0 | 169106 | 0.1487 | 0.7187 | 0.3389 | 0.2538 |
| 0.1595 | 258.0 | 169764 | 0.1487 | 0.7187 | 0.3395 | 0.2557 |
| 0.1592 | 259.0 | 170422 | 0.1486 | 0.7179 | 0.3373 | 0.2550 |
| 0.1597 | 260.0 | 171080 | 0.1487 | 0.7189 | 0.3389 | 0.2526 |
| 0.1597 | 261.0 | 171738 | 0.1487 | 0.7186 | 0.3385 | 0.2546 |
| 0.1594 | 262.0 | 172396 | 0.1486 | 0.7186 | 0.3372 | 0.2565 |
| 0.1598 | 263.0 | 173054 | 0.1487 | 0.7187 | 0.3380 | 0.2538 |
| 0.1595 | 264.0 | 173712 | 0.1486 | 0.7190 | 0.3386 | 0.2553 |
| 0.1594 | 265.0 | 174370 | 0.1485 | 0.7185 | 0.3380 | 0.2569 |
| 0.1594 | 266.0 | 175028 | 0.1486 | 0.7190 | 0.3384 | 0.2526 |
| 0.1594 | 267.0 | 175686 | 0.1485 | 0.7190 | 0.3385 | 0.2550 |
| 0.1592 | 268.0 | 176344 | 0.1486 | 0.7194 | 0.3389 | 0.2557 |
| 0.1593 | 269.0 | 177002 | 0.1486 | 0.7187 | 0.3397 | 0.2522 |
| 0.1592 | 270.0 | 177660 | 0.1485 | 0.7195 | 0.3392 | 0.2538 |
| 0.1597 | 271.0 | 178318 | 0.1486 | 0.7190 | 0.3389 | 0.2515 |
| 0.1593 | 272.0 | 178976 | 0.1485 | 0.7186 | 0.3382 | 0.2557 |
| 0.1589 | 273.0 | 179634 | 0.1485 | 0.7188 | 0.3384 | 0.2546 |
| 0.1595 | 274.0 | 180292 | 0.1484 | 0.7190 | 0.3379 | 0.2553 |
| 0.1593 | 275.0 | 180950 | 0.1485 | 0.7189 | 0.3382 | 0.2534 |
| 0.1595 | 276.0 | 181608 | 0.1485 | 0.7189 | 0.3382 | 0.2542 |
| 0.1595 | 277.0 | 182266 | 0.1485 | 0.7189 | 0.3384 | 0.2542 |
| 0.159 | 278.0 | 182924 | 0.1484 | 0.7193 | 0.3393 | 0.2550 |
| 0.1592 | 279.0 | 183582 | 0.1484 | 0.7191 | 0.3397 | 0.2542 |
| 0.1592 | 280.0 | 184240 | 0.1484 | 0.7196 | 0.3395 | 0.2569 |
| 0.1593 | 281.0 | 184898 | 0.1484 | 0.7193 | 0.3398 | 0.2565 |
| 0.1588 | 282.0 | 185556 | 0.1484 | 0.7194 | 0.3399 | 0.2553 |
| 0.1596 | 283.0 | 186214 | 0.1483 | 0.7195 | 0.3404 | 0.2565 |
| 0.1591 | 284.0 | 186872 | 0.1483 | 0.7188 | 0.3383 | 0.2550 |
| 0.1592 | 285.0 | 187530 | 0.1483 | 0.7187 | 0.3390 | 0.2534 |
| 0.1588 | 286.0 | 188188 | 0.1484 | 0.7192 | 0.3417 | 0.2542 |
| 0.159 | 287.0 | 188846 | 0.1483 | 0.7194 | 0.3401 | 0.2569 |
| 0.1594 | 288.0 | 189504 | 0.1484 | 0.7199 | 0.3401 | 0.2522 |
| 0.1589 | 289.0 | 190162 | 0.1483 | 0.7199 | 0.3407 | 0.2557 |
| 0.1589 | 290.0 | 190820 | 0.1483 | 0.7193 | 0.3394 | 0.2546 |
| 0.1589 | 291.0 | 191478 | 0.1482 | 0.7199 | 0.3406 | 0.2557 |
| 0.1593 | 292.0 | 192136 | 0.1482 | 0.7192 | 0.3393 | 0.2557 |
| 0.1595 | 293.0 | 192794 | 0.1482 | 0.7191 | 0.3398 | 0.2550 |
| 0.1592 | 294.0 | 193452 | 0.1483 | 0.7198 | 0.3396 | 0.2565 |
| 0.1598 | 295.0 | 194110 | 0.1482 | 0.7196 | 0.3399 | 0.2550 |
| 0.1592 | 296.0 | 194768 | 0.1483 | 0.7202 | 0.3418 | 0.2557 |
| 0.1596 | 297.0 | 195426 | 0.1482 | 0.7194 | 0.3395 | 0.2553 |
| 0.1592 | 298.0 | 196084 | 0.1483 | 0.7199 | 0.3413 | 0.2526 |
| 0.1593 | 299.0 | 196742 | 0.1482 | 0.7201 | 0.3421 | 0.2542 |
| 0.1592 | 300.0 | 197400 | 0.1482 | 0.7204 | 0.3423 | 0.2534 |
| 0.1591 | 301.0 | 198058 | 0.1481 | 0.7198 | 0.3420 | 0.2557 |
| 0.159 | 302.0 | 198716 | 0.1481 | 0.7201 | 0.3411 | 0.2557 |
| 0.1589 | 303.0 | 199374 | 0.1482 | 0.7199 | 0.3420 | 0.2550 |
| 0.1592 | 304.0 | 200032 | 0.1481 | 0.7202 | 0.3416 | 0.2565 |
| 0.1593 | 305.0 | 200690 | 0.1481 | 0.7202 | 0.3417 | 0.2561 |
| 0.1588 | 306.0 | 201348 | 0.1481 | 0.7201 | 0.3425 | 0.2557 |
| 0.159 | 307.0 | 202006 | 0.1481 | 0.7204 | 0.3417 | 0.2557 |
| 0.1589 | 308.0 | 202664 | 0.1481 | 0.7206 | 0.3422 | 0.2561 |
| 0.1584 | 309.0 | 203322 | 0.1481 | 0.7198 | 0.3421 | 0.2561 |
| 0.1588 | 310.0 | 203980 | 0.1481 | 0.7211 | 0.3434 | 0.2553 |
| 0.1591 | 311.0 | 204638 | 0.1481 | 0.7207 | 0.3435 | 0.2553 |
| 0.159 | 312.0 | 205296 | 0.1480 | 0.7204 | 0.3429 | 0.2546 |
| 0.1592 | 313.0 | 205954 | 0.1480 | 0.7209 | 0.3435 | 0.2550 |
| 0.1597 | 314.0 | 206612 | 0.1480 | 0.7209 | 0.3419 | 0.2565 |
| 0.1589 | 315.0 | 207270 | 0.1480 | 0.7202 | 0.3427 | 0.2553 |
| 0.1588 | 316.0 | 207928 | 0.1481 | 0.7209 | 0.3440 | 0.2550 |
| 0.1589 | 317.0 | 208586 | 0.1479 | 0.7203 | 0.3422 | 0.2565 |
| 0.1591 | 318.0 | 209244 | 0.1479 | 0.7203 | 0.3423 | 0.2573 |
| 0.1591 | 319.0 | 209902 | 0.1479 | 0.7201 | 0.3425 | 0.2569 |
| 0.159 | 320.0 | 210560 | 0.1479 | 0.7206 | 0.3432 | 0.2561 |
| 0.1588 | 321.0 | 211218 | 0.1480 | 0.7205 | 0.3433 | 0.2561 |
| 0.1591 | 322.0 | 211876 | 0.1479 | 0.7204 | 0.3417 | 0.2573 |
| 0.159 | 323.0 | 212534 | 0.1479 | 0.7199 | 0.3411 | 0.2565 |
| 0.1588 | 324.0 | 213192 | 0.1479 | 0.7208 | 0.3442 | 0.2573 |
| 0.1592 | 325.0 | 213850 | 0.1479 | 0.7204 | 0.3417 | 0.2581 |
| 0.1588 | 326.0 | 214508 | 0.1478 | 0.7205 | 0.3418 | 0.2565 |
| 0.1591 | 327.0 | 215166 | 0.1479 | 0.7209 | 0.3435 | 0.2581 |
| 0.1588 | 328.0 | 215824 | 0.1479 | 0.7206 | 0.3437 | 0.2561 |
| 0.1585 | 329.0 | 216482 | 0.1478 | 0.7206 | 0.3428 | 0.2573 |
| 0.1587 | 330.0 | 217140 | 0.1478 | 0.7209 | 0.3429 | 0.2588 |
| 0.1586 | 331.0 | 217798 | 0.1478 | 0.7204 | 0.3415 | 0.2573 |
| 0.1588 | 332.0 | 218456 | 0.1478 | 0.7207 | 0.3428 | 0.2577 |
| 0.1591 | 333.0 | 219114 | 0.1478 | 0.7212 | 0.3431 | 0.2585 |
| 0.1592 | 334.0 | 219772 | 0.1478 | 0.7207 | 0.3435 | 0.2573 |
| 0.1588 | 335.0 | 220430 | 0.1478 | 0.7211 | 0.3429 | 0.2592 |
| 0.1583 | 336.0 | 221088 | 0.1477 | 0.7210 | 0.3436 | 0.2577 |
| 0.159 | 337.0 | 221746 | 0.1478 | 0.7205 | 0.3431 | 0.2565 |
| 0.1588 | 338.0 | 222404 | 0.1478 | 0.7206 | 0.3428 | 0.2569 |
| 0.1587 | 339.0 | 223062 | 0.1478 | 0.7204 | 0.3419 | 0.2577 |
| 0.1588 | 340.0 | 223720 | 0.1477 | 0.7205 | 0.3426 | 0.2577 |
| 0.1588 | 341.0 | 224378 | 0.1477 | 0.7212 | 0.3445 | 0.2561 |
| 0.1592 | 342.0 | 225036 | 0.1477 | 0.7206 | 0.3431 | 0.2565 |
| 0.1587 | 343.0 | 225694 | 0.1477 | 0.7211 | 0.3429 | 0.2581 |
| 0.1587 | 344.0 | 226352 | 0.1477 | 0.7208 | 0.3434 | 0.2577 |
| 0.1587 | 345.0 | 227010 | 0.1477 | 0.7211 | 0.3444 | 0.2592 |
| 0.1589 | 346.0 | 227668 | 0.1477 | 0.7206 | 0.3437 | 0.2569 |
| 0.1588 | 347.0 | 228326 | 0.1477 | 0.7208 | 0.3447 | 0.2573 |
| 0.1585 | 348.0 | 228984 | 0.1477 | 0.7214 | 0.3441 | 0.2577 |
| 0.1587 | 349.0 | 229642 | 0.1477 | 0.7211 | 0.3443 | 0.2577 |
| 0.1589 | 350.0 | 230300 | 0.1476 | 0.7214 | 0.3450 | 0.2581 |
| 0.1592 | 351.0 | 230958 | 0.1477 | 0.7210 | 0.3445 | 0.2565 |
| 0.1585 | 352.0 | 231616 | 0.1477 | 0.7212 | 0.3444 | 0.2581 |
| 0.1584 | 353.0 | 232274 | 0.1477 | 0.7208 | 0.3440 | 0.2573 |
| 0.1586 | 354.0 | 232932 | 0.1477 | 0.7217 | 0.3462 | 0.2565 |
| 0.1589 | 355.0 | 233590 | 0.1476 | 0.7213 | 0.3439 | 0.2577 |
| 0.1588 | 356.0 | 234248 | 0.1477 | 0.7211 | 0.3436 | 0.2581 |
| 0.1585 | 357.0 | 234906 | 0.1476 | 0.7209 | 0.3436 | 0.2581 |
| 0.1586 | 358.0 | 235564 | 0.1476 | 0.7216 | 0.3438 | 0.2588 |
| 0.159 | 359.0 | 236222 | 0.1476 | 0.7210 | 0.3436 | 0.2569 |
| 0.1587 | 360.0 | 236880 | 0.1477 | 0.7214 | 0.3446 | 0.2573 |
| 0.1582 | 361.0 | 237538 | 0.1477 | 0.7212 | 0.3457 | 0.2581 |
| 0.1587 | 362.0 | 238196 | 0.1476 | 0.7212 | 0.3440 | 0.2565 |
| 0.1584 | 363.0 | 238854 | 0.1475 | 0.7215 | 0.3441 | 0.2592 |
| 0.1587 | 364.0 | 239512 | 0.1476 | 0.7214 | 0.3445 | 0.2569 |
| 0.1586 | 365.0 | 240170 | 0.1475 | 0.7211 | 0.3457 | 0.2577 |
| 0.1589 | 366.0 | 240828 | 0.1475 | 0.7213 | 0.3458 | 0.2577 |
| 0.1586 | 367.0 | 241486 | 0.1475 | 0.7215 | 0.3453 | 0.2573 |
| 0.1588 | 368.0 | 242144 | 0.1476 | 0.7213 | 0.3448 | 0.2565 |
| 0.1587 | 369.0 | 242802 | 0.1475 | 0.7215 | 0.3450 | 0.2577 |
| 0.1586 | 370.0 | 243460 | 0.1475 | 0.7214 | 0.3448 | 0.2577 |
| 0.1587 | 371.0 | 244118 | 0.1475 | 0.7215 | 0.3453 | 0.2577 |
| 0.1585 | 372.0 | 244776 | 0.1475 | 0.7211 | 0.3443 | 0.2577 |
| 0.1589 | 373.0 | 245434 | 0.1475 | 0.7218 | 0.3457 | 0.2577 |
| 0.1583 | 374.0 | 246092 | 0.1475 | 0.7216 | 0.3457 | 0.2577 |
| 0.1587 | 375.0 | 246750 | 0.1475 | 0.7216 | 0.3457 | 0.2581 |
| 0.1587 | 376.0 | 247408 | 0.1474 | 0.7214 | 0.3449 | 0.2569 |
| 0.1583 | 377.0 | 248066 | 0.1474 | 0.7214 | 0.3450 | 0.2573 |
| 0.1587 | 378.0 | 248724 | 0.1475 | 0.7215 | 0.3449 | 0.2581 |
| 0.1588 | 379.0 | 249382 | 0.1475 | 0.7211 | 0.3437 | 0.2581 |
| 0.1585 | 380.0 | 250040 | 0.1476 | 0.7210 | 0.3442 | 0.2592 |
| 0.1587 | 381.0 | 250698 | 0.1475 | 0.7218 | 0.3454 | 0.2592 |
| 0.1587 | 382.0 | 251356 | 0.1475 | 0.7214 | 0.3435 | 0.2581 |
| 0.159 | 383.0 | 252014 | 0.1474 | 0.7214 | 0.3445 | 0.2585 |
| 0.1586 | 384.0 | 252672 | 0.1474 | 0.7216 | 0.3458 | 0.2585 |
| 0.1584 | 385.0 | 253330 | 0.1474 | 0.7214 | 0.3447 | 0.2581 |
| 0.1586 | 386.0 | 253988 | 0.1474 | 0.7218 | 0.3455 | 0.2585 |
| 0.1589 | 387.0 | 254646 | 0.1474 | 0.7215 | 0.3459 | 0.2585 |
| 0.1585 | 388.0 | 255304 | 0.1474 | 0.7219 | 0.3460 | 0.2581 |
| 0.1582 | 389.0 | 255962 | 0.1474 | 0.7216 | 0.3457 | 0.2585 |
| 0.1586 | 390.0 | 256620 | 0.1475 | 0.7217 | 0.3459 | 0.2569 |
| 0.1581 | 391.0 | 257278 | 0.1473 | 0.7220 | 0.3461 | 0.2592 |
| 0.1586 | 392.0 | 257936 | 0.1474 | 0.7220 | 0.3468 | 0.2577 |
| 0.1585 | 393.0 | 258594 | 0.1474 | 0.7220 | 0.3462 | 0.2588 |
| 0.1584 | 394.0 | 259252 | 0.1474 | 0.7222 | 0.3475 | 0.2577 |
|
| 461 |
-
| 0.1584 | 395.0 | 259910 | 0.1474 | 0.7218 | 0.3479 | 0.2577 |
|
| 462 |
-
| 0.1584 | 396.0 | 260568 | 0.1474 | 0.7219 | 0.3474 | 0.2573 |
|
| 463 |
-
| 0.1582 | 397.0 | 261226 | 0.1473 | 0.7220 | 0.3460 | 0.2585 |
|
| 464 |
-
| 0.1585 | 398.0 | 261884 | 0.1473 | 0.7222 | 0.3472 | 0.2588 |
|
| 465 |
-
| 0.1588 | 399.0 | 262542 | 0.1473 | 0.7221 | 0.3472 | 0.2596 |
|
| 466 |
-
| 0.1584 | 400.0 | 263200 | 0.1474 | 0.7223 | 0.3459 | 0.2588 |
|
| 467 |
-
| 0.1578 | 401.0 | 263858 | 0.1474 | 0.7225 | 0.3472 | 0.2581 |
|
| 468 |
-
| 0.1581 | 402.0 | 264516 | 0.1473 | 0.7223 | 0.3479 | 0.2573 |
|
| 469 |
-
| 0.1588 | 403.0 | 265174 | 0.1474 | 0.7223 | 0.3494 | 0.2585 |
|
| 470 |
-
| 0.158 | 404.0 | 265832 | 0.1474 | 0.7218 | 0.3461 | 0.2581 |
|
| 471 |
-
| 0.1586 | 405.0 | 266490 | 0.1473 | 0.7217 | 0.3450 | 0.2581 |
|
| 472 |
-
| 0.1582 | 406.0 | 267148 | 0.1472 | 0.7219 | 0.3462 | 0.2588 |
|
| 473 |
-
| 0.1589 | 407.0 | 267806 | 0.1473 | 0.7227 | 0.3476 | 0.2588 |
|
| 474 |
-
| 0.1585 | 408.0 | 268464 | 0.1473 | 0.7220 | 0.3468 | 0.2596 |
|
| 475 |
-
| 0.1582 | 409.0 | 269122 | 0.1473 | 0.7221 | 0.3472 | 0.2577 |
|
| 476 |
-
| 0.1582 | 410.0 | 269780 | 0.1473 | 0.7222 | 0.3471 | 0.2600 |
|
| 477 |
-
| 0.1582 | 411.0 | 270438 | 0.1473 | 0.7226 | 0.3474 | 0.2592 |
|
| 478 |
-
| 0.1583 | 412.0 | 271096 | 0.1473 | 0.7224 | 0.3474 | 0.2585 |
|
| 479 |
-
| 0.158 | 413.0 | 271754 | 0.1472 | 0.7227 | 0.3480 | 0.2585 |
|
| 480 |
-
| 0.1582 | 414.0 | 272412 | 0.1472 | 0.7219 | 0.3461 | 0.2600 |
|
| 481 |
-
| 0.1583 | 415.0 | 273070 | 0.1472 | 0.7225 | 0.3468 | 0.2592 |
|
| 482 |
-
| 0.1582 | 416.0 | 273728 | 0.1473 | 0.7226 | 0.3472 | 0.2592 |
|
| 483 |
-
| 0.1585 | 417.0 | 274386 | 0.1472 | 0.7226 | 0.3468 | 0.2592 |
|
| 484 |
-
| 0.1586 | 418.0 | 275044 | 0.1472 | 0.7226 | 0.3465 | 0.2600 |
|
| 485 |
-
| 0.1583 | 419.0 | 275702 | 0.1472 | 0.7223 | 0.3463 | 0.2592 |
|
| 486 |
-
| 0.1584 | 420.0 | 276360 | 0.1472 | 0.7223 | 0.3463 | 0.2588 |
|
| 487 |
-
| 0.1585 | 421.0 | 277018 | 0.1472 | 0.7224 | 0.3466 | 0.2592 |
|
| 488 |
-
| 0.1585 | 422.0 | 277676 | 0.1472 | 0.7223 | 0.3473 | 0.2604 |
|
| 489 |
-
| 0.1585 | 423.0 | 278334 | 0.1473 | 0.7220 | 0.3473 | 0.2573 |
|
| 490 |
-
| 0.1584 | 424.0 | 278992 | 0.1472 | 0.7221 | 0.3470 | 0.2592 |
|
| 491 |
-
| 0.1586 | 425.0 | 279650 | 0.1471 | 0.7222 | 0.3472 | 0.2596 |
|
| 492 |
-
| 0.1584 | 426.0 | 280308 | 0.1472 | 0.7229 | 0.3470 | 0.2592 |
|
| 493 |
-
| 0.1579 | 427.0 | 280966 | 0.1471 | 0.7230 | 0.3485 | 0.2596 |
|
| 494 |
-
| 0.1581 | 428.0 | 281624 | 0.1472 | 0.7227 | 0.3479 | 0.2585 |
|
| 495 |
-
| 0.1587 | 429.0 | 282282 | 0.1472 | 0.7225 | 0.3479 | 0.2588 |
|
| 496 |
-
| 0.1581 | 430.0 | 282940 | 0.1472 | 0.7230 | 0.3480 | 0.2585 |
|
| 497 |
-
| 0.1582 | 431.0 | 283598 | 0.1471 | 0.7225 | 0.3476 | 0.2596 |
|
| 498 |
-
| 0.158 | 432.0 | 284256 | 0.1471 | 0.7227 | 0.3479 | 0.2588 |
|
| 499 |
-
| 0.1585 | 433.0 | 284914 | 0.1471 | 0.7222 | 0.3472 | 0.2592 |
|
| 500 |
-
| 0.1584 | 434.0 | 285572 | 0.1471 | 0.7225 | 0.3475 | 0.2585 |
|
| 501 |
-
| 0.1581 | 435.0 | 286230 | 0.1472 | 0.7232 | 0.3491 | 0.2581 |
|
| 502 |
-
| 0.158 | 436.0 | 286888 | 0.1471 | 0.7227 | 0.3493 | 0.2596 |
|
| 503 |
-
| 0.1586 | 437.0 | 287546 | 0.1471 | 0.7230 | 0.3490 | 0.2588 |
|
| 504 |
-
| 0.1582 | 438.0 | 288204 | 0.1470 | 0.7231 | 0.3487 | 0.2596 |
|
| 505 |
-
| 0.1584 | 439.0 | 288862 | 0.1471 | 0.7228 | 0.3482 | 0.2588 |
|
| 506 |
-
| 0.1586 | 440.0 | 289520 | 0.1471 | 0.7229 | 0.3485 | 0.2600 |
|
| 507 |
-
| 0.1583 | 441.0 | 290178 | 0.1471 | 0.7227 | 0.3482 | 0.2588 |
|
| 508 |
-
| 0.1579 | 442.0 | 290836 | 0.1471 | 0.7225 | 0.3473 | 0.2592 |
|
| 509 |
-
| 0.158 | 443.0 | 291494 | 0.1470 | 0.7227 | 0.3468 | 0.2585 |
|
| 510 |
-
| 0.1582 | 444.0 | 292152 | 0.1471 | 0.7231 | 0.3484 | 0.2588 |
|
| 511 |
-
| 0.1585 | 445.0 | 292810 | 0.1470 | 0.7225 | 0.3487 | 0.2592 |
|
| 512 |
-
| 0.1583 | 446.0 | 293468 | 0.1470 | 0.7228 | 0.3483 | 0.2592 |
|
| 513 |
-
| 0.1582 | 447.0 | 294126 | 0.1470 | 0.7229 | 0.3490 | 0.2588 |
|
| 514 |
-
| 0.1583 | 448.0 | 294784 | 0.1470 | 0.7230 | 0.3474 | 0.2596 |
|
| 515 |
-
| 0.1579 | 449.0 | 295442 | 0.1471 | 0.7228 | 0.3485 | 0.2592 |
|
| 516 |
-
| 0.1584 | 450.0 | 296100 | 0.1471 | 0.7231 | 0.3487 | 0.2588 |
|
| 517 |
-
| 0.1586 | 451.0 | 296758 | 0.1470 | 0.7229 | 0.3489 | 0.2600 |
|
| 518 |
-
| 0.1582 | 452.0 | 297416 | 0.1470 | 0.7229 | 0.3490 | 0.2592 |
|
| 519 |
-
| 0.1583 | 453.0 | 298074 | 0.1470 | 0.7228 | 0.3488 | 0.2592 |
|
| 520 |
-
| 0.1578 | 454.0 | 298732 | 0.1470 | 0.7229 | 0.3486 | 0.2592 |
|
| 521 |
-
| 0.1579 | 455.0 | 299390 | 0.1470 | 0.7229 | 0.3480 | 0.2588 |
|
| 522 |
-
| 0.1583 | 456.0 | 300048 | 0.1470 | 0.7226 | 0.3476 | 0.2600 |
|
| 523 |
-
| 0.1579 | 457.0 | 300706 | 0.1470 | 0.7228 | 0.3478 | 0.2592 |
|
| 524 |
-
| 0.1579 | 458.0 | 301364 | 0.1470 | 0.7227 | 0.3474 | 0.2592 |
|
| 525 |
-
| 0.1581 | 459.0 | 302022 | 0.1470 | 0.7223 | 0.3469 | 0.2600 |
|
| 526 |
-
| 0.1585 | 460.0 | 302680 | 0.1470 | 0.7228 | 0.3483 | 0.2592 |
|
| 527 |
-
| 0.1577 | 461.0 | 303338 | 0.1469 | 0.7230 | 0.3492 | 0.2592 |
|
| 528 |
-
| 0.1584 | 462.0 | 303996 | 0.1469 | 0.7228 | 0.3482 | 0.2596 |
|
| 529 |
-
| 0.158 | 463.0 | 304654 | 0.1470 | 0.7230 | 0.3487 | 0.2596 |
|
| 530 |
-
| 0.158 | 464.0 | 305312 | 0.1470 | 0.7232 | 0.3490 | 0.2592 |
|
| 531 |
-
| 0.1585 | 465.0 | 305970 | 0.1470 | 0.7226 | 0.3481 | 0.2585 |
|
| 532 |
-
| 0.158 | 466.0 | 306628 | 0.1470 | 0.7234 | 0.3489 | 0.2588 |
|
| 533 |
-
| 0.1578 | 467.0 | 307286 | 0.1470 | 0.7233 | 0.3503 | 0.2600 |
|
| 534 |
-
| 0.1586 | 468.0 | 307944 | 0.1470 | 0.7232 | 0.3484 | 0.2588 |
|
| 535 |
-
| 0.1581 | 469.0 | 308602 | 0.1470 | 0.7229 | 0.3479 | 0.2585 |
|
| 536 |
-
| 0.1581 | 470.0 | 309260 | 0.1469 | 0.7230 | 0.3490 | 0.2608 |
|
| 537 |
-
| 0.1581 | 471.0 | 309918 | 0.1470 | 0.7229 | 0.3483 | 0.2596 |
|
| 538 |
-
| 0.1584 | 472.0 | 310576 | 0.1470 | 0.7232 | 0.3493 | 0.2596 |
|
| 539 |
-
| 0.1586 | 473.0 | 311234 | 0.1469 | 0.7228 | 0.3476 | 0.2604 |
|
| 540 |
-
| 0.1579 | 474.0 | 311892 | 0.1469 | 0.7226 | 0.3478 | 0.2608 |
|
| 541 |
-
| 0.1581 | 475.0 | 312550 | 0.1469 | 0.7226 | 0.3479 | 0.2585 |
|
| 542 |
-
| 0.1579 | 476.0 | 313208 | 0.1469 | 0.7226 | 0.3484 | 0.2604 |
|
| 543 |
-
| 0.1583 | 477.0 | 313866 | 0.1469 | 0.7228 | 0.3477 | 0.2596 |
|
| 544 |
-
| 0.1579 | 478.0 | 314524 | 0.1469 | 0.7232 | 0.3496 | 0.2596 |
|
| 545 |
-
| 0.1579 | 479.0 | 315182 | 0.1468 | 0.7227 | 0.3485 | 0.2616 |
|
| 546 |
-
| 0.1578 | 480.0 | 315840 | 0.1469 | 0.7228 | 0.3480 | 0.2604 |
|
| 547 |
-
| 0.1578 | 481.0 | 316498 | 0.1469 | 0.7225 | 0.3483 | 0.2592 |
|
| 548 |
-
| 0.1581 | 482.0 | 317156 | 0.1469 | 0.7226 | 0.3480 | 0.2600 |
|
| 549 |
-
| 0.1581 | 483.0 | 317814 | 0.1469 | 0.7232 | 0.3489 | 0.2592 |
|
| 550 |
-
| 0.1583 | 484.0 | 318472 | 0.1469 | 0.7231 | 0.3494 | 0.2585 |
|
| 551 |
-
| 0.1582 | 485.0 | 319130 | 0.1469 | 0.7233 | 0.3497 | 0.2592 |
|
| 552 |
-
| 0.1577 | 486.0 | 319788 | 0.1469 | 0.7231 | 0.3486 | 0.2596 |
|
| 553 |
-
| 0.1583 | 487.0 | 320446 | 0.1469 | 0.7229 | 0.3482 | 0.2592 |
|
| 554 |
-
| 0.1581 | 488.0 | 321104 | 0.1468 | 0.7230 | 0.3487 | 0.2600 |
|
| 555 |
-
| 0.1578 | 489.0 | 321762 | 0.1469 | 0.7233 | 0.3494 | 0.2592 |
|
| 556 |
-
| 0.1577 | 490.0 | 322420 | 0.1469 | 0.7232 | 0.3496 | 0.2585 |
|
| 557 |
-
| 0.1581 | 491.0 | 323078 | 0.1469 | 0.7233 | 0.3501 | 0.2596 |
|
| 558 |
-
| 0.1583 | 492.0 | 323736 | 0.1468 | 0.7228 | 0.3491 | 0.2604 |
|
| 559 |
-
| 0.1581 | 493.0 | 324394 | 0.1469 | 0.7238 | 0.3503 | 0.2604 |
|
| 560 |
-
| 0.1581 | 494.0 | 325052 | 0.1468 | 0.7233 | 0.3497 | 0.2616 |
|
| 561 |
-
| 0.158 | 495.0 | 325710 | 0.1468 | 0.7236 | 0.3497 | 0.2604 |
|
| 562 |
-
| 0.158 | 496.0 | 326368 | 0.1468 | 0.7240 | 0.3501 | 0.2604 |
|
| 563 |
-
| 0.1581 | 497.0 | 327026 | 0.1468 | 0.7236 | 0.3499 | 0.2592 |
|
| 564 |
-
| 0.1579 | 498.0 | 327684 | 0.1468 | 0.7235 | 0.3496 | 0.2596 |
|
| 565 |
-
| 0.1578 | 499.0 | 328342 | 0.1468 | 0.7236 | 0.3496 | 0.2600 |
|
| 566 |
-
| 0.158 | 500.0 | 329000 | 0.1468 | 0.7232 | 0.3494 | 0.2612 |
|
| 567 |
-
| 0.1576 | 501.0 | 329658 | 0.1468 | 0.7231 | 0.3486 | 0.2600 |
|
| 568 |
-
| 0.1581 | 502.0 | 330316 | 0.1468 | 0.7243 | 0.3506 | 0.2620 |
|
| 569 |
-
| 0.1576 | 503.0 | 330974 | 0.1468 | 0.7234 | 0.3495 | 0.2608 |
|
| 570 |
-
| 0.1584 | 504.0 | 331632 | 0.1467 | 0.7231 | 0.3487 | 0.2600 |
|
| 571 |
-
| 0.158 | 505.0 | 332290 | 0.1468 | 0.7234 | 0.3485 | 0.2600 |
|
| 572 |
-
| 0.1577 | 506.0 | 332948 | 0.1468 | 0.7235 | 0.3491 | 0.2592 |
|
| 573 |
-
| 0.1578 | 507.0 | 333606 | 0.1467 | 0.7237 | 0.3496 | 0.2608 |
|
| 574 |
-
| 0.158 | 508.0 | 334264 | 0.1468 | 0.7230 | 0.3480 | 0.2600 |
|
| 575 |
-
| 0.1583 | 509.0 | 334922 | 0.1468 | 0.7238 | 0.3498 | 0.2604 |
|
| 576 |
-
| 0.1583 | 510.0 | 335580 | 0.1467 | 0.7234 | 0.3492 | 0.2604 |
|
| 577 |
-
| 0.1578 | 511.0 | 336238 | 0.1467 | 0.7234 | 0.3494 | 0.2588 |
|
| 578 |
-
| 0.1583 | 512.0 | 336896 | 0.1468 | 0.7236 | 0.3499 | 0.2596 |
|
| 579 |
-
| 0.1579 | 513.0 | 337554 | 0.1467 | 0.7233 | 0.3485 | 0.2596 |
|
| 580 |
-
| 0.1579 | 514.0 | 338212 | 0.1468 | 0.7241 | 0.3500 | 0.2612 |
|
| 581 |
-
| 0.1579 | 515.0 | 338870 | 0.1468 | 0.7236 | 0.3496 | 0.2612 |
|
| 582 |
-
| 0.1579 | 516.0 | 339528 | 0.1467 | 0.7234 | 0.3486 | 0.2608 |
|
| 583 |
-
| 0.158 | 517.0 | 340186 | 0.1468 | 0.7239 | 0.3504 | 0.2604 |
|
| 584 |
-
| 0.1579 | 518.0 | 340844 | 0.1468 | 0.7241 | 0.3507 | 0.2604 |
|
| 585 |
-
| 0.1577 | 519.0 | 341502 | 0.1467 | 0.7236 | 0.3496 | 0.2604 |
|
| 586 |
-
| 0.1578 | 520.0 | 342160 | 0.1468 | 0.7234 | 0.3495 | 0.2604 |
|
| 587 |
-
| 0.1578 | 521.0 | 342818 | 0.1468 | 0.7239 | 0.3500 | 0.2612 |
|
| 588 |
-
| 0.1577 | 522.0 | 343476 | 0.1467 | 0.7241 | 0.3507 | 0.2600 |
|
| 589 |
-
| 0.1579 | 523.0 | 344134 | 0.1467 | 0.7232 | 0.3495 | 0.2585 |
|
| 590 |
-
| 0.1583 | 524.0 | 344792 | 0.1467 | 0.7237 | 0.3495 | 0.2608 |
|
| 591 |
-
| 0.158 | 525.0 | 345450 | 0.1467 | 0.7234 | 0.3491 | 0.2604 |
|
| 592 |
-
| 0.1583 | 526.0 | 346108 | 0.1467 | 0.7234 | 0.3494 | 0.2612 |
|
| 593 |
-
| 0.1583 | 527.0 | 346766 | 0.1467 | 0.7235 | 0.3497 | 0.2596 |
|
| 594 |
-
| 0.158 | 528.0 | 347424 | 0.1467 | 0.7236 | 0.3497 | 0.2604 |
|
| 595 |
-
| 0.158 | 529.0 | 348082 | 0.1467 | 0.7235 | 0.3501 | 0.2596 |
|
| 596 |
-
| 0.1579 | 530.0 | 348740 | 0.1467 | 0.7237 | 0.3500 | 0.2592 |
|
| 597 |
-
| 0.1582 | 531.0 | 349398 | 0.1467 | 0.7236 | 0.3494 | 0.2608 |
|
| 598 |
-
| 0.1581 | 532.0 | 350056 | 0.1468 | 0.7237 | 0.3503 | 0.2604 |
|
| 599 |
-
| 0.158 | 533.0 | 350714 | 0.1467 | 0.7239 | 0.3500 | 0.2604 |
|
| 600 |
-
| 0.1578 | 534.0 | 351372 | 0.1468 | 0.7238 | 0.3499 | 0.2596 |
|
| 601 |
-
| 0.1581 | 535.0 | 352030 | 0.1467 | 0.7234 | 0.3491 | 0.2592 |
|
| 602 |
-
| 0.158 | 536.0 | 352688 | 0.1467 | 0.7235 | 0.3495 | 0.2581 |
|
| 603 |
-
| 0.1576 | 537.0 | 353346 | 0.1467 | 0.7237 | 0.3502 | 0.2592 |
|
| 604 |
-
| 0.1579 | 538.0 | 354004 | 0.1466 | 0.7232 | 0.3491 | 0.2588 |
|
| 605 |
-
| 0.1581 | 539.0 | 354662 | 0.1467 | 0.7235 | 0.3491 | 0.2596 |
|
| 606 |
-
| 0.1578 | 540.0 | 355320 | 0.1467 | 0.7236 | 0.3499 | 0.2585 |
|
| 607 |
-
| 0.1579 | 541.0 | 355978 | 0.1467 | 0.7240 | 0.3505 | 0.2588 |
|
| 608 |
-
| 0.1582 | 542.0 | 356636 | 0.1467 | 0.7237 | 0.3501 | 0.2592 |
|
| 609 |
-
| 0.158 | 543.0 | 357294 | 0.1467 | 0.7238 | 0.3501 | 0.2592 |
|
| 610 |
-
| 0.1579 | 544.0 | 357952 | 0.1467 | 0.7243 | 0.3507 | 0.2604 |
|
| 611 |
-
| 0.1582 | 545.0 | 358610 | 0.1466 | 0.7239 | 0.3500 | 0.2600 |
|
| 612 |
-
| 0.1581 | 546.0 | 359268 | 0.1467 | 0.7238 | 0.3499 | 0.2596 |
|
| 613 |
-
| 0.1583 | 547.0 | 359926 | 0.1466 | 0.7235 | 0.3496 | 0.2604 |
|
| 614 |
-
| 0.158 | 548.0 | 360584 | 0.1466 | 0.7237 | 0.3501 | 0.2596 |
|
| 615 |
-
| 0.1582 | 549.0 | 361242 | 0.1467 | 0.7239 | 0.3507 | 0.2608 |
|
| 616 |
-
| 0.1577 | 550.0 | 361900 | 0.1467 | 0.7239 | 0.3503 | 0.2600 |
|
| 617 |
-
| 0.1573 | 551.0 | 362558 | 0.1466 | 0.7236 | 0.3499 | 0.2604 |
|
| 618 |
-
| 0.158 | 552.0 | 363216 | 0.1467 | 0.7239 | 0.3506 | 0.2608 |
|
| 619 |
-
| 0.158 | 553.0 | 363874 | 0.1467 | 0.7242 | 0.3505 | 0.2612 |
|
| 620 |
-
| 0.1577 | 554.0 | 364532 | 0.1466 | 0.7240 | 0.3502 | 0.2608 |
|
| 621 |
-
| 0.1577 | 555.0 | 365190 | 0.1466 | 0.7237 | 0.3499 | 0.2596 |
|
| 622 |
-
| 0.1573 | 556.0 | 365848 | 0.1466 | 0.7236 | 0.3503 | 0.2592 |
|
| 623 |
-
| 0.1577 | 557.0 | 366506 | 0.1467 | 0.7236 | 0.3499 | 0.2588 |
|
| 624 |
-
| 0.158 | 558.0 | 367164 | 0.1466 | 0.7235 | 0.3502 | 0.2585 |
|
| 625 |
-
| 0.1576 | 559.0 | 367822 | 0.1466 | 0.7238 | 0.3504 | 0.2596 |
|
| 626 |
-
| 0.1579 | 560.0 | 368480 | 0.1466 | 0.7235 | 0.3500 | 0.2596 |
|
| 627 |
-
| 0.1574 | 561.0 | 369138 | 0.1466 | 0.7236 | 0.3497 | 0.2600 |
|
| 628 |
-
| 0.1576 | 562.0 | 369796 | 0.1467 | 0.7241 | 0.3506 | 0.2608 |
|
| 629 |
-
| 0.1579 | 563.0 | 370454 | 0.1466 | 0.7241 | 0.3507 | 0.2608 |
|
| 630 |
-
| 0.1576 | 564.0 | 371112 | 0.1466 | 0.7237 | 0.3498 | 0.2604 |
|
| 631 |
-
| 0.1579 | 565.0 | 371770 | 0.1466 | 0.7243 | 0.3507 | 0.2608 |
|
| 632 |
-
| 0.1579 | 566.0 | 372428 | 0.1466 | 0.7238 | 0.3504 | 0.2604 |
|
| 633 |
-
| 0.1577 | 567.0 | 373086 | 0.1467 | 0.7239 | 0.3501 | 0.2592 |
|
| 634 |
-
| 0.1579 | 568.0 | 373744 | 0.1467 | 0.7243 | 0.3507 | 0.2600 |
|
| 635 |
-
| 0.1577 | 569.0 | 374402 | 0.1466 | 0.7238 | 0.3502 | 0.2604 |
|
| 636 |
-
| 0.1575 | 570.0 | 375060 | 0.1466 | 0.7240 | 0.3503 | 0.2612 |
|
| 637 |
-
| 0.1581 | 571.0 | 375718 | 0.1466 | 0.7238 | 0.3505 | 0.2592 |
|
| 638 |
-
| 0.1578 | 572.0 | 376376 | 0.1466 | 0.7238 | 0.3502 | 0.2612 |
|
| 639 |
-
| 0.1583 | 573.0 | 377034 | 0.1466 | 0.7240 | 0.3501 | 0.2600 |
|
| 640 |
-
| 0.1576 | 574.0 | 377692 | 0.1466 | 0.7238 | 0.3500 | 0.2592 |
|
| 641 |
-
| 0.1573 | 575.0 | 378350 | 0.1466 | 0.7239 | 0.3499 | 0.2608 |
|
| 642 |
-
| 0.1573 | 576.0 | 379008 | 0.1465 | 0.7238 | 0.3498 | 0.2600 |
|
| 643 |
-
| 0.1573 | 577.0 | 379666 | 0.1466 | 0.7239 | 0.3500 | 0.2608 |
|
| 644 |
-
| 0.158 | 578.0 | 380324 | 0.1466 | 0.7239 | 0.3502 | 0.2592 |
|
| 645 |
-
| 0.1578 | 579.0 | 380982 | 0.1466 | 0.7238 | 0.3498 | 0.2604 |
|
| 646 |
-
| 0.1581 | 580.0 | 381640 | 0.1466 | 0.7237 | 0.3498 | 0.2600 |
|
| 647 |
-
| 0.158 | 581.0 | 382298 | 0.1466 | 0.7238 | 0.3500 | 0.2600 |
|
| 648 |
-
| 0.1573 | 582.0 | 382956 | 0.1465 | 0.7238 | 0.3503 | 0.2604 |
|
| 649 |
-
| 0.1586 | 583.0 | 383614 | 0.1465 | 0.7238 | 0.3502 | 0.2600 |
|
| 650 |
-
| 0.1577 | 584.0 | 384272 | 0.1465 | 0.7239 | 0.3505 | 0.2596 |
|
| 651 |
-
| 0.1574 | 585.0 | 384930 | 0.1466 | 0.7239 | 0.3506 | 0.2596 |
|
| 652 |
-
| 0.1577 | 586.0 | 385588 | 0.1465 | 0.7238 | 0.3501 | 0.2604 |
|
| 653 |
-
| 0.1574 | 587.0 | 386246 | 0.1465 | 0.7241 | 0.3505 | 0.2604 |
|
| 654 |
-
| 0.1578 | 588.0 | 386904 | 0.1465 | 0.7237 | 0.3495 | 0.2596 |
|
| 655 |
-
| 0.158 | 589.0 | 387562 | 0.1466 | 0.7240 | 0.3502 | 0.2600 |
|
| 656 |
-
| 0.158 | 590.0 | 388220 | 0.1466 | 0.7241 | 0.3508 | 0.2604 |
|
| 657 |
-
| 0.1574 | 591.0 | 388878 | 0.1466 | 0.7243 | 0.3510 | 0.2596 |
|
| 658 |
-
| 0.1576 | 592.0 | 389536 | 0.1465 | 0.7234 | 0.3496 | 0.2596 |
|
| 659 |
-
| 0.1582 | 593.0 | 390194 | 0.1465 | 0.7239 | 0.3504 | 0.2592 |
|
| 660 |
-
| 0.158 | 594.0 | 390852 | 0.1465 | 0.7240 | 0.3505 | 0.2588 |
|
| 661 |
### Framework versions
---
library_name: transformers
base_model: cardiffnlp/twitter-xlm-roberta-base-sentiment
tags:
- text-classification
- multi-label-classification
- multi-head-classification
- disaster-response
- humanitarian-aid
- social-media
- twitter
- generated_from_trainer
model-index:
- name: xlm-roberta-sentiment-requests
  results:
  - task:
      type: text-classification
    dataset:
      name: community-datasets/disaster_response_messages
      type: community-datasets
      config: default
      split: evaluation
    metrics:
    - name: F1 Micro
      type: f1
      value: 0.7240
    - name: F1 Macro
      type: f1
      value: 0.3505
    - name: Subset Accuracy
      type: accuracy
      value: 0.2588
datasets:
- community-datasets/disaster_response_messages
pipeline_tag: text-classification
language:
- en
- multilingual
---

<!-- This model card has been generated automatically and then completed by a human. -->

# xlm-roberta-sentiment-requests

This model is a fine-tuned version of [cardiffnlp/twitter-xlm-roberta-base-sentiment](https://huggingface.co/cardiffnlp/twitter-xlm-roberta-base-sentiment) on the `community-datasets/disaster_response_messages` dataset. It has been adapted into a **multi-head classification model** designed to analyze social media messages during disaster events.

It achieves the following results on the evaluation set:
- Loss: 0.1465
- F1 Micro: 0.7240
- F1 Macro: 0.3505
- Subset Accuracy: 0.2588
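
The gap between the micro- and macro-averaged F1 scores used throughout this card reflects label imbalance: micro-averaging pools every label decision, while macro-averaging weights each label equally, so labels the model rarely gets right drag the macro score down. A toy scikit-learn example (illustrative data only, not this model's evaluation set) shows the difference:

```python
# Toy multi-label example: label 0 is always predicted correctly,
# labels 1 and 2 are never predicted.
import numpy as np
from sklearn.metrics import f1_score

y_true = np.array([[1, 0, 1], [1, 0, 0], [1, 1, 0]])  # 3 samples, 3 labels
y_pred = np.array([[1, 0, 0], [1, 0, 0], [1, 0, 0]])  # only label 0 predicted

print(f1_score(y_true, y_pred, average="micro"))  # 0.75
print(f1_score(y_true, y_pred, average="macro"))  # ~0.33
```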

## Model description

This model uses a shared `XLM-RoBERTa` base to encode input text. The resulting text representation is then fed into two separate, independent classification layers (heads):

* A **Sentiment Head** with 3 outputs for the `positive`, `neutral`, and `negative` classes.
* A **Multi-Label Head** with 41 outputs, which are decoded to predict the presence or absence of 37 different disaster-related categories.

This dual-head architecture allows for a nuanced understanding of a message, capturing both its emotional content and its specific, actionable information.
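
The shared-encoder, two-head idea can be sketched in a few lines of plain PyTorch (the hidden size and random tensor here are stand-ins; the real model uses the XLM-RoBERTa encoder and its `[CLS]` representation, as shown in the full script below):

```python
import torch
from torch import nn

hidden_size = 8                         # stand-in for the encoder's hidden size
encoded = torch.randn(2, hidden_size)   # stand-in [CLS] vectors for 2 messages

sentiment_head = nn.Linear(hidden_size, 3)    # negative / neutral / positive
multilabel_head = nn.Linear(hidden_size, 41)  # disaster-category outputs

# Both heads read the same encoded representation independently.
print(sentiment_head(encoded).shape)   # torch.Size([2, 3])
print(multilabel_head(encoded).shape)  # torch.Size([2, 41])
```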

## Intended uses & limitations

This model is intended for organizations and researchers involved in humanitarian aid and disaster response. Potential applications include:

* **Automated Triage**: Quickly sorting through thousands of social media messages to identify the most urgent requests for help.
* **Situational Awareness**: Building a real-time map of needs by aggregating categorized messages.
* **Resource Allocation**: Directing resources more effectively by understanding the specific types of aid being requested.

**Important**: Due to its custom architecture, this model **cannot** be used with the standard `pipeline("text-classification")` function. Please see the usage code below for the correct implementation.
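
As a hypothetical sketch of the automated-triage use case: given a predictions dictionary in the format this card's inference script returns, keep only the categories predicted `'yes'` with high confidence. The `urgent_categories` helper, the 0.9 threshold, and the `sample` dict are illustrative, not part of the model:

```python
from typing import Any, Dict, List

def urgent_categories(predictions: Dict[str, Dict[str, Any]],
                      min_confidence: float = 0.9) -> List[str]:
    """Return category names predicted 'yes' with confidence >= min_confidence."""
    return sorted(
        name for name, result in predictions.items()
        if result.get("prediction") == "yes"
        and result.get("confidence", 0.0) >= min_confidence
    )

# Hand-written example in the same shape as the script's output:
sample = {
    "sentiment": {"prediction": "negative", "confidence": 0.99},
    "request":   {"prediction": "yes", "confidence": 0.99},
    "water":     {"prediction": "yes", "confidence": 0.85},
    "offer":     {"prediction": "no",  "confidence": 0.99},
}
print(urgent_categories(sample))  # ['request']
```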

### How to Use

This model requires custom code to handle its two-headed output. The following is a complete, self-contained Python script that loads the model and runs inference. You will need `transformers`, `torch`, and `safetensors` installed (`pip install transformers torch safetensors`).

The script is broken into logical blocks:

1. **Model Architecture**: A Python class that defines the model's structure. This blueprint is required to load the saved weights.
2. **Label Definitions**: A "decoder ring" of functions that translate the model's numerical outputs into human-readable labels.
3. **Setup & Loading**: A function that handles all the one-time setup.
4. **Prediction Function**: The core logic that takes text and produces a dictionary of predictions.
5. **Main Execution**: An example of how to run the script.

Copy blocks 1 through 5 below, in order, into a single file to run the entire inference pipeline.

***

1. **Model Architecture**: We define the necessary imports and the model architecture.

```python
import torch
from torch import nn
from transformers import AutoTokenizer, AutoConfig, AutoModel, PreTrainedModel
from huggingface_hub import hf_hub_download
from typing import Dict, Any
from safetensors.torch import load_file

class MultiHeadClassificationModel(PreTrainedModel):
    def __init__(self, config, **kwargs):
        super().__init__(config)
        num_multilabels = kwargs.get("num_multilabels")
        if num_multilabels is None:
            raise ValueError("`num_multilabels` must be provided to initialize the model.")
        # Shared encoder backbone
        self.backbone = AutoModel.from_config(config)
        # Two independent heads over the [CLS] representation
        self.sentiment_classifier = nn.Linear(config.hidden_size, config.num_sentiment_labels)
        self.multilabel_classifier = nn.Linear(config.hidden_size, num_multilabels)
        self.init_weights()

    def forward(self, input_ids=None, attention_mask=None, **kwargs):
        outputs = self.backbone(input_ids, attention_mask=attention_mask, **kwargs)
        cls_token_output = outputs.last_hidden_state[:, 0, :]
        sentiment_logits = self.sentiment_classifier(cls_token_output)
        multilabel_logits = self.multilabel_classifier(cls_token_output)
        return {"sentiment_logits": sentiment_logits, "multilabel_logits": multilabel_logits}
```

***

2. **Label Definitions**: We embed the label definitions, which are essential for interpreting the model's output.

```python
def get_all_labels() -> Dict[str, Dict[int, str]]:
    return {
        'sentiment': get_sentiment_labels(), 'genre': get_genre_labels(), 'related': get_related_labels(),
        'request': get_request_labels(), 'offer': get_offer_labels(), 'aid_related': get_aid_related_labels(),
        'medical_help': get_medical_help_labels(), 'medical_products': get_medical_products_labels(),
        'search_and_rescue': get_search_and_rescue_labels(), 'security': get_security_labels(),
        'military': get_military_labels(), 'child_alone': get_child_alone_labels(), 'water': get_water_labels(),
        'food': get_food_labels(), 'shelter': get_shelter_labels(), 'clothing': get_clothing_labels(),
        'money': get_money_labels(), 'missing_people': get_missing_people_labels(),
        'refugees': get_refugees_labels(), 'death': get_death_labels(), 'other_aid': get_other_aid_labels(),
        'infrastructure_related': get_infrastructure_related_labels(), 'transport': get_transport_labels(),
        'buildings': get_buildings_labels(), 'electricity': get_electricity_labels(), 'tools': get_tools_labels(),
        'hospitals': get_hospitals_labels(), 'shops': get_shops_labels(), 'aid_centers': get_aid_centers_labels(),
        'other_infrastructure': get_other_infrastructure_labels(), 'weather_related': get_weather_related_labels(),
        'floods': get_floods_labels(), 'storm': get_storm_labels(), 'fire': get_fire_labels(),
        'earthquake': get_earthquake_labels(), 'cold': get_cold_labels(), 'other_weather': get_other_weather_labels(),
        'direct_report': get_direct_report_labels(),
    }

def get_genre_labels() -> Dict[int, str]: return {0: 'direct', 1: 'news', 2: 'social'}
def get_related_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes', 2: 'maybe'}
def get_request_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_offer_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_aid_related_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_medical_help_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_medical_products_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_search_and_rescue_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_security_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_military_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_child_alone_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_water_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_food_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_shelter_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_clothing_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_money_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_missing_people_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_refugees_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_death_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_other_aid_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_infrastructure_related_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_transport_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_buildings_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_electricity_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_tools_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_hospitals_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_shops_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_aid_centers_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_other_infrastructure_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_weather_related_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_floods_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_storm_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_fire_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_earthquake_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_cold_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_other_weather_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_direct_report_labels() -> Dict[int, str]: return {0: 'no', 1: 'yes'}
def get_sentiment_labels() -> Dict[int, str]: return {0: 'negative', 1: 'neutral', 2: 'positive'}
```

***

3. **Setup & Loading**: This setup function handles loading all components and reconstructing the necessary metadata.

```python
def load_essentials():
    hub_repo_id = "spencercdz/xlm-roberta-sentiment-requests"
    subfolder = "final_model"
    device = "cuda" if torch.cuda.is_available() else "cpu"

    # Rebuild the multilabel column layout from the label definitions.
    all_labels_map = get_all_labels()
    binary_tasks = [k for k, v in all_labels_map.items() if len(v) == 2 and k not in ['related', 'sentiment']]
    multiclass_tasks = {k: len(v) for k, v in all_labels_map.items() if len(v) > 2}

    column_names = [f"{t}_{i}" for t, n in multiclass_tasks.items() for i in range(n)] + binary_tasks
    multilabel_column_names = sorted(column_names)
    num_multilabels = len(multilabel_column_names)
    num_sentiment_labels = len(get_sentiment_labels())

    tokenizer = AutoTokenizer.from_pretrained(hub_repo_id, subfolder=subfolder)
    config = AutoConfig.from_pretrained(hub_repo_id, subfolder=subfolder)
    config.num_sentiment_labels = num_sentiment_labels

    # Instantiate the custom architecture, then load the saved weights into it.
    model_shell = MultiHeadClassificationModel(config=config, num_multilabels=num_multilabels)
    weights_path = hf_hub_download(repo_id=hub_repo_id, filename="model.safetensors", subfolder=subfolder)
    state_dict = load_file(weights_path, device=device)
    model_shell.load_state_dict(state_dict, strict=False)
    model = model_shell.to(device)
    model.eval()

    metadata = {
        "binary_tasks": binary_tasks, "multiclass_tasks": multiclass_tasks,
        "multilabel_column_names": multilabel_column_names,
        "all_labels": all_labels_map, "device": device
    }
    return model, tokenizer, metadata
```
| 205 |
+
***
|
| 206 |
+
4. **Prediction Function**: The prediction function takes the loaded components and input text to produce a decoded dictionary.
|
| 207 |
+
```python
|
| 208 |
+
def predict(text: str, model, tokenizer, metadata: Dict) -> Dict[str, Any]:
|
| 209 |
+
inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True, max_length=512).to(metadata['device'])
|
| 210 |
+
with torch.no_grad():
|
| 211 |
+
outputs = model(**inputs)
|
| 212 |
+
|
| 213 |
+
sentiment_probs = torch.softmax(outputs['sentiment_logits'], dim=-1).cpu().numpy()
|
| 214 |
+
multilabel_probs = torch.sigmoid(outputs['multilabel_logits']).cpu().numpy()
|
| 215 |
+
|
| 216 |
+
results = {}
|
| 217 |
+
sentiment_decoder = metadata['all_labels']['sentiment']
|
| 218 |
+
sentiment_pred_idx = sentiment_probs.argmax()
|
| 219 |
+
results['sentiment'] = {'prediction': sentiment_decoder.get(sentiment_pred_idx, "unknown"), 'confidence': sentiment_probs[0, sentiment_pred_idx].item()}
|
| 220 |
+
|
| 221 |
+
for task_name in metadata['binary_tasks']:
|
| 222 |
+
idx = metadata['multilabel_column_names'].index(task_name)
|
| 223 |
+
prob = multilabel_probs[0, idx]
|
| 224 |
+
pred = 1 if prob > 0.5 else 0
|
| 225 |
+
results[task_name] = {'prediction': metadata['all_labels'][task_name][pred], 'confidence': (prob if pred == 1 else 1 - prob).item()}
|
| 226 |
+
|
| 227 |
+
for task_name, num_classes in metadata['multiclass_tasks'].items():
|
| 228 |
+
start_idx = metadata['multilabel_column_names'].index(f"{task_name}_0")
|
| 229 |
+
task_probs = multilabel_probs[0, start_idx : start_idx + num_classes]
|
| 230 |
+
pred_idx = task_probs.argmax()
|
| 231 |
+
results[task_name] = {'prediction': metadata['all_labels'][task_name].get(pred_idx, "unknown"), 'confidence': task_probs[pred_idx].item()}
|
| 232 |
+
|
| 233 |
+
return results
|
| 234 |
+
```
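The column-slicing scheme for multi-class tasks can be made concrete with a toy example. The probabilities and column layout below are invented for illustration; only the `genre` label names come from the dataset.

```python
# Multi-class tasks occupy a contiguous run of sigmoid columns named
# "<task>_0" ... "<task>_{n-1}"; the prediction is the argmax of that slice.
multilabel_column_names = ["request", "offer", "genre_0", "genre_1", "genre_2"]
probs = [0.91, 0.02, 0.10, 0.85, 0.05]  # one row of sigmoid outputs (made up)

task, num_classes = "genre", 3
start = multilabel_column_names.index(f"{task}_0")  # column 2
task_probs = probs[start : start + num_classes]     # [0.10, 0.85, 0.05]
pred_idx = max(range(num_classes), key=lambda i: task_probs[i])

genre_labels = {0: "direct", 1: "news", 2: "social"}
print(genre_labels[pred_idx])  # prints: news
```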

***

5. **Main Execution**: The main block ties everything together: load the components, run a prediction, and print the raw dictionary output.

```python
if __name__ == "__main__":
    model, tokenizer, metadata = load_essentials()
    input_text = "I need food, water, and shelter. Help me! People are dying. We need more items."

    print(f"\n--- Predicting for Input ---\n\"{input_text}\"")

    predictions = predict(input_text, model, tokenizer, metadata)

    # Print the raw dictionary output
    print("\n--- RAW DICTIONARY OUTPUT ---")
    print(predictions)
```

### Sample Output

```
{'sentiment': {'prediction': 'negative', 'confidence': 0.999014139175415}, 'request': {'prediction': 'yes', 'confidence': 0.9999805688858032}, 'offer': {'prediction': 'no', 'confidence': 0.9995545148849487}, 'aid_related': {'prediction': 'yes', 'confidence': 0.9995179176330566}, 'medical_help': {'prediction': 'no', 'confidence': 0.9931818246841431}, 'medical_products': {'prediction': 'no', 'confidence': 0.9975765943527222}, 'search_and_rescue': {'prediction': 'no', 'confidence': 0.9981554746627808}, 'security': {'prediction': 'no', 'confidence': 0.999071478843689}, 'military': {'prediction': 'no', 'confidence': 0.9981452226638794}, 'child_alone': {'prediction': 'no', 'confidence': 0.9998688697814941}, 'water': {'prediction': 'yes', 'confidence': 0.9991873502731323}, 'food': {'prediction': 'yes', 'confidence': 0.9998394250869751}, 'shelter': {'prediction': 'yes', 'confidence': 0.9997198581695557}, 'clothing': {'prediction': 'no', 'confidence': 0.9982467889785767}, 'money': {'prediction': 'no', 'confidence': 0.9985392093658447}, 'missing_people': {'prediction': 'no', 'confidence': 0.998404324054718}, 'refugees': {'prediction': 'no', 'confidence': 0.9981242418289185}, 'death': {'prediction': 'yes', 'confidence': 0.9850122332572937}, 'other_aid': {'prediction': 'no', 'confidence': 0.9654157757759094}, 'infrastructure_related': {'prediction': 'no', 'confidence': 0.984534740447998}, 'transport': {'prediction': 'no', 'confidence': 0.9972304105758667}, 'buildings': {'prediction': 'no', 'confidence': 0.9881182312965393}, 'electricity': {'prediction': 'no', 'confidence': 0.9988776445388794}, 'tools': {'prediction': 'no', 'confidence': 0.9995874166488647}, 'hospitals': {'prediction': 'no', 'confidence': 0.999099850654602}, 'shops': {'prediction': 'no', 'confidence': 0.9996023178100586}, 'aid_centers': {'prediction': 'no', 'confidence': 0.9981774091720581}, 'other_infrastructure': {'prediction': 'no', 'confidence': 0.9968826770782471}, 'weather_related': {'prediction': 'no', 'confidence': 0.9632836580276489}, 'floods': {'prediction': 'no', 'confidence': 0.9960920810699463}, 'storm': {'prediction': 'no', 'confidence': 0.9963870048522949}, 'fire': {'prediction': 'no', 'confidence': 0.9993714094161987}, 'earthquake': {'prediction': 'no', 'confidence': 0.99778151512146}, 'cold': {'prediction': 'no', 'confidence': 0.9991660118103027}, 'other_weather': {'prediction': 'no', 'confidence': 0.9974269866943359}, 'direct_report': {'prediction': 'yes', 'confidence': 0.9763266444206238}, 'genre': {'prediction': 'direct', 'confidence': 0.9912198185920715}, 'related': {'prediction': 'yes', 'confidence': 0.9997092485427856}}
```
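Downstream code will usually not want the full dictionary. Here is a small helper sketch that keeps only the confidently active categories; the function name and threshold are arbitrary choices, not part of the model.

```python
# Excerpt of the raw prediction dictionary from the sample output above.
predictions = {
    "request": {"prediction": "yes", "confidence": 0.9999805688858032},
    "offer": {"prediction": "no", "confidence": 0.9995545148849487},
    "water": {"prediction": "yes", "confidence": 0.9991873502731323},
    "direct_report": {"prediction": "yes", "confidence": 0.9763266444206238},
}

def active_categories(preds, threshold=0.98):
    """Return category names predicted 'yes' at or above the confidence threshold."""
    return sorted(
        name
        for name, result in preds.items()
        if result["prediction"] == "yes" and result["confidence"] >= threshold
    )

print(active_categories(predictions))  # prints: ['request', 'water']
```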

## Training and evaluation data

This model was fine-tuned on the `community-datasets/disaster_response_messages` dataset, which contains over 26,000 messages from real disaster events. Each message is labeled with 37 different categories, such as `aid_related` and `weather_related`, as well as the message `genre` (direct, news, social). The `sentiment` labels were added programmatically for the purpose of this multi-task training.

The dataset was split into:

* Training set: ~21,000 samples
* Validation set: ~2,600 samples
* Test set: ~2,600 samples

## Training procedure

The model was trained using the `transformers.Trainer` with a custom `MultiHeadClassificationModel` architecture. The training process optimized a combined loss from both the sentiment and multi-label classification heads, and the best model was selected based on the `F1 Micro` score on the validation set.
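The combined objective can be sketched roughly as follows. The head names match the prediction script above, but the equal weighting of the two losses is an assumption for illustration, not something stated in the training code.

```python
import torch
import torch.nn as nn

# Sketch of a combined multi-task loss: cross-entropy for the 3-class
# sentiment head plus binary cross-entropy over the 37 sigmoid outputs.
batch_size, num_multilabel = 4, 37
sentiment_logits = torch.randn(batch_size, 3, requires_grad=True)
multilabel_logits = torch.randn(batch_size, num_multilabel, requires_grad=True)
sentiment_targets = torch.randint(0, 3, (batch_size,))
multilabel_targets = torch.randint(0, 2, (batch_size, num_multilabel)).float()

loss = (
    nn.CrossEntropyLoss()(sentiment_logits, sentiment_targets)
    + nn.BCEWithLogitsLoss()(multilabel_logits, multilabel_targets)
)
loss.backward()  # gradients flow into both heads
```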

### Training hyperparameters

The following hyperparameters were used during training:
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 1000 (early stopping with a patience of 50 epochs)
- mixed_precision_training: Native AMP

### Training results

The final results on the evaluation set are based on the best checkpoint at epoch 594. A truncated history of the 50 most important rows is shown below; for the full data, please refer to [training_log.csv](https://huggingface.co/spencercdz/xlm-roberta-sentiment-requests/blob/main/training_log.csv) in the repository.
| Training Loss | Epoch | Step   | Validation Loss | F1 Micro | F1 Macro | Subset Accuracy |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:--------:|:---------------:|
| 0.4267        | 1.0   | 658    | 0.2727          | 0.4953   | 0.0722   | 0.1053          |
| 0.2662        | 2.0   | 1316   | 0.2291          | 0.5446   | 0.0906   | 0.1123          |
| 0.2366        | 3.0   | 1974   | 0.2143          | 0.5682   | 0.1031   | 0.1279          |
| 0.2234        | 4.0   | 2632   | 0.2058          | 0.5878   | 0.1160   | 0.1333          |
| 0.2156        | 5.0   | 3290   | 0.1997          | 0.6022   | 0.1255   | 0.1380          |
| ...           | ...   | ...    | ...             | ...      | ...      | ...             |
| 0.1773        | 25.0  | 16450  | 0.1670          | 0.6714   | 0.2305   | 0.1955          |
| 0.1694        | 50.0  | 32900  | 0.1592          | 0.6911   | 0.2701   | 0.2223          |
| 0.1662        | 75.0  | 49350  | 0.1558          | 0.7018   | 0.2960   | 0.2309          |
| 0.164         | 100.0 | 65800  | 0.1537          | 0.7077   | 0.3098   | 0.2425          |
| 0.1627        | 125.0 | 82250  | 0.1522          | 0.7104   | 0.3184   | 0.2449          |
| 0.1617        | 150.0 | 98700  | 0.1513          | 0.7130   | 0.3243   | 0.2449          |
| 0.1612        | 175.0 | 115150 | 0.1504          | 0.7143   | 0.3285   | 0.2499          |
| 0.1606        | 200.0 | 131600 | 0.1498          | 0.7161   | 0.3314   | 0.2515          |
| 0.16          | 250.0 | 164500 | 0.1488          | 0.7183   | 0.3383   | 0.2538          |
| 0.1592        | 300.0 | 197400 | 0.1482          | 0.7204   | 0.3423   | 0.2534          |
| 0.1589        | 350.0 | 230300 | 0.1476          | 0.7214   | 0.3450   | 0.2581          |
| 0.1584        | 400.0 | 263200 | 0.1474          | 0.7223   | 0.3459   | 0.2588          |
| 0.1584        | 450.0 | 296100 | 0.1471          | 0.7231   | 0.3487   | 0.2588          |
| 0.158         | 500.0 | 329000 | 0.1468          | 0.7232   | 0.3494   | 0.2612          |
| 0.1577        | 550.0 | 361900 | 0.1467          | 0.7239   | 0.3503   | 0.2600          |
| ...           | ...   | ...    | ...             | ...      | ...      | ...             |
| 0.1574        | 591.0 | 388878 | 0.1466          | 0.7243   | 0.3510   | 0.2596          |
| 0.1576        | 592.0 | 389536 | 0.1465          | 0.7234   | 0.3496   | 0.2596          |
| 0.1582        | 593.0 | 390194 | 0.1465          | 0.7239   | 0.3504   | 0.2592          |
| 0.158         | 594.0 | 390852 | 0.1465          | 0.7240   | 0.3505   | 0.2588          |

### Framework versions