Update README.md
README.md CHANGED
@@ -31,15 +31,18 @@ This model is a fine-tuned version of [google/efficientnet-b5](https://huggingfa
 It achieves the following results on the evaluation set:
 - Loss: 0.9410
 - Accuracy: 0.8020
-
-
-
-
-
-
-
-
-
+- F1
+  - Weighted: 0.7736
+  - Micro: 0.8020
+  - Macro: 0.7802
+- Recall
+  - Weighted: 0.8020
+  - Micro: 0.8020
+  - Macro: 0.7977
+- Precision
+  - Weighted: 0.8535
+  - Micro: 0.8020
+  - Macro: 0.8682

 <div style="text-align: center;">
 <h2>
@@ -93,7 +96,7 @@ The following hyperparameters were used during training:

 ### Training results

-| Training Loss | Epoch | Step | Validation Loss | Accuracy | Weighted
+| Training Loss | Epoch | Step | Validation Loss | Accuracy | Weighted F1 | Micro F1 | Macro F1 | Weighted Recall | Micro Recall | Macro Recall | Weighted Precision | Micro Precision | Macro Precision |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------:|:--------:|:--------:|:---------------:|:------------:|:------------:|:------------------:|:---------------:|:---------------:|
 | 1.3872        | 1.0   | 180  | 1.0601          | 0.6853   | 0.6485      | 0.6853   | 0.6550   | 0.6853          | 0.6853       | 0.6802       | 0.8177             | 0.6853          | 0.8330          |
 | 1.3872        | 2.0   | 360  | 0.9533          | 0.7843   | 0.7483      | 0.7843   | 0.7548   | 0.7843          | 0.7843       | 0.7819       | 0.8354             | 0.7843          | 0.8471          |
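The diff above adds weighted, micro, and macro averages for F1, recall, and precision. As a minimal sketch of how these three averaging modes differ (illustrative only; the card's actual evaluation code is not shown here, and this helper is a hypothetical pure-Python stand-in for library functions such as scikit-learn's `f1_score` with its `average` parameter):

```python
from collections import Counter

def f1_averages(y_true, y_pred):
    """Macro, weighted, and micro F1 for single-label multiclass predictions."""
    labels = sorted(set(y_true) | set(y_pred))
    support = Counter(y_true)                     # true examples per class
    tp = {c: 0 for c in labels}
    fp = {c: 0 for c in labels}
    fn = {c: 0 for c in labels}
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1
            fn[t] += 1

    def per_class_f1(c):
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        return 2 * prec * rec / (prec + rec) if prec + rec else 0.0

    f1 = {c: per_class_f1(c) for c in labels}
    # Macro: unweighted mean over classes; weighted: mean weighted by support.
    macro = sum(f1.values()) / len(labels)
    weighted = sum(f1[c] * support[c] for c in labels) / len(y_true)
    # Micro: pool TP/FP/FN over all classes before computing the score.
    # For single-label multiclass data this collapses to plain accuracy.
    micro = sum(tp.values()) / len(y_true)
    return {"macro": macro, "weighted": weighted, "micro": micro}
```

That micro/accuracy equivalence is why every "Micro" entry in the new metrics block equals the reported accuracy of 0.8020.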
