---

library_name: transformers
license: apache-2.0
base_model: albert/albert-base-v2
tags:
- generated_from_trainer
model-index:
- name: albert-base-v2-2-contract-sections-classification-v4-10
  results: []
---



[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mvgdr/classificacao-secoes-contratos-v4-albert-base/runs/1ngfioq6)
# albert-base-v2-2-contract-sections-classification-v4-10

This model is a fine-tuned version of [albert/albert-base-v2](https://huggingface.co/albert/albert-base-v2) on a contract-sections dataset that is not further documented.
It achieves the following results on the evaluation set:
- Loss: 0.8591
- Accuracy (evaluate): 0.7843
- Precision (evaluate): 0.8045
- Recall (evaluate): 0.7910
- F1 (evaluate): 0.7930
- Accuracy (sklearn): 0.7843
- Precision (sklearn): 0.7973
- Recall (sklearn): 0.7843
- F1 (sklearn): 0.7854
- Label accuracy, Objeto: 0.8698
- Label accuracy, Obrigacoes: 0.8670
- Label accuracy, Valor: 0.6046
- Label accuracy, Vigencia: 0.5984
- Label accuracy, Rescisao: 0.7839
- Label accuracy, Foro: 0.9000
- Label accuracy, Reajuste: 0.8185
- Label accuracy, Fiscalizacao: 0.6656
- Label accuracy, Publicacao: 0.8227
- Label accuracy, Pagamento: 0.7717
- Label accuracy, Casos Omissos: 0.8522
- Label accuracy, Sancoes: 0.8716
- Label accuracy, Dotacao Orcamentaria: 0.8571
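
The "(evaluate)" and "(sklearn)" metric families appear to differ only in averaging: the macro average of the 13 per-label accuracies equals Recall (evaluate) at 0.7910, while Recall (sklearn) equals the overall accuracy (0.7843), which is what weighted averaging yields. Below is a hedged reconstruction of such a metric function; the function name and metric keys are assumptions, not the card's actual code:

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(y_true, y_pred, label_names):
    """Hypothetical reconstruction of the two reported metric families."""
    metrics = {"accuracy": accuracy_score(y_true, y_pred)}
    # "(evaluate)" columns are consistent with macro averaging.
    p, r, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="macro", zero_division=0
    )
    metrics.update(precision_macro=p, recall_macro=r, f1_macro=f1)
    # "(sklearn)" columns are consistent with weighted averaging.
    p, r, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="weighted", zero_division=0
    )
    metrics.update(precision_weighted=p, recall_weighted=r, f1_weighted=f1)
    # Per-label "accuracy" is per-class recall: correct predictions for a
    # class divided by the number of true examples of that class.
    _, per_class_recall, _, _ = precision_recall_fscore_support(
        y_true, y_pred, average=None, zero_division=0
    )
    metrics.update({f"acc_{n}": r for n, r in zip(label_names, per_class_recall)})
    return metrics
```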

## Model description

An [albert/albert-base-v2](https://huggingface.co/albert/albert-base-v2) encoder with a sequence-classification head, fine-tuned to assign contract sections (clauses) to one of 13 classes: Objeto, Obrigacoes, Valor, Vigencia, Rescisao, Foro, Reajuste, Fiscalizacao, Publicacao, Pagamento, Casos Omissos, Sancoes, and Dotacao Orcamentaria. The class names indicate Portuguese-language (likely Brazilian) contract clauses, though the training corpus itself is not documented.

## Intended uses & limitations

Intended for classifying individual contract sections into the 13 labels above. Per-label accuracy is uneven: Vigencia (0.5984), Valor (0.6046), and Fiscalizacao (0.6656) lag well behind Foro (0.9000) and Sancoes (0.8716), so predictions for the weaker classes warrant manual review. No fairness, robustness, or out-of-domain evaluation is reported. A minimal inference sketch follows.
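
The sketch uses the `transformers` pipeline API. The repo id is a placeholder (the card does not state the published Hub path), and the example clause and output are illustrative:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual Hub id or a local checkpoint dir.
classifier = pipeline(
    "text-classification",
    model="your-username/albert-base-v2-2-contract-sections-classification-v4-10",
)

clause = (
    "CLAUSULA TERCEIRA - DO VALOR: O valor global do presente contrato "
    "e de R$ 100.000,00 (cem mil reais)."
)
print(classifier(clause))
# Illustrative output: [{'label': 'Valor', 'score': 0.97}]
```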

## Training and evaluation data

Not documented. The evaluation metrics imply a labeled corpus of contract sections covering the 13 section types above; with 1,000 optimization steps per epoch at batch size 16, the training split would hold roughly 16,000 examples (assuming a single device and no gradient accumulation). Source, evaluation split size, and preprocessing are not reported.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
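
These map onto `transformers.TrainingArguments` roughly as follows (a sketch: `output_dir` and any evaluation/logging settings are placeholders not stated on the card):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="albert-base-v2-2-contract-sections-classification-v4-10",  # placeholder
    learning_rate=1e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",          # AdamW; betas=(0.9, 0.999), eps=1e-8 are torch defaults
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```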

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy (evaluate) | Precision (evaluate) | Recall (evaluate) | F1 (evaluate) | Accuracy (sklearn) | Precision (sklearn) | Recall (sklearn) | F1 (sklearn) | Label Acc. Objeto | Label Acc. Obrigacoes | Label Acc. Valor | Label Acc. Vigencia | Label Acc. Rescisao | Label Acc. Foro | Label Acc. Reajuste | Label Acc. Fiscalizacao | Label Acc. Publicacao | Label Acc. Pagamento | Label Acc. Casos Omissos | Label Acc. Sancoes | Label Acc. Dotacao Orcamentaria |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.7695        | 1.0   | 1000  | 1.9528          | 0.4447            | 0.6693             | 0.3931          | 0.3947      | 0.4447           | 0.6282            | 0.4447         | 0.4028     | 0.9029                 | 0.8148                     | 0.4069                | 0.0577                   | 0.2521                   | 0.8692               | 0.2064                   | 0.0789                       | 0.5025                     | 0.2283                    | 0.4335                        | 0.3303                  | 0.0275                               |
| 1.2261        | 2.0   | 2000  | 1.6110          | 0.5755            | 0.6694             | 0.5569          | 0.5614      | 0.5755           | 0.6584            | 0.5755         | 0.5645     | 0.8244                 | 0.8013                     | 0.4097                | 0.3963                   | 0.3601                   | 0.9077               | 0.6299                   | 0.1956                       | 0.6059                     | 0.6051                    | 0.5714                        | 0.6514                  | 0.2802                               |
| 0.9134        | 3.0   | 3000  | 1.3841          | 0.6538            | 0.7003             | 0.6524          | 0.6478      | 0.6538           | 0.6991            | 0.6538         | 0.6473     | 0.8161                 | 0.8182                     | 0.4585                | 0.4724                   | 0.4377                   | 0.9038               | 0.7794                   | 0.3218                       | 0.7438                     | 0.7428                    | 0.7635                        | 0.7339                  | 0.4890                               |
| 0.7022        | 4.0   | 4000  | 1.2144          | 0.6935            | 0.7370             | 0.6945          | 0.6965      | 0.6935           | 0.7265            | 0.6935         | 0.6904     | 0.8244                 | 0.8333                     | 0.4470                | 0.5407                   | 0.6842                   | 0.8923               | 0.8043                   | 0.3596                       | 0.7488                     | 0.7210                    | 0.6453                        | 0.8349                  | 0.6923                               |
| 0.5597        | 5.0   | 5000  | 1.0738          | 0.7288            | 0.7560             | 0.7406          | 0.7386      | 0.7288           | 0.7494            | 0.7288         | 0.7286     | 0.8574                 | 0.7845                     | 0.5358                | 0.5171                   | 0.7147                   | 0.9                  | 0.8292                   | 0.5110                       | 0.7783                     | 0.7355                    | 0.8128                        | 0.8716                  | 0.7802                               |
| 0.4429        | 6.0   | 6000  | 0.9868          | 0.7552            | 0.7772             | 0.7641          | 0.7626      | 0.7552           | 0.7731            | 0.7552         | 0.7556     | 0.8678                 | 0.8350                     | 0.5874                | 0.5328                   | 0.7368                   | 0.8846               | 0.8399                   | 0.5584                       | 0.8227                     | 0.7790                    | 0.8276                        | 0.8807                  | 0.7802                               |
| 0.389         | 7.0   | 7000  | 0.9236          | 0.7615            | 0.7823             | 0.7701          | 0.7683      | 0.7615           | 0.7763            | 0.7615         | 0.7603     | 0.8616                 | 0.8620                     | 0.6017                | 0.5538                   | 0.7479                   | 0.8962               | 0.8185                   | 0.5142                       | 0.8227                     | 0.7681                    | 0.8473                        | 0.8716                  | 0.8462                               |
| 0.3341        | 8.0   | 8000  | 0.8949          | 0.776             | 0.7961             | 0.7833          | 0.7845      | 0.776            | 0.7900            | 0.776          | 0.7771     | 0.8781                 | 0.8519                     | 0.6017                | 0.5722                   | 0.7729                   | 0.8962               | 0.8078                   | 0.6656                       | 0.8227                     | 0.7536                    | 0.8424                        | 0.8716                  | 0.8462                               |
| 0.3099        | 9.0   | 9000  | 0.8650          | 0.7805            | 0.8039             | 0.7876          | 0.7905      | 0.7805           | 0.7956            | 0.7805         | 0.7819     | 0.8822                 | 0.8552                     | 0.6160                | 0.5748                   | 0.7756                   | 0.8962               | 0.8114                   | 0.6593                       | 0.8227                     | 0.7754                    | 0.8522                        | 0.8716                  | 0.8462                               |
| 0.3016        | 10.0  | 10000 | 0.8591          | 0.7843            | 0.8045             | 0.7910          | 0.7930      | 0.7843           | 0.7973            | 0.7843         | 0.7854     | 0.8698                 | 0.8670                     | 0.6046                | 0.5984                   | 0.7839                   | 0.9                  | 0.8185                   | 0.6656                       | 0.8227                     | 0.7717                    | 0.8522                        | 0.8716                  | 0.8571                               |


### Framework versions

- Transformers 4.48.3
- Pytorch 2.6.0+cu124
- Datasets 3.3.0
- Tokenizers 0.21.0