---
base_model: NazaGara/NER-fine-tuned-BETO
tags:
- generated_from_trainer
datasets:
- conll2002
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: beto-finetuned-ner-1
  results:
  - task:
      name: Token Classification
      type: token-classification
    dataset:
      name: conll2002
      type: conll2002
      config: es
      split: validation
      args: es
    metrics:
    - name: Precision
      type: precision
      value: 0.861199
    - name: Recall
      type: recall
      value: 0.871094
    - name: F1
      type: f1
      value: 0.866118
    - name: Accuracy
      type: accuracy
      value: 0.972756
---
# beto-finetuned-ner-1

This model is a fine-tuned version of
[NazaGara/NER-fine-tuned-BETO](https://huggingface.co/NazaGara/NER-fine-tuned-BETO) on the conll2002 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.284128
- Precision: 0.861199
- Recall: 0.871094
- F1: 0.866118
- Accuracy: 0.972756

## Model description

More information needed

## Intended uses & limitations

More information needed
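
This checkpoint is a Spanish token-classification (NER) model, so it can be loaded with the standard `transformers` pipeline. A minimal sketch follows; the default `model_id` is an assumption, so point it at the actual Hub repository or a local checkpoint directory:

```python
def build_ner_pipeline(model_id: str = "beto-finetuned-ner-1"):
    """Load the fine-tuned checkpoint as a token-classification pipeline.

    The default model_id is a placeholder, not a confirmed Hub id.
    """
    from transformers import pipeline  # lazy import keeps the sketch importable

    return pipeline(
        "token-classification",
        model=model_id,
        aggregation_strategy="simple",  # merge B-/I- subword pieces into entity spans
    )

# Example (requires the checkpoint to be downloadable or available locally):
# ner = build_ner_pipeline()
# ner("Gabriel García Márquez nació en Aracataca, Colombia.")
```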

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- weight_decay: 0.001
- num_epochs: 8
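
The reported hyperparameters map onto `transformers` `TrainingArguments` roughly as below. Only the three values above come from this card; everything else (output directory, evaluation schedule, batch size, optimizer, scheduler) was not reported, so the remaining settings are placeholders, not the actual training setup:

```python
from transformers import TrainingArguments

# Sketch only: the reported values are learning_rate, weight_decay, and
# num_train_epochs; all other arguments are assumptions.
training_args = TrainingArguments(
    output_dir="beto-finetuned-ner-1",   # placeholder path
    learning_rate=2e-5,
    weight_decay=0.001,
    num_train_epochs=8,
    evaluation_strategy="epoch",         # assumption: the table reports per-epoch metrics
)
```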

### Training results

| Epoch | Training Loss | Validation Loss | Precision | Recall  | F1      | Accuracy |
|:-----:|:-------------:|:---------------:|:---------:|:-------:|:-------:|:--------:|
| 1     | 0.004500      | 0.271499        | 0.854365  | 0.868107| 0.861181| 0.971268 |
| 2     | 0.004000      | 0.283811        | 0.839605  | 0.840763| 0.840184| 0.966170 |
| 3     | 0.003900      | 0.261076        | 0.849651  | 0.867417| 0.858442| 0.970664 |
| 4     | 0.002600      | 0.277270        | 0.858379  | 0.866268| 0.862306| 0.971702 |
| 5     | 0.002000      | 0.270548        | 0.859149  | 0.871783| 0.865420| 0.971563 |
| 6     | 0.001800      | 0.279797        | 0.857305  | 0.868336| 0.862785| 0.971609 |
| 7     | 0.001800      | 0.281091        | 0.857467  | 0.868107| 0.862754| 0.971966 |
| 8     | 0.001100      | 0.284128        | 0.861199  | 0.871094| 0.866118| 0.972756 |
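
The precision, recall, and F1 values above are entity-level scores of the kind produced by tools such as `seqeval`: a predicted entity counts as correct only when both its type and its full span match the gold annotation. A self-contained pure-Python sketch of that metric (toy IOB2 tags, not the actual conll2002 evaluation):

```python
def extract_entities(tags):
    """Turn an IOB2 tag sequence into a set of (type, start, end) spans."""
    entities, start, etype = set(), None, None
    for i, tag in enumerate(tags + ["O"]):  # trailing sentinel flushes the last entity
        if tag.startswith("B-") or tag == "O" or (etype and tag[2:] != etype):
            if etype is not None:
                entities.add((etype, start, i))  # end index is exclusive
                etype = None
        if tag.startswith("B-"):
            start, etype = i, tag[2:]
        elif tag.startswith("I-") and etype is None:
            start, etype = i, tag[2:]  # tolerate I- without a preceding B-
    return entities


def entity_f1(y_true, y_pred):
    """Entity-level F1: exact span and type match required."""
    gold, pred = extract_entities(y_true), extract_entities(y_pred)
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    return 2 * precision * recall / (precision + recall) if tp else 0.0
```

For example, a prediction that truncates a two-token person name scores zero for that entity, even though one of its tokens was labelled correctly, which is why entity-level F1 is stricter than token accuracy.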