---
base_model: alexyalunin/RuBioRoBERTa
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: RuBioRoBERTa_pos
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# RuBioRoBERTa_pos

This model is a fine-tuned version of [alexyalunin/RuBioRoBERTa](https://huggingface.co/alexyalunin/RuBioRoBERTa) on an unspecified dataset.
It achieves the following results on the evaluation set (see the metric-computation sketch after this list):
- Loss: 0.5510
- Precision: 0.6388
- Recall: 0.5954
- F1: 0.6163
- Accuracy: 0.9111
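
The card does not state how these scores were computed. For a Trainer-generated card on a token-classification model they are usually tag-level scores from `seqeval`; the following is only a minimal sketch of such a `compute_metrics` function, assuming the usual setup where `label_list` maps label ids to tag strings and `-100` marks ignored sub-word positions (none of these names come from the card):

```python
import numpy as np
import evaluate

# Assumption: standard token-classification labelling, with -100 marking
# sub-word/special-token positions that should be ignored in the metrics.
seqeval = evaluate.load("seqeval")

def compute_metrics(eval_pred, label_list):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Keep only positions with a real label and map ids to tag strings.
    true_labels = [
        [label_list[l] for l in label_row if l != -100]
        for label_row in labels
    ]
    true_predictions = [
        [label_list[p] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```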

## Model description

More information needed

## Intended uses & limitations

More information needed
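
No usage notes were provided. Given the seqeval-style metrics above, the checkpoint is presumably a token-classification head, so a minimal usage sketch with the standard `transformers` pipeline would look like the following; the repo id and example sentence are placeholders, not taken from the card:

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Placeholder repo id -- substitute the namespace this checkpoint was actually pushed to.
model_id = "your-username/RuBioRoBERTa_pos"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Aggregate sub-word pieces into whole-word predictions.
tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# Russian biomedical text, matching the RuBioRoBERTa domain (example sentence is illustrative).
print(tagger("Пациент жалуется на головную боль и повышение температуры."))
```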

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
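
For reference, these values map onto `transformers.TrainingArguments` roughly as sketched below. Only the settings listed above are set explicitly; the output path and the evaluation/logging cadence are assumptions rather than values stated in the card, and the Adam betas/epsilon and linear scheduler match the library defaults for these versions:

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="RuBioRoBERTa_pos",   # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",     # assumption: the results table reports per-epoch evaluation
    logging_steps=500,               # assumption: training loss appears every 500 steps in the table
)
```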

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 50   | 0.6556          | 0.0       | 0.0    | 0.0    | 0.7611   |
| No log        | 2.0   | 100  | 1.2213          | 0.0011    | 0.0019 | 0.0014 | 0.2513   |
| No log        | 3.0   | 150  | 0.6117          | 0.0       | 0.0    | 0.0    | 0.7642   |
| No log        | 4.0   | 200  | 0.5155          | 0.0135    | 0.0405 | 0.0203 | 0.7884   |
| No log        | 5.0   | 250  | 0.4171          | 0.0697    | 0.1715 | 0.0991 | 0.8268   |
| No log        | 6.0   | 300  | 0.3536          | 0.1054    | 0.1753 | 0.1317 | 0.8594   |
| No log        | 7.0   | 350  | 0.3714          | 0.1638    | 0.2216 | 0.1884 | 0.8685   |
| No log        | 8.0   | 400  | 0.2889          | 0.2477    | 0.3622 | 0.2942 | 0.8864   |
| No log        | 9.0   | 450  | 0.2943          | 0.2799    | 0.3969 | 0.3283 | 0.8921   |
| 0.452         | 10.0  | 500  | 0.2916          | 0.3823    | 0.4817 | 0.4263 | 0.9011   |
| 0.452         | 11.0  | 550  | 0.3162          | 0.3329    | 0.4817 | 0.3937 | 0.8935   |
| 0.452         | 12.0  | 600  | 0.3245          | 0.3629    | 0.4971 | 0.4195 | 0.9040   |
| 0.452         | 13.0  | 650  | 0.3535          | 0.4022    | 0.4913 | 0.4423 | 0.9021   |
| 0.452         | 14.0  | 700  | 0.3313          | 0.4161    | 0.5588 | 0.4770 | 0.9023   |
| 0.452         | 15.0  | 750  | 0.3560          | 0.4210    | 0.5800 | 0.4878 | 0.9006   |
| 0.452         | 16.0  | 800  | 0.3980          | 0.4125    | 0.6224 | 0.4962 | 0.8905   |
| 0.452         | 17.0  | 850  | 0.3767          | 0.4820    | 0.6455 | 0.5519 | 0.9071   |
| 0.452         | 18.0  | 900  | 0.3947          | 0.4605    | 0.6513 | 0.5395 | 0.9034   |
| 0.452         | 19.0  | 950  | 0.4351          | 0.4395    | 0.5877 | 0.5029 | 0.9066   |
| 0.0844        | 20.0  | 1000 | 0.3581          | 0.4931    | 0.5530 | 0.5213 | 0.9097   |
| 0.0844        | 21.0  | 1050 | 0.4050          | 0.4892    | 0.6108 | 0.5433 | 0.9063   |
| 0.0844        | 22.0  | 1100 | 0.4893          | 0.5504    | 0.5472 | 0.5488 | 0.9076   |
| 0.0844        | 23.0  | 1150 | 0.4173          | 0.4722    | 0.6050 | 0.5304 | 0.9062   |
| 0.0844        | 24.0  | 1200 | 0.4307          | 0.4819    | 0.6146 | 0.5402 | 0.9075   |
| 0.0844        | 25.0  | 1250 | 0.3874          | 0.4977    | 0.6185 | 0.5515 | 0.9151   |
| 0.0844        | 26.0  | 1300 | 0.4591          | 0.5478    | 0.6513 | 0.5951 | 0.9130   |
| 0.0844        | 27.0  | 1350 | 0.3543          | 0.5308    | 0.5973 | 0.5621 | 0.9144   |
| 0.0844        | 28.0  | 1400 | 0.4676          | 0.5380    | 0.5453 | 0.5416 | 0.9187   |
| 0.0844        | 29.0  | 1450 | 0.4169          | 0.5365    | 0.6224 | 0.5763 | 0.9131   |
| 0.0401        | 30.0  | 1500 | 0.4394          | 0.5867    | 0.5607 | 0.5734 | 0.9114   |
| 0.0401        | 31.0  | 1550 | 0.4550          | 0.5446    | 0.6474 | 0.5915 | 0.9166   |
| 0.0401        | 32.0  | 1600 | 0.4592          | 0.5415    | 0.6166 | 0.5766 | 0.9125   |
| 0.0401        | 33.0  | 1650 | 0.5040          | 0.5218    | 0.6455 | 0.5771 | 0.9093   |
| 0.0401        | 34.0  | 1700 | 0.4609          | 0.4295    | 0.6686 | 0.5230 | 0.8989   |
| 0.0401        | 35.0  | 1750 | 0.6256          | 0.4833    | 0.6397 | 0.5506 | 0.8975   |
| 0.0401        | 36.0  | 1800 | 0.4697          | 0.5742    | 0.6185 | 0.5955 | 0.9088   |
| 0.0401        | 37.0  | 1850 | 0.5114          | 0.5645    | 0.6069 | 0.5850 | 0.9139   |
| 0.0401        | 38.0  | 1900 | 0.5884          | 0.6237    | 0.5780 | 0.6000 | 0.9088   |
| 0.0401        | 39.0  | 1950 | 0.5022          | 0.5429    | 0.6455 | 0.5898 | 0.9135   |
| 0.0328        | 40.0  | 2000 | 0.4154          | 0.6315    | 0.6339 | 0.6327 | 0.9202   |
| 0.0328        | 41.0  | 2050 | 0.3940          | 0.5519    | 0.6146 | 0.5816 | 0.9145   |
| 0.0328        | 42.0  | 2100 | 0.3374          | 0.5477    | 0.6301 | 0.5860 | 0.9120   |
| 0.0328        | 43.0  | 2150 | 0.5907          | 0.5483    | 0.5029 | 0.5246 | 0.9041   |
| 0.0328        | 44.0  | 2200 | 0.4235          | 0.5606    | 0.6416 | 0.5984 | 0.9145   |
| 0.0328        | 45.0  | 2250 | 0.6646          | 0.0       | 0.0    | 0.0    | 0.7640   |


### Framework versions

- Transformers 4.38.2
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.2