Zlovoblachko committed · Commit 441ebb1 · verified · 1 Parent(s): 262f1da

Update README.md

Files changed (1): README.md (+0 -68)
---
library_name: transformers
license: apache-2.0
base_model: t5-base
tags:
- generated_from_trainer
metrics:
- bleu
model-index:
- name: t5-grammar-corrector
  results: []
---
# t5-grammar-corrector

This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0300
- Exact Match: 0.1374
- Bleu: 58.1578
- M2 Precision: 0.6745
- M2 Recall: 50
- M2 Fscore: 0.8403
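
The Exact Match score above is the fraction of predictions that are string-identical to their references. A minimal sketch of that computation (the helper name and the sample sentences are illustrative, not taken from the training code):

```python
def exact_match(predictions, references):
    """Fraction of predictions identical to their reference (whitespace-trimmed)."""
    if len(predictions) != len(references):
        raise ValueError("predictions and references must be the same length")
    hits = sum(p.strip() == r.strip() for p, r in zip(predictions, references))
    return hits / len(references)

preds = ["She goes to school.", "He like apples."]
refs = ["She goes to school.", "He likes apples."]
print(exact_match(preds, refs))  # → 0.5
```

Note that exact match is a strict, all-or-nothing criterion: a single token difference counts as a miss, which is why it sits far below the sentence-overlap metrics like BLEU here.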
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
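
With `lr_scheduler_type: linear`, the learning rate decays linearly from its initial value to zero over the course of training (transformers' linear schedule also supports a warmup phase; none is listed above, so this sketch assumes zero warmup steps):

```python
def linear_lr(step, total_steps, base_lr=3e-05):
    """Learning rate under a linear decay schedule with no warmup:
    base_lr at step 0, falling to 0 at total_steps."""
    remaining = max(0.0, float(total_steps - step) / float(total_steps))
    return base_lr * remaining

total = 10677  # 3 epochs x 3559 steps per epoch, per the results table below
print(linear_lr(0, total))      # → 3e-05
print(linear_lr(total, total))  # → 0.0
```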
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Exact Match | Bleu    | M2 Precision | M2 Recall | M2 Fscore |
|:-------------:|:-----:|:-----:|:---------------:|:-----------:|:-------:|:------------:|:---------:|:---------:|
| 0.0265        | 1.0   | 3559  | 0.0331          | 0.1343      | 58.3430 | 33.2490      | 50        | 35.6368   |
| 0.0293        | 2.0   | 7118  | 0.0313          | 0.1417      | 58.5254 | 29.6796      | 50        | 32.3054   |
| 0.0284        | 3.0   | 10677 | 0.0313          | 0.1419      | 58.5583 | 30.5509      | 50        | 33.1281   |
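
The M2 scorer conventionally combines precision and recall into an F0.5 score, which weights precision twice as heavily as recall (rewarding correctors that avoid bad edits over ones that catch every error). A generic F-beta sketch; the inputs are illustrative, not the table's values:

```python
def f_beta(precision, recall, beta=0.5):
    """F-beta score; beta=0.5 favors precision over recall, as in the M2 F0.5."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)

print(f_beta(1.0, 1.0))  # → 1.0
```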
### Framework versions

- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1