---
base_model: openai/whisper-medium
datasets:
- generator
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: whisper-medium-sb-lug-eng
  results: []
---


# whisper-medium-sb-lug-eng

This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1720
- Wer Lug: 0.81
- Wer Eng: 0.068
- Wer Mean: 0.439
- Cer Lug: 0.494
- Cer Eng: 0.039
- Cer Mean: 0.267
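The WER and CER figures above are word- and character-level edit-distance rates, reported separately for Luganda and English and then averaged. As a reference for how such rates are typically computed, here is a minimal sketch using the standard Levenshtein distance (the example transcripts are hypothetical; production evaluation would normally use a library such as `jiwer` or the `evaluate` package):

```python
def edit_distance(ref, hyp):
    # Classic dynamic-programming Levenshtein distance over token sequences.
    m, n = len(ref), len(hyp)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            cur[j] = min(prev[j] + 1,        # deletion
                         cur[j - 1] + 1,     # insertion
                         prev[j - 1] + cost) # substitution / match
        prev = cur
    return prev[n]

def wer(reference: str, hypothesis: str) -> float:
    # Word error rate: edit distance over words, normalized by reference length.
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    # Character error rate: same computation at the character level.
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```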

## Model description

A fine-tuned [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) checkpoint for automatic speech recognition of Luganda (`lug`) and English (`eng`); the evaluation results above report WER and CER separately for each language, plus their mean.

## Intended uses & limitations

More information needed
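Pending a fuller description, a minimal inference sketch with the `transformers` ASR pipeline (the repo id and audio path below are placeholders, not confirmed by this card):

```python
# Hypothetical usage sketch -- replace the model id and audio path with real ones.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="whisper-medium-sb-lug-eng",  # placeholder: use the actual Hub repo id
)

# Whisper expects 16 kHz audio; the pipeline resamples common formats on load.
result = asr("sample.wav")
print(result["text"])
```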

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 30000
- mixed_precision_training: Native AMP
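The `linear` scheduler with 500 warmup steps ramps the learning rate from 0 to the peak value and then decays it linearly to 0 at step 30,000. A minimal sketch of the curve implied by these settings (matching the shape of the standard linear warmup-plus-decay schedule; constants taken from the list above):

```python
PEAK_LR = 1e-05        # learning_rate
WARMUP_STEPS = 500     # lr_scheduler_warmup_steps
TRAINING_STEPS = 30_000  # training_steps

def learning_rate(step: int) -> float:
    # Linear warmup from 0 to the peak LR over the first 500 steps,
    # then linear decay back to 0 at step 30,000.
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return PEAK_LR * max(0, TRAINING_STEPS - step) / (TRAINING_STEPS - WARMUP_STEPS)
```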

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Wer Lug | Wer Eng | Wer Mean | Cer Lug | Cer Eng | Cer Mean |
|:-------------:|:------:|:-----:|:---------------:|:-------:|:-------:|:--------:|:-------:|:-------:|:--------:|
| 0.9804        | 0.0167 | 500   | 0.3683          | 0.692   | 0.043   | 0.368    | 0.203   | 0.019   | 0.111    |
| 0.7775        | 0.0333 | 1000  | 0.2594          | 0.725   | 0.044   | 0.385    | 0.395   | 0.019   | 0.207    |
| 0.6492        | 0.05   | 1500  | 0.2316          | 0.649   | 0.041   | 0.345    | 0.263   | 0.02    | 0.142    |
| 0.6128        | 0.0667 | 2000  | 0.2111          | 0.513   | 0.04    | 0.277    | 0.197   | 0.018   | 0.108    |
| 0.543         | 0.0833 | 2500  | 0.2023          | 0.579   | 0.043   | 0.311    | 0.239   | 0.018   | 0.129    |
| 0.5461        | 0.1    | 3000  | 0.1932          | 0.425   | 0.04    | 0.233    | 0.138   | 0.019   | 0.078    |
| 0.5545        | 0.1167 | 3500  | 0.1836          | 0.624   | 0.043   | 0.334    | 0.381   | 0.021   | 0.201    |
| 0.4895        | 0.1333 | 4000  | 0.1802          | 0.407   | 0.043   | 0.225    | 0.156   | 0.022   | 0.089    |
| 0.4922        | 0.15   | 4500  | 0.1771          | 0.377   | 0.051   | 0.214    | 0.136   | 0.033   | 0.084    |
| 0.521         | 0.1667 | 5000  | 0.1817          | 0.316   | 0.049   | 0.183    | 0.097   | 0.028   | 0.062    |
| 0.3948        | 1.0153 | 5500  | 0.1724          | 0.422   | 0.079   | 0.251    | 0.17    | 0.057   | 0.113    |
| 0.3914        | 1.032  | 6000  | 0.1727          | 0.744   | 0.04    | 0.392    | 0.651   | 0.018   | 0.334    |
| 0.3807        | 1.0487 | 6500  | 0.1730          | 0.585   | 0.053   | 0.319    | 0.428   | 0.028   | 0.228    |
| 0.395         | 1.0653 | 7000  | 0.1701          | 0.737   | 0.043   | 0.39     | 0.635   | 0.024   | 0.329    |
| 0.3774        | 1.082  | 7500  | 0.1654          | 0.545   | 0.046   | 0.296    | 0.396   | 0.024   | 0.21     |
| 0.4017        | 1.0987 | 8000  | 0.1626          | 0.465   | 0.046   | 0.256    | 0.28    | 0.024   | 0.152    |
| 0.3901        | 1.1153 | 8500  | 0.1593          | 0.516   | 0.051   | 0.283    | 0.25    | 0.026   | 0.138    |
| 0.3829        | 1.1320 | 9000  | 0.1608          | 0.48    | 0.049   | 0.264    | 0.247   | 0.024   | 0.135    |
| 0.3536        | 1.1487 | 9500  | 0.1657          | 0.37    | 0.043   | 0.207    | 0.143   | 0.021   | 0.082    |
| 0.3506        | 1.1653 | 10000 | 0.1606          | 0.395   | 0.041   | 0.218    | 0.172   | 0.021   | 0.097    |
| 0.2737        | 2.014  | 10500 | 0.1604          | 0.457   | 0.07    | 0.263    | 0.235   | 0.044   | 0.139    |
| 0.3073        | 2.0307 | 11000 | 0.1626          | 0.458   | 0.046   | 0.252    | 0.243   | 0.022   | 0.132    |
| 0.2906        | 2.0473 | 11500 | 0.1581          | 0.444   | 0.062   | 0.253    | 0.222   | 0.038   | 0.13     |
| 0.2882        | 2.064  | 12000 | 0.1591          | 0.519   | 0.053   | 0.286    | 0.3     | 0.024   | 0.162    |
| 0.2642        | 2.0807 | 12500 | 0.1630          | 0.547   | 0.05    | 0.299    | 0.293   | 0.029   | 0.161    |
| 0.2848        | 2.0973 | 13000 | 0.1627          | 0.509   | 0.055   | 0.282    | 0.244   | 0.03    | 0.137    |
| 0.2887        | 2.114  | 13500 | 0.1585          | 0.524   | 0.067   | 0.296    | 0.28    | 0.047   | 0.163    |
| 0.2879        | 2.1307 | 14000 | 0.1593          | 0.646   | 0.065   | 0.356    | 0.355   | 0.045   | 0.2      |
| 0.2955        | 2.1473 | 14500 | 0.1581          | 0.873   | 0.062   | 0.468    | 0.512   | 0.038   | 0.275    |
| 0.2639        | 2.164  | 15000 | 0.1533          | 0.772   | 0.057   | 0.414    | 0.454   | 0.037   | 0.245    |
| 0.2111        | 3.0127 | 15500 | 0.1622          | 0.776   | 0.074   | 0.425    | 0.518   | 0.046   | 0.282    |
| 0.2299        | 3.0293 | 16000 | 0.1628          | 0.849   | 0.061   | 0.455    | 0.559   | 0.036   | 0.297    |
| 0.2279        | 3.046  | 16500 | 0.1633          | 0.803   | 0.064   | 0.434    | 0.632   | 0.036   | 0.334    |
| 0.2339        | 3.0627 | 17000 | 0.1617          | 0.845   | 0.045   | 0.445    | 0.553   | 0.022   | 0.288    |
| 0.2387        | 3.0793 | 17500 | 0.1599          | 0.773   | 0.055   | 0.414    | 0.436   | 0.029   | 0.232    |
| 0.2098        | 3.096  | 18000 | 0.1616          | 0.675   | 0.059   | 0.367    | 0.45    | 0.037   | 0.243    |
| 0.2201        | 3.1127 | 18500 | 0.1619          | 0.713   | 0.066   | 0.389    | 0.476   | 0.039   | 0.257    |
| 0.2312        | 3.1293 | 19000 | 0.1603          | 0.994   | 0.053   | 0.524    | 0.605   | 0.03    | 0.318    |
| 0.2389        | 3.146  | 19500 | 0.1572          | 0.751   | 0.054   | 0.403    | 0.455   | 0.032   | 0.244    |
| 0.2183        | 3.1627 | 20000 | 0.1635          | 0.667   | 0.056   | 0.362    | 0.42    | 0.034   | 0.227    |
| 0.1707        | 4.0113 | 20500 | 0.1654          | 0.682   | 0.05    | 0.366    | 0.433   | 0.026   | 0.23     |
| 0.1874        | 4.028  | 21000 | 0.1641          | 0.744   | 0.054   | 0.399    | 0.425   | 0.03    | 0.228    |
| 0.1836        | 4.0447 | 21500 | 0.1666          | 0.651   | 0.063   | 0.357    | 0.397   | 0.039   | 0.218    |
| 0.1847        | 4.0613 | 22000 | 0.1635          | 0.788   | 0.069   | 0.429    | 0.502   | 0.044   | 0.273    |
| 0.1742        | 4.078  | 22500 | 0.1651          | 0.695   | 0.051   | 0.373    | 0.4     | 0.027   | 0.214    |
| 0.1733        | 4.0947 | 23000 | 0.1652          | 0.678   | 0.064   | 0.371    | 0.427   | 0.039   | 0.233    |
| 0.1651        | 4.1113 | 23500 | 0.1659          | 0.666   | 0.071   | 0.369    | 0.458   | 0.046   | 0.252    |
| 0.1924        | 4.128  | 24000 | 0.1664          | 0.792   | 0.069   | 0.431    | 0.486   | 0.046   | 0.266    |
| 0.1828        | 4.1447 | 24500 | 0.1670          | 0.746   | 0.068   | 0.407    | 0.538   | 0.043   | 0.291    |
| 0.165         | 4.1613 | 25000 | 0.1675          | 0.746   | 0.072   | 0.409    | 0.469   | 0.047   | 0.258    |
| 0.1437        | 5.01   | 25500 | 0.1706          | 0.728   | 0.066   | 0.397    | 0.481   | 0.04    | 0.261    |
| 0.148         | 5.0267 | 26000 | 0.1700          | 0.755   | 0.069   | 0.412    | 0.457   | 0.041   | 0.249    |
| 0.1509        | 5.0433 | 26500 | 0.1700          | 0.787   | 0.068   | 0.427    | 0.497   | 0.039   | 0.268    |
| 0.1442        | 5.06   | 27000 | 0.1715          | 0.762   | 0.068   | 0.415    | 0.47    | 0.039   | 0.254    |
| 0.1282        | 5.0767 | 27500 | 0.1698          | 0.796   | 0.064   | 0.43     | 0.477   | 0.037   | 0.257    |
| 0.1377        | 5.0933 | 28000 | 0.1710          | 0.796   | 0.068   | 0.432    | 0.481   | 0.04    | 0.261    |
| 0.1456        | 5.11   | 28500 | 0.1719          | 0.758   | 0.07    | 0.414    | 0.481   | 0.04    | 0.26     |
| 0.143         | 5.1267 | 29000 | 0.1716          | 0.795   | 0.07    | 0.433    | 0.488   | 0.04    | 0.264    |
| 0.1484        | 5.1433 | 29500 | 0.1719          | 0.812   | 0.069   | 0.44     | 0.492   | 0.04    | 0.266    |
| 0.1463        | 5.16   | 30000 | 0.1720          | 0.81    | 0.068   | 0.439    | 0.494   | 0.039   | 0.267    |


### Framework versions

- Transformers 4.42.4
- Pytorch 2.2.0
- Datasets 2.20.0
- Tokenizers 0.19.1