Update README.md
[Open in Colab](https://colab.research.google.com/gist/waveletdeboshir/07e39ae96f27331aa3e1e053c2c2f9e8/gigaam-ctc-hf-with-lm.ipynb)

# GigaAM-v2-CTC with ngram LM and beamsearch 🤗 Hugging Face transformers
This is an **unofficial Transformers wrapper** for the original GigaAM model released by SberDevices.

* original git: https://github.com/salute-developers/GigaAM
* ngram LM from [`bond005/wav2vec2-large-ru-golos-with-lm`](https://huggingface.co/bond005/wav2vec2-large-ru-golos-with-lm)
Russian ASR model GigaAM-v2-CTC with external ngram LM and beamsearch decoding.

## Model info
This is GigaAM-v2-CTC with a `transformers` library interface, beamsearch decoding, and hypothesis rescoring with an external ngram LM.
In addition, it can be used to extract word-level timestamps.
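The decoding scheme can be sketched in miniature. Below is a self-contained toy CTC prefix beam search with shallow fusion of an external LM score, showing how hypothesis probabilities accumulate over frames and how an ngram-style score can re-rank candidates. This is a sketch of the general technique, not the repository's actual decoder; the two-symbol alphabet, weights, and the stand-in LM are all made up for illustration.

```python
import math
from collections import defaultdict

def ctc_prefix_beam_search(frames, beam_width=4, lm=None, alpha=1.0):
    """Toy CTC prefix beam search with optional shallow LM fusion.

    frames: list of dicts mapping symbol -> log-probability for one frame;
            the empty string "" stands for the CTC blank.
    lm:     optional callable prefix -> log-score, weighted by alpha and
            applied whenever a beam is extended with a new symbol.
    Returns hypotheses as (transcript, probability), best first.
    """
    beams = {"": (1.0, 0.0)}  # prefix -> (P ending in blank, P ending in non-blank)
    for frame in frames:
        nxt = defaultdict(lambda: (0.0, 0.0))
        for prefix, (pb, pnb) in beams.items():
            for sym, lp in frame.items():
                p = math.exp(lp)
                if sym == "":  # blank keeps the prefix, marks it blank-terminated
                    b, nb = nxt[prefix]
                    nxt[prefix] = (b + (pb + pnb) * p, nb)
                    continue
                bonus = math.exp(alpha * lm(prefix + sym)) if lm else 1.0
                if prefix and sym == prefix[-1]:
                    # repeated symbol collapses unless a blank separated the two
                    b, nb = nxt[prefix]
                    nxt[prefix] = (b, nb + pnb * p)
                    b2, nb2 = nxt[prefix + sym]
                    nxt[prefix + sym] = (b2, nb2 + pb * p * bonus)
                else:
                    b2, nb2 = nxt[prefix + sym]
                    nxt[prefix + sym] = (b2, nb2 + (pb + pnb) * p * bonus)
        # prune to the most probable prefixes
        beams = dict(sorted(nxt.items(), key=lambda kv: -sum(kv[1]))[:beam_width])
    return sorted(((pfx, pb + pnb) for pfx, (pb, pnb) in beams.items()),
                  key=lambda kv: -kv[1])

# Two frames where "a" wins acoustically (P = 0.24 + 0.24 + 0.16 = 0.64)...
frames = [{"": math.log(0.6), "a": math.log(0.4)},
          {"": math.log(0.6), "a": math.log(0.4)}]
print(ctc_prefix_beam_search(frames)[0])  # best hypothesis: "a", P ≈ 0.64

# ...but a stand-in LM that disfavours every extension flips the ranking,
# so the empty transcript (P = 0.36) ranks first after fusion.
print(ctc_prefix_beam_search(frames, lm=lambda text: -10.0)[0][0])
```

A real decoder works the same way, only with a subword alphabet, a KenLM-style ngram model, and tuned fusion weights.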
The file [`gigaam_transformers.py`](https://huggingface.co/waveletdeboshir/gigaam-ctc-with-lm/blob/main/gigaam_transformers.py) contains the model, feature extractor, and tokenizer classes with the usual `transformers` methods. The model can be initialized with the `transformers` auto classes (see the example below).
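A minimal loading sketch, under the assumption that the custom classes in `gigaam_transformers.py` register with the `transformers` auto classes as described; `trust_remote_code=True` is required for Hub repositories that ship custom code. The processor call and `batch_decode` usage are assumptions patterned after the Wav2Vec2-with-LM interface, not confirmed by this card — see the notebook linked above for the author's own example.

```python
import torch
from transformers import AutoModel, AutoProcessor

repo = "waveletdeboshir/gigaam-ctc-with-lm"

# trust_remote_code=True loads the custom classes from gigaam_transformers.py
processor = AutoProcessor.from_pretrained(repo, trust_remote_code=True)
model = AutoModel.from_pretrained(repo, trust_remote_code=True)
model.eval()

# 16 kHz mono waveform; one second of silence as a placeholder input
wav = torch.zeros(16000)
inputs = processor(wav.numpy(), sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# LM beamsearch decode, patterned after Wav2Vec2ProcessorWithLM.batch_decode
# (an assumption -- check the linked notebook for the exact call)
text = processor.batch_decode(logits.numpy()).text[0]
print(text)
```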