masters-thesis-vm/whispered_TIA_base_encoder_freezing_nosil
master's thesis Vivien Meyer
Tags: Automatic Speech Recognition · Transformers · PyTorch · English · whisper · speech processing · nlp · asr · domain adaptation
arXiv: 2212.04356
License: unknown
Branch: main
1 contributor · History: 11 commits
Latest commit: de60c6e ("Update README.md" by vimey, almost 2 years ago)
Name                      Size        Last commit        Notes
results/                  -           Upload 18 files
scripts/                  -           Upload 18 files
.gitattributes            1.52 kB     initial commit
README.md                 3.06 kB     Update README.md
added_tokens.json         2.08 kB     Upload 18 files
config.json               1.28 kB     Upload 18 files
generation_config.json    3.77 kB     Upload 18 files
merges.txt                494 kB      Upload 18 files
normalizer.json           52.7 kB     Upload 18 files
optimizer.pt              416 MB      Upload 18 files    LFS; pickle (3 imports detected, see below)
preprocessor_config.json  339 Bytes   Upload 18 files
pytorch_model.bin         290 MB      Upload 18 files    LFS; pickle (3 imports detected, see below)
requirements.txt          1.75 kB     Upload 18 files
rng_state.pth             14.6 kB     Upload 18 files    LFS; pickle (7 imports detected, flagged by the scanner)
scheduler.pt              627 Bytes   Upload 18 files    LFS; pickle, no problematic imports detected
special_tokens_map.json   2.08 kB     Upload 18 files
tokenizer_config.json     804 Bytes   Upload 18 files
trainer_state.json        14.7 kB     Upload 18 files
vocab.json                1.04 MB     Upload 18 files

All files were last updated almost 2 years ago.

Detected pickle imports:
- optimizer.pt and pytorch_model.bin: collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage
- rng_state.pth: _codecs.encode, torch._utils._rebuild_tensor_v2, torch.ByteStorage, numpy.core.multiarray._reconstruct, numpy.dtype, collections.OrderedDict, numpy.ndarray
- scheduler.pt: no problematic imports detected