Update checkpoint for transformers>=4.29
#4 opened by ArthurZ (HF Staff, Language Technology Research Group at the University of Helsinki org) · edited Oct 10, 2023 by lysandre

Following the merge of a PR in transformers, it appeared that this model was not properly converted. This PR fixes inference and was tested with the following script:
```python
>>> from transformers import AutoTokenizer, MarianMTModel

>>> tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-tc-big-gmw-gmw")
>>> inputs = tokenizer(">>nds<< Red keinen Quatsch.", return_tensors="pt", padding=True)

>>> model = MarianMTModel.from_pretrained("Helsinki-NLP/opus-mt-tc-big-gmw-gmw")
>>> print(tokenizer.batch_decode(model.generate(**inputs)))
['<pad> Kiek ok bi: Rott.</s>']
```
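The `>>nds<<` prefix in the test script is the convention multilingual Marian checkpoints use to select the target language (here Low German) for a multi-target model like opus-mt-tc-big-gmw-gmw. A minimal sketch of preparing a batch with that prefix — the `add_target_prefix` helper is a hypothetical name for illustration, not part of the transformers API:

```python
def add_target_prefix(sentences, lang):
    """Prepend the Marian target-language token (e.g. '>>nds<<')
    to each source sentence, as the model card's test script does."""
    return [f">>{lang}<< {s}" for s in sentences]

# Prefixed sentences can then be passed to the tokenizer as in the
# test script above, with padding=True for a ragged batch.
batch = add_target_prefix(["Red keinen Quatsch.", "Guten Morgen."], "nds")
```

Each prefixed string is tokenized normally; the `>>nds<<` token is part of the checkpoint's vocabulary and steers generation toward the requested target language.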
lysandre changed pull request status to merged