AutoTokenizer.from_pretrained raises OSError

#61 · opened by sean29
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)

The call tokenizer = AutoTokenizer.from_pretrained(model_id) raises:
OSError: mistralai/Mixtral-8x7B-Instruct-v0.1 does not appear to have a file named config.json. Checkout 'https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1/None' for available files.

transformers.__version__ is 4.36.2
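
In case it helps, this is what I am trying now, on the assumption that the error comes from the hub request failing before config.json can be resolved (for example, if the repo needs an authenticated access token). The token string below is just a placeholder:

from transformers import AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

# Assumption: the repo requires authenticated access.
# Either run `huggingface-cli login` first, or pass a token explicitly.
tokenizer = AutoTokenizer.from_pretrained(
    model_id,
    token="hf_xxx",  # placeholder, replace with your own access token
)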

Also, when we download the model from mistralai/Mixtral-8x7B-Instruct-v0.1, I can see the blob files in the local cache, but I am not sure how to convert them into .safetensors files.
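
For context, this is how I am inspecting what was downloaded. If I understand the huggingface_hub cache layout correctly, the files under blobs/ are the actual downloads stored under content-hash names, and snapshots/<commit>/ only contains symlinks with the original filenames (config.json, *.safetensors, ...), so no conversion would be needed, but I would like to confirm that assumption:

import os
from huggingface_hub import snapshot_download

# Returns the local snapshots/<commit> folder; the files there are symlinks
# into blobs/ and the folder can be passed directly to from_pretrained.
local_dir = snapshot_download("mistralai/Mixtral-8x7B-Instruct-v0.1")
print(local_dir)
print(os.listdir(local_dir))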
Any suggestions on either of these questions?

Thanks
