OSError: It looks like the config file is not a valid JSON file.
It seems this error is due to a recent modification of the ViT-L repo.
For the moment, you can specify a previous commit of this repo, for instance:
from transformers import CLIPTextModel
transformer = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14", revision='0993c71e8ad62658387de2714a69f723ddfffacb')
This works for me while they fix it.
Thank you!
Trailing commas:
- https://huggingface.co/openai/clip-vit-large-patch14/blob/main/config.json#L87
- https://huggingface.co/openai/clip-vit-large-patch14/blob/main/config.json#L169
MR proposed here: https://huggingface.co/openai/clip-vit-large-patch14/discussions/3
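For context, the config is parsed as standard JSON, which does not allow trailing commas; that is why loading fails with "It looks like the config file is not a valid JSON file." A minimal, self-contained illustration of the failure mode (the keys below are placeholders, not the exact offending lines):

import json

# Illustrative fragment with a trailing comma after the last value,
# similar to the offending lines in config.json (keys are placeholders)
bad_config = '{"projection_dim": 768, "vocab_size": 49408,}'
try:
    json.loads(bad_config)
except json.JSONDecodeError as err:
    print("invalid JSON:", err)  # Python's json module rejects trailing commas

# The same fragment without the trailing comma parses fine
good_config = '{"projection_dim": 768, "vocab_size": 49408}'
print(json.loads(good_config))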
I ran into a problem: OSError: We couldn't connect to 'https://huggingface.co' to load this model, couldn't find it in the cached files and it looks like openai/clip-vit-large-patch14 is not the path to a directory containing a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
Yeah, I had the exact same problem.
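In case it helps anyone hitting the connection error above: if the model was already fully downloaded once, you can ask transformers to load it from the local cache without contacting the Hub. This is just a sketch, assuming the files are actually cached; local_files_only is a standard from_pretrained argument.

from transformers import CLIPTextModel

# Read only from the local cache; this raises an error if the model
# was never downloaded, instead of trying to reach huggingface.co
transformer = CLIPTextModel.from_pretrained(
    "openai/clip-vit-large-patch14",
    local_files_only=True,
)

Alternatively, setting the environment variable TRANSFORMERS_OFFLINE=1 before starting Python puts the whole library into offline mode, as described in the installation docs linked in the error message.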