Transformers does not recognize this architecture
KeyError: 'falcon_mamba'
During handling of the above exception, another exception occurred:
ValueError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
    993             config_class = CONFIG_MAPPING[config_dict["model_type"]]
    994         except KeyError:
--> 995             raise ValueError(
    996                 f"The checkpoint you are trying to load has model type {config_dict['model_type']} "
    997                 "but Transformers does not recognize this architecture. This could be because of an "

ValueError: The checkpoint you are trying to load has model type falcon_mamba but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
Hi,
Please follow the instructions here: https://huggingface.co/tiiuae/falcon-mamba-7b-instruct/discussions/3#66ba359166c8b209b12dceeb to use this model before the next transformers release.
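In the meantime, here is a minimal sketch of the usual workaround, assuming the linked discussion recommends installing transformers from source (the exact steps in that thread take precedence), since the falcon_mamba architecture is on the main branch but not yet in a released version:

```python
# Assumption: install transformers from source so the falcon_mamba
# architecture is registered before the next PyPI release:
#   pip install git+https://github.com/huggingface/transformers.git

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-mamba-7b-instruct"

# With a source install, CONFIG_MAPPING knows "falcon_mamba" and the
# KeyError/ValueError shown above should no longer be raised.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Once a transformers version that includes falcon_mamba is released, a regular `pip install -U transformers` should be enough.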