KeyError: 'spatiallm_llama' when loading manycore-research/SpatialLM-Llama-1B with transformers

#11, opened by x6486

Hi Hugging Face team,

I'm trying to load the model manycore-research/SpatialLM-Llama-1B using the latest development version of transformers installed from GitHub (4.51.0.dev0), but I'm running into the error below.

I have already updated transformers:

```python
import transformers

print(transformers.__version__)  # 4.51.0.dev0
```
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe = pipeline("text-generation", model="manycore-research/SpatialLM-Llama-1B")
pipe(messages)
```
Running the snippet above produces:

```
/usr/local/lib/python3.11/dist-packages/huggingface_hub/utils/_auth.py:94: UserWarning:
The secret `HF_TOKEN` does not exist in your Colab secrets.
To authenticate with the Hugging Face Hub, create a token in your settings tab (https://huggingface.co/settings/tokens), set it as secret in your Google Colab and restart your session.
You will be able to reuse this secret in all of your notebooks.
Please note that authentication is recommended but still optional to access public models or datasets.
  warnings.warn(
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
/usr/local/lib/python3.11/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1123             try:
-> 1124                 config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1125             except KeyError:

3 frames

KeyError: 'spatiallm_llama'

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
/usr/local/lib/python3.11/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1124                 config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1125             except KeyError:
-> 1126                 raise ValueError(
   1127                     f"The checkpoint you are trying to load has model type `{config_dict['model_type']}` "
   1128                     "but Transformers does not recognize this architecture. This could be because of an "

ValueError: The checkpoint you are trying to load has model type `spatiallm_llama` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`
```
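The unrecognized model type can be double-checked directly from the checkpoint's config.json; here is a minimal check using only the public Hub files and the same `CONFIG_MAPPING` the traceback points at:

```python
import json

from huggingface_hub import hf_hub_download
from transformers.models.auto.configuration_auto import CONFIG_MAPPING

# Fetch only the checkpoint's config.json and read its declared model_type.
config_path = hf_hub_download(
    repo_id="manycore-research/SpatialLM-Llama-1B",
    filename="config.json",
)
with open(config_path) as f:
    print(json.load(f)["model_type"])  # spatiallm_llama

# The same mapping the traceback points at; False on 4.51.0.dev0.
print("spatiallm_llama" in CONFIG_MAPPING)
```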

So it seems that `spatiallm_llama` is not yet registered in `CONFIG_MAPPING`, and there is no implementation of this model type in the transformers repository as of now.
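In the meantime, the usual workaround for unregistered model types is `trust_remote_code`. A minimal sketch, assuming the Hub repo ships its own configuration/modeling code with an `auto_map` entry (which I have not verified for this checkpoint):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "manycore-research/SpatialLM-Llama-1B"

# trust_remote_code lets transformers import model code stored in the Hub
# repo itself; it only helps if the repo actually ships such code.
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True)
```

If the repo does not ship remote code, this fails with the same ValueError, in which case loading the model presumably requires Manycore Research's own codebase.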

Could you please confirm:

1. Whether this model architecture will be officially supported in transformers?
2. Or whether it requires a custom implementation from Manycore Research (and if so, where it might be available; see the registration sketch after this list)?
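On the second point: if Manycore Research distributes the model classes as a separate package, the standard pattern is to register them with the auto classes before loading. In the sketch below the `spatiallm` package and its class names are hypothetical, purely to illustrate the registration API; only `AutoConfig.register` and `AutoModelForCausalLM.register` are real transformers calls:

```python
from transformers import AutoConfig, AutoModelForCausalLM

# Hypothetical import -- the actual package and class names would come
# from Manycore Research's own release, wherever that lives.
from spatiallm import SpatialLMLlamaConfig, SpatialLMLlamaForCausalLM

# Map the checkpoint's model_type string to the config class, then the
# config class to the model class, so the auto classes can resolve it.
AutoConfig.register("spatiallm_llama", SpatialLMLlamaConfig)
AutoModelForCausalLM.register(SpatialLMLlamaConfig, SpatialLMLlamaForCausalLM)

model = AutoModelForCausalLM.from_pretrained("manycore-research/SpatialLM-Llama-1B")
```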

Thanks in advance for your help!

Any luck?
