Running it in Colab fails with KeyError: 'citrus_v'.
Full error details below. Is this caused by the "model_type": "qwen2_5_vl" setting in config.json?
KeyError                                  Traceback (most recent call last)
/usr/local/lib/python3.12/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1358     try:
-> 1359         config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1360     except KeyError:

KeyError: 'citrus_v'

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
/usr/local/lib/python3.12/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1359         config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1360     except KeyError:
-> 1361         raise ValueError(
   1362             f"The checkpoint you are trying to load has model type {config_dict['model_type']} "
   1363             "but Transformers does not recognize this architecture. This could be because of an "

ValueError: The checkpoint you are trying to load has model type citrus_v but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
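For context, the chained traceback reflects how Transformers' AutoConfig dispatch works: the model_type string from config.json is looked up in a registry of known architectures, and an unknown key is caught and re-raised as the ValueError shown above. Here is a minimal, simplified sketch of that pattern (the registry contents below are illustrative, not the real CONFIG_MAPPING):

```python
# Simplified sketch of the AutoConfig dispatch in
# transformers/models/auto/configuration_auto.py.
# The real CONFIG_MAPPING holds every architecture the installed
# Transformers version knows about; "citrus_v" is not among them.
CONFIG_MAPPING = {"qwen2_5_vl": "Qwen2_5_VLConfig"}  # illustrative entry

def resolve_config_class(config_dict):
    """Look up the config class for a checkpoint's model_type."""
    try:
        return CONFIG_MAPPING[config_dict["model_type"]]
    except KeyError:
        # The KeyError is translated into the user-facing ValueError.
        raise ValueError(
            f"The checkpoint you are trying to load has model type "
            f"{config_dict['model_type']} but Transformers does not "
            "recognize this architecture."
        )
```

So the error is not a problem with the checkpoint file itself: the installed Transformers simply has no entry for model_type "citrus_v" in its registry.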
We’ve customized the architecture of the original Qwen-VL, so it can no longer be loaded with the standard Transformers library.
To use this model, please refer to our GitHub repo: https://github.com/jd-opensource/Citrus-V