unable to load the model, flash-attn problem

#15
by Mustafa21 - opened

I'm trying to load the model, but I get this error:

```
ImportError: This modeling file requires the following packages that were not found in your environment: flash_attn. Run `pip install flash_attn`
```

Does anyone know how to solve it?
PS: flash-attn is already installed.

You can patch `get_imports` so the dynamic module loader stops requiring `flash_attn`:

```python
import os
from unittest.mock import patch

import torch
from transformers import AutoModelForCausalLM, AutoProcessor
from transformers.dynamic_module_utils import get_imports


def workaround_fixed_get_imports(filename: str | os.PathLike) -> list[str]:
    # Only touch Florence-2's remote modeling file; everything else is untouched.
    if not str(filename).endswith("/modeling_florence2.py"):
        return get_imports(filename)
    imports = get_imports(filename)
    imports.remove("flash_attn")
    return imports


device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Temporarily swap in the patched get_imports while the model code is loaded.
with patch("transformers.dynamic_module_utils.get_imports", workaround_fixed_get_imports):
    model = AutoModelForCausalLM.from_pretrained(
        "microsoft/Florence-2-large-ft", trust_remote_code=True
    ).to(device)
    processor = AutoProcessor.from_pretrained(
        "microsoft/Florence-2-large-ft", trust_remote_code=True
    )
```
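For context, the workaround works because `unittest.mock.patch` swaps out an attribute only for the duration of the `with` block and restores the original afterwards. Here is a minimal stdlib-only sketch of that mechanism, using a hypothetical stand-in module (`fake_utils`) instead of `transformers.dynamic_module_utils`:

```python
import types
from unittest.mock import patch

# Hypothetical stand-in for transformers.dynamic_module_utils.
fake_utils = types.ModuleType("fake_utils")

def get_imports(filename):
    # Pretend every remote modeling file declares a flash_attn dependency.
    return ["torch", "flash_attn"]

fake_utils.get_imports = get_imports

def fixed_get_imports(filename):
    # Same idea as the workaround: filter flash_attn out of the import list.
    imports = get_imports(filename)
    if "flash_attn" in imports:
        imports.remove("flash_attn")
    return imports

# Inside the with-block, callers of fake_utils.get_imports see the patched version...
with patch.object(fake_utils, "get_imports", fixed_get_imports):
    print(fake_utils.get_imports("modeling_florence2.py"))  # ['torch']

# ...and the original is automatically restored afterwards.
print(fake_utils.get_imports("modeling_florence2.py"))  # ['torch', 'flash_attn']
```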
