runtime error
Exit code: 1. Reason: ...ill enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'flash-attn'. Discussion can be found at https://github.com/pypa/pip/issues/6334
  Building wheel for flash-attn (setup.py): started
  Building wheel for flash-attn (setup.py): finished with status 'done'
  Created wheel for flash-attn: filename=flash_attn-2.7.4.post1-py3-none-any.whl size=187696268 sha256=1776769f7ae3a8be3b31ec3a4c875ad1764da74be2d9b1751e5c01162ad0096f
  Stored in directory: /home/user/.cache/pip/wheels/59/ce/d5/08ea07bfc16ba218dc65a3a7ef9b6a270530bcbd2cea2ee1ca
Successfully built flash-attn
Installing collected packages: einops, flash-attn
Successfully installed einops-0.8.1 flash-attn-2.7.4.post1

config.json: 100%|██████████| 619/619 [00:00<00:00, 4.69MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 20, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 308, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4292, in from_pretrained
    device_in_context = get_torch_context_manager_or_global_device()
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 321, in get_torch_context_manager_or_global_device
    default_device = torch.get_default_device()
  File "/usr/local/lib/python3.10/site-packages/torch/__init__.py", line 1932, in __getattr__
    raise AttributeError(f"module '{__name__}' has no attribute '{name}'")
AttributeError: module 'torch' has no attribute 'get_default_device'
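The final AttributeError points to a version mismatch rather than a problem in app.py: `torch.get_default_device` was added in torch 2.3, but the installed transformers version calls it unconditionally, so the environment's preinstalled torch must be older. A minimal sketch of a `requirements.txt` that aligns the two (the exact lower bound for torch is the key part; the other entries mirror the packages installed in the log above):

```
torch>=2.3
transformers
einops
flash-attn
```

Alternatively, pinning transformers back to a release that predates the `get_default_device` call would avoid touching torch, but upgrading torch is usually the simpler and more forward-compatible fix. Note that changing the torch version may require rebuilding the flash-attn wheel, since it compiles against the installed torch.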