Incompatibility with RTX 5090
#11 · opened by sakaar
Hi, thank you for the amazing work on orpheus-speech!
We're currently facing compatibility issues with the NVIDIA GeForce RTX 5090, which requires CUDA capability sm_120. The PyTorch nightly builds that support the RTX 5090 are incompatible with the suggested workaround of downgrading vLLM to version 0.7.3 (the workaround for the bugs introduced after the March 18th update).
Could you please advise if there's a recommended workaround, a compatible PyTorch/vLLM build we could use, or any estimated timeline for official support?
Looking forward to future releases, and thank you again for your ongoing efforts!
Hey there - any vLLM version that can run Llama 3 on your hardware will work.
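For anyone hitting the same wall: a quick way to reason about this is to compare the GPU's compute capability against the architectures your PyTorch build was compiled for. The sketch below is a minimal illustration of that check; the hard-coded arch lists are assumptions for demonstration — in a real environment you would query `torch.cuda.get_arch_list()` and `torch.cuda.get_device_capability()` instead.

```python
# Minimal sketch: check whether a PyTorch build's compiled architecture list
# covers a GPU's compute capability. The arch lists below are hypothetical
# examples; query torch.cuda.get_arch_list() for your actual build.

def supports_capability(arch_list, major, minor):
    """Return True if the build was compiled for sm_<major><minor>."""
    target = f"sm_{major * 10 + minor}"
    # Strict check: the exact sm entry must be present. (Builds shipping PTX
    # for a newer arch can sometimes JIT forward, but this is the safe test.)
    return target in arch_list

# The RTX 5090 reports compute capability 12.0, i.e. sm_120.
stable_build = ["sm_50", "sm_60", "sm_70", "sm_75", "sm_80", "sm_86", "sm_90"]
nightly_build = stable_build + ["sm_100", "sm_120"]

print(supports_capability(stable_build, 12, 0))   # → False
print(supports_capability(nightly_build, 12, 0))  # → True
```

If the stable wheel you have installed lacks an `sm_120` entry, that is the source of the incompatibility, and you need a build (nightly or otherwise) whose arch list includes it.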