# requirements.txt for the NuMarkdown-8B-Thinking-Demo Space
# (commit e2c034d by William Mattingly: "trying to fix flash attention")
# Hugging Face Spaces ZeroGPU helper
spaces

# Model runtime
transformers
accelerate
safetensors
huggingface-hub

# Pinned to match the prebuilt flash-attn wheel below, which was
# built against torch 2.7 / CUDA 12.6 / Python 3.10
torch==2.7.*
torchvision

# Image input handling
pillow

# Pinned for compatibility with the Spaces/Gradio stack
pydantic==2.10.6

# Prebuilt FlashAttention-2 wheel (avoids compiling flash-attn from source on
# Spaces); its torch/CUDA/Python build tags must match the pins above
https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.7-cp310-cp310-linux_x86_64.whl
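
Below is a minimal sketch of how a Space like this would consume these
dependencies, i.e. loading the model with FlashAttention-2 enabled. The model
id, the Auto class, and the dtype are assumptions inferred from the Space name,
not taken from this file.

    # Sketch only: model id and class are assumptions, not from this repo.
    import torch
    from transformers import AutoProcessor, AutoModelForImageTextToText

    model_id = "numind/NuMarkdown-8B-Thinking"  # assumed from the Space name

    processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForImageTextToText.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,               # flash-attn requires fp16/bf16
        attn_implementation="flash_attention_2",  # uses the flash_attn wheel above
        trust_remote_code=True,
        device_map="auto",                        # needs accelerate (listed above)
    )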