torch==2.5.1
torchvision==0.20.1
diffusers==0.33.1
transformers==4.45.0
# Prebuilt flash-attn wheel: CUDA 12, torch 2.5, Python 3.10, Linux x86_64 only
flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
gradio
omegaconf
peft
opencv-python